WorldWideScience

Sample records for global probabilistic approach

  1. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of a finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program, with the finite element method used for the structural modeling. The results clearly indicate significant computational savings with minimal loss in accuracy.
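
    The MPP-based correction can be sketched on a toy linear limit state in standard normal space. Everything here is invented for illustration — the coefficients, the refined-model margin, and the simple first-order update — and is far simpler than the AMV algorithm and NESSUS finite element models the abstract describes:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical linear limit states g(u) = b - a.u; the "refined" local
# model predicts a slightly smaller safety margin than the coarse one.
a = (1.0, 0.5)                   # sensitivity vector (shared by both models)
b_coarse, b_refined = 3.0, 2.8   # coarse (global) vs refined (local) margin

norm_a = math.hypot(*a)
beta_coarse = b_coarse / norm_a                     # global reliability index
mpp = tuple(beta_coarse * ai / norm_a for ai in a)  # most probable point

# Local correction: evaluate the refined model only at the global MPP and
# shift the reliability index by the first-order change in the margin.
g_refined_at_mpp = b_refined - sum(ai * ui for ai, ui in zip(a, mpp))
beta_corrected = beta_coarse + g_refined_at_mpp / norm_a

pf_coarse, pf_corrected = phi(-beta_coarse), phi(-beta_corrected)
```

    The appeal is that the expensive refined model is queried only at the single point that dominates the failure probability, not across the whole input space.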

  2. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, both for maintaining ecosystem integrity and for sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that predictive performance for global water availability depends on climatic conditions. Compared with the simple univariate distribution, the bivariate one produces a narrower interquartile range for the same global dataset, especially in regions with higher NDVI values. This highlights the importance of developing the joint distribution by taking into account the dependence structure of parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.
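
    The core mechanism — sampling dependent (ω, NDVI) pairs through a copula — can be sketched with a Gaussian copula and invented uniform marginals; the dependence parameter and marginal ranges below are illustrative assumptions, not the fitted global distributions of the study:

```python
import math
import random

random.seed(4)

rho = 0.6   # assumed Gaussian-copula dependence between omega and NDVI

def phi(x):
    """Standard normal CDF, used to map normals to uniforms."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_pair():
    # Correlated standard normals -> correlated uniforms (the copula) ->
    # illustrative marginals for the Budyko parameter and NDVI.
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * random.gauss(0.0, 1.0)
    u1, u2 = phi(z1), phi(z2)
    omega = 1.0 + 4.0 * u1      # assumed uniform marginal on [1, 5]
    ndvi = 0.1 + 0.8 * u2       # assumed uniform marginal on [0.1, 0.9]
    return omega, ndvi

pairs = [sample_pair() for _ in range(20000)]
```

    Conditioning ω on an observed NDVI value in this way is what narrows the predictive interquartile range relative to the unconditional univariate distribution.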

  3. A global empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
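
    The regression-plus-residual-spread recipe can be sketched in a few lines. The data below are synthetic (a CO2-like trend plus deterministic pseudo-noise), and only the single primary predictor is used — the operational system adds modes of variability and local predictors:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Synthetic illustration only -- not the hindcast data of the paper.
co2 = [340.0 + 2.0 * t for t in range(40)]                 # ppm-like trend
temp = [0.01 * (c - 340.0) + 0.15 * (((t * 37) % 11) - 5) / 5
        for t, c in enumerate(co2)]                        # anomaly + noise

# Ordinary least squares with CO2-equivalent as the single predictor.
n = len(co2)
mx, my = sum(co2) / n, sum(temp) / n
sxx = sum((x - mx) ** 2 for x in co2)
sxy = sum((x - mx) * (y - my) for x, y in zip(co2, temp))
slope = sxy / sxx
intercept = my - slope * mx

# The residual spread turns the point forecast into a Gaussian predictive
# distribution, summarized here as P(above the climatological mean).
resid = [y - (intercept + slope * x) for x, y in zip(co2, temp)]
sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
x_new = co2[-1] + 2.0
mean_fc = intercept + slope * x_new
p_above = 1.0 - phi((my - mean_fc) / sigma)
```
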

  4. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
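
    The subdomain-agreement idea can be sketched deterministically on a toy 1-D Laplace problem u'' = 0 with u(0)=0, u(1)=1, split into overlapping subdomains [0, 0.6] and [0.4, 1]. Each subdomain solve depends only on its own boundary values, so the solves are independent tasks; iterating on the shared interface values drives the local solutions into agreement. (The project additionally treats these interface values as probability distributions to absorb faults; that layer is omitted here.)

```python
# Exact subdomain solutions of u'' = 0 are straight lines, so each
# "subdomain solve" reduces to linear interpolation between its two
# boundary values. Alternating Schwarz iteration on the interfaces:
b_left = 0.0    # guess for u(0.6), the right boundary of the left subdomain
b_right = 0.0   # guess for u(0.4), the left boundary of the right subdomain

for _ in range(100):
    # Left subdomain: line through (0, 0) and (0.6, b_left); value at 0.4.
    b_right = b_left * (0.4 / 0.6)
    # Right subdomain: line through (0.4, b_right) and (1, 1); value at 0.6.
    b_left = b_right + (1.0 - b_right) * (0.6 - 0.4) / (1.0 - 0.4)

# The exact global solution is u(x) = x, so the interface values should
# converge to u(0.6) = 0.6 and u(0.4) = 0.4.
```
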

  5. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
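
    A minimal sketch of the idea, with an invented 4-bin configuration space and a Metropolis-style transition rule (one of several ways to build a Markov chain whose stationary density is the prescribed one — not necessarily the synthesis method of the paper):

```python
import random

random.seed(0)

# Desired steady-state density over 4 bins (illustrative values).
v = [0.1, 0.2, 0.3, 0.4]
n_bins = len(v)

def step(i):
    """One independent agent decision: propose a uniform random bin and
    accept with probability min(1, v[j]/v[i]), so that v is the
    stationary density of the resulting Markov chain."""
    j = random.randrange(n_bins)
    if random.random() < min(1.0, v[j] / v[i]):
        return j
    return i

agents = [0] * 5000            # every agent starts in bin 0
for _ in range(200):           # synchronized, fully decentralized rounds
    agents = [step(i) for i in agents]

density = [agents.count(k) / len(agents) for k in range(n_bins)]
```

    The self-repair property follows from the same mechanism: each surviving agent keeps running the same chain, so the empirical density relaxes back to v after any disturbance, with no communication.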

  6. Using probabilistic finite automata to simulate hourly series of global radiation

    Energy Technology Data Exchange (ETDEWEB)

    Mora-Lopez, L. [Universidad de Malaga (Spain). Dpto. Lenguajes y Computacion; Sidrach-de-Cardona, M. [Universidad de Malaga (Spain). Dpto. Fisica Aplicada II

    2003-03-01

    A model to generate synthetic series of hourly exposure of global radiation is proposed. This model has been constructed using a machine learning approach. It is based on the use of a subclass of probabilistic finite automata which can be used for variable-order Markov processes. This model allows us to represent the different relationships and the representative information observed in the hourly series of global radiation; the variable-order Markov process can be used as a natural way to represent different types of days, and to take into account the "variable memory" of cloudiness. A method to generate new series of hourly global radiation, which incorporates the randomness observed in recorded series, is also proposed. As input data this method only uses the mean monthly value of the daily solar global radiation. We examine whether the recorded and simulated series are similar, and conclude that both series have the same statistical properties. (author)
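
    A heavily simplified sketch of the generation step: a first-order Markov chain over discretized cloudiness states, fitted to a toy "recorded" sequence. The paper's model is richer — variable-order probabilistic finite automata, whose memory length adapts to context — and the state labels and data below are invented:

```python
import random

random.seed(1)

states = ["overcast", "partly", "clear"]
observed = ["overcast", "overcast", "partly", "clear", "clear", "clear",
            "partly", "clear", "clear", "partly", "overcast", "partly",
            "clear", "clear", "partly", "partly", "clear", "clear"]

# Estimate transition probabilities from the recorded sequence.
counts = {s: {t: 0 for t in states} for s in states}
for a, b in zip(observed, observed[1:]):
    counts[a][b] += 1
probs = {}
for s in states:
    total = sum(counts[s].values())
    probs[s] = {t: counts[s][t] / total for t in states}

def simulate(start, hours):
    """Generate a synthetic hourly state sequence from the fitted chain."""
    seq, s = [start], start
    for _ in range(hours - 1):
        r, acc = random.random(), 0.0
        for t in states:
            acc += probs[s][t]
            if r < acc:
                s = t
                break
        seq.append(s)
    return seq

synthetic = simulate("clear", 24)
```
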

  7. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and to provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using the limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements them in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and spare them the burden of programming reliability algorithms. Readers, including non-specialists, will find useful information for determining project-specific statistics of geotechnical properties and for performing probabilistic analyses of slope stability.

  8. A probabilistic approach to examine the impacts of mitigation policies on future global PM emissions from on-road vehicles

    Science.gov (United States)

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2012-12-01

    Determination of future emission-reduction potential, especially with consideration of uncertainty, remains deficient. Mitigation measures for some economic sectors have been proposed, but few studies evaluate the amount of PM emission reduction that can be obtained in future years by different emission reduction strategies. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which result in the inability to relate technological or regulatory intervention to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. In this work, we introduce a probabilistic approach to explore the impacts of retrofit and scrappage programs on global PM emissions from on-road vehicles in the coming decades. This approach includes scenario analysis, sensitivity analysis and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and make policy evaluations. Three basic questions are answered in this work: (1) what contribution can these two programs make to reducing global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions, and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters describing the dynamic vehicle fleet?

  9. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  10. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  11. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
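
    The two-step recipe — archetypal distributions per subpopulation, then scenario-specific sampling according to a demographic mix — can be sketched for body weight. All parameters below are invented for illustration, not the U.S.-population statistics analyzed in the paper:

```python
import math
import random

random.seed(2)

# Step 1 (assumed already done): archetypal lognormal body-weight (BW)
# distributions for two hypothetical subpopulations.
archetypes = {
    "adult_male":   {"mu": math.log(78.0), "sigma": 0.18},
    "adult_female": {"mu": math.log(65.0), "sigma": 0.20},
}

def sample_bw(subpop):
    p = archetypes[subpop]
    return math.exp(random.gauss(p["mu"], p["sigma"]))

# Step 2: build the scenario-specific input distribution by sampling the
# archetypes according to the site's demographic mix.
scenario_mix = {"adult_male": 0.4, "adult_female": 0.6}

def sample_scenario():
    r, acc = random.random(), 0.0
    for subpop, w in scenario_mix.items():
        acc += w
        if r < acc:
            return sample_bw(subpop)
    return sample_bw(subpop)   # guard against floating-point round-off

draws = [sample_scenario() for _ in range(20000)]
mean_bw = sum(draws) / len(draws)
```

    The point of the standardization is that only `scenario_mix` changes from assessment to assessment; the archetypes are reused rather than refitted from raw data each time.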

  12. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  13. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    DEFF Research Database (Denmark)

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

    This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through numerical studies on PV data from Denmark.

  14. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    International Nuclear Information System (INIS)

    Zolezzi, M.; Nicolella, C.; Tarazona, J.V.

    2005-01-01

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). To illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.
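
    The distribution-versus-distribution tier can be sketched in closed form. Taking both the exposure concentration and the species sensitivity threshold as lognormal (a common convenience assumption; all parameter values below are invented, not the 1,2,4-TCB data), the probability that a random exposure exceeds a random threshold is analytic:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical lognormal parameters (ln mg/kg) for exposure and toxicity.
exp_mu, exp_sigma = math.log(0.8), 0.9
tox_mu, tox_sigma = math.log(12.0), 0.6

# ln(exposure) - ln(threshold) is normal, so risk = P(difference > 0).
d_mu = exp_mu - tox_mu
d_sigma = math.sqrt(exp_sigma ** 2 + tox_sigma ** 2)
risk = 1.0 - phi(-d_mu / d_sigma)
```
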

  15. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  16. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10 healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...

  17. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  18. The probabilistic approach in the licensing process and the development of probabilistic risk assessment methodology in Japan

    International Nuclear Information System (INIS)

    Togo, Y.; Sato, K.

    1981-01-01

    The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing are based on the deterministic concept. However, there have been a few examples to which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles. One may find other examples of such applications. However, a much more important role is now to be played by this concept, in implementing the 52 recommendations from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to Japanese situations, a five-year programme plan has been adopted and is to be conducted by the Japan Atomic Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)

  19. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination.

  20. A random probabilistic approach to seismic nuclear power plant analysis

    International Nuclear Information System (INIS)

    Romo, M.P.

    1985-01-01

    A probabilistic method for the seismic analysis of structures which takes into account the random nature of earthquakes and of soil parameter uncertainties is presented in this paper. The method was developed by combining elements of perturbation theory, random vibration theory and the complex response method. The probabilistic method is evaluated by comparing the responses of a single-degree-of-freedom system computed with this approach and with the Monte Carlo method. (orig.)

  1. Application of probabilistic risk based optimization approaches in environmental restoration

    International Nuclear Information System (INIS)

    Goldammer, W.

    1995-01-01

    The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to the consequences of low-probability events such as earthquakes or major floods. Risks and financial costs are combined into an overall figure of detriment, allowing one to distinguish between the benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding reclamation planning, using an example from the German reclamation program for uranium mining and milling sites.

  2. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-07-01

    Purpose: The presented research results aim to improve the theoretical basics of computer vision and artificial intelligence of dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation, and computer-based simulation at the verification stage of development. Results: The proposed approach for object detection and recognition in video stream data processing has shown several advantages in comparison with existing methods, due to its simple realization and short data-processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be part of the artificial intelligence in navigation and control systems.

  3. A global probabilistic tsunami hazard assessment from earthquake sources

    Science.gov (United States)

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  4. Operational intervention levels in a nuclear emergency, general concepts and a probabilistic approach

    International Nuclear Information System (INIS)

    Lauritzen, B.; Baeverstam, U.; Naadland Holo, E.; Sinkko, K.

    1997-12-01

    This report deals with Operational Intervention Levels (OILs) in a nuclear or radiation emergency. OILs are defined as the values of environmental measurements, in particular dose rate measurements, above which specific protective actions should be carried out in emergency exposure situations. The derivation and the application of OILs are discussed, and an overview of the presently adopted values is provided, with emphasis on the situation in the Nordic countries. A new, probabilistic approach to derive OILs is presented and the method is illustrated by calculating dose rate OILs in a simplified setting. Contrary to the standard approach, the probabilistic approach allows for optimization of OILs. It is argued, that optimized OILs may be much larger than the presently adopted or suggested values. It is recommended, that the probabilistic approach is further developed and employed in determining site specific OILs and in optimizing environmental measuring strategies. (au)

  5. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  6. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures

  7. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to the number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and types of overlap are presented to demonstrate the proposed methods. In contrast to the computationally expensive nested analysis typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
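
    The moment-bounding problem can be made concrete on a tiny interval data set. Bounds on the mean follow directly from the endpoints; for the variance, the sketch below uses brute-force endpoint enumeration for the upper bound, which is exponential in the number of intervals — exactly the kind of combinatorial search the paper's continuous-optimization algorithms replace with polynomial-time procedures (and note the variance minimum may lie in the interior, not at a vertex):

```python
from itertools import product

# Four hypothetical interval observations [lo, hi].
intervals = [(1.0, 2.0), (1.5, 3.0), (2.5, 4.0), (0.5, 1.5)]

# Mean bounds: pick every lower endpoint, then every upper endpoint.
mean_lo = sum(lo for lo, _ in intervals) / len(intervals)
mean_hi = sum(hi for _, hi in intervals) / len(intervals)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# The maximum variance over the box is attained at a vertex, so
# enumerating all 2^n endpoint combinations gives the upper bound.
vertex_vars = [variance(p) for p in product(*intervals)]
var_hi = max(vertex_vars)
```
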

  8. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.
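    A compact sketch of the PRM idea described above, under toy assumptions (a unit square with one circular obstacle, a fixed connection radius, and straight-line collision checks):

```python
import heapq, math, random

random.seed(5)
OBST = (0.5, 0.5, 0.2)          # circular obstacle: x, y, radius (assumed)

def free(p):
    """A position is valid if it lies outside the obstacle."""
    return math.dist(p, OBST[:2]) > OBST[2]

def segment_free(a, b, steps=20):
    """Check a straight edge by sampling points along it."""
    return all(free(((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1]))
               for t in (i / steps for i in range(steps + 1)))

nodes = [(0.05, 0.05), (0.95, 0.95)]          # start (node 0), goal (node 1)
while len(nodes) < 80:                        # sample valid positions
    p = (random.random(), random.random())
    if free(p):
        nodes.append(p)

# Connect nearby nodes whose connecting segment stays collision-free.
edges = {i: [] for i in range(len(nodes))}
for i, a in enumerate(nodes):
    for j, b in enumerate(nodes):
        if i < j and math.dist(a, b) < 0.25 and segment_free(a, b):
            edges[i].append((j, math.dist(a, b)))
            edges[j].append((i, math.dist(a, b)))

# Dijkstra search over the roadmap graph from start to goal.
dist = {0: 0.0}
pq = [(0.0, 0)]
while pq:
    d, u = heapq.heappop(pq)
    if d > dist.get(u, math.inf):
        continue
    for v, w in edges[u]:
        if d + w < dist.get(v, math.inf):
            dist[v] = d + w
            heapq.heappush(pq, (d + w, v))

path_len = dist.get(1, math.inf)
```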

  9. Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment. Research project

    International Nuclear Information System (INIS)

    Martorell, S.; Serradell, V.; Munoz, A.; Sanchez, A.

    1997-01-01

    Background, objective, scope, detailed working plan and follow-up and final product of the project ''Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment'' are described

  10. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    2003-11-01

    The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations set a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10^-6 for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and more generally as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment
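    As a minimal illustration of the simulation approach described above (not the regulators' or any proponent's model), the sketch below propagates assumed input PDFs through a toy dose model and summarizes the resulting annual risk; all distributions and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed input PDFs for a toy dose model (all values illustrative).
release = rng.lognormal(mean=0.0, sigma=0.5, size=n)     # relative release rate
dilution = rng.uniform(1e-4, 1e-3, size=n)               # dilution factor
dose_coeff = rng.triangular(1e-8, 5e-8, 1e-7, size=n)    # Sv per unit intake

dose = release * dilution * dose_coeff                   # annual dose, Sv
risk = dose * 0.057                                      # risk per Sv (ICRP-style factor)

# The sampled ensemble yields the risk distribution, not just a point value.
mean_risk = risk.mean()
p95_risk = np.quantile(risk, 0.95)
```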

  11. A probabilistic approach for RIA fuel failure criteria

    International Nuclear Information System (INIS)

    Carlo Vitanza, Dr.

    2008-01-01

    Substantial experimental data have been produced in support of the definition of the RIA safety limits for water reactor fuels at high burn-up. Based on these data, fuel failure enthalpy limits can be derived using methods having a varying degree of complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non-failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper, where in addition to a best-estimate evaluation of the failure enthalpy, a RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best-estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non-failure data and accounting for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will increase sharply within a narrow band around the best-estimate value. For deterministic predictions of lower quality, by contrast, the resulting probability distribution will be broader and coarser
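    The band idea can be sketched as follows; the best-estimate enthalpy, half-band width and ramp shape below are illustrative assumptions, not values derived from the RIA test database.

```python
import math

def failure_probability(enthalpy, best_estimate=140.0, half_band=30.0):
    """Failure probability vs. fuel enthalpy (cal/g): zero below the band,
    one above it, rising smoothly inside it (cosine ramp, assumed shape)."""
    lo, hi = best_estimate - half_band, best_estimate + half_band
    if enthalpy <= lo:
        return 0.0
    if enthalpy >= hi:
        return 1.0
    x = (enthalpy - lo) / (hi - lo)
    return 0.5 * (1.0 - math.cos(math.pi * x))

# At the best-estimate enthalpy the failure probability is 0.5; a sharper
# deterministic model would correspond to a narrower half_band.
```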

  12. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
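    A self-contained sketch of the bagging idea behind such models, with a one-split regression stump standing in for the decision trees of BT-FLEMO and synthetic depth-loss data: each bootstrap replicate yields one prediction, and the ensemble of predictions forms a loss distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: water depth (m) -> relative building loss in [0, 1].
depth = rng.uniform(0.1, 3.0, size=200)
loss = np.clip(0.2 * depth + rng.normal(0, 0.05, size=200), 0, 1)

def fit_stump(x, y):
    """Best single-threshold split minimizing squared error."""
    best = (np.inf, None, y.mean(), y.mean())
    for t in np.quantile(x, np.linspace(0.1, 0.9, 17)):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, lo_mean, hi_mean = best
    return lambda q: lo_mean if q <= t else hi_mean

# Bag 100 bootstrap replicates; their predictions form the loss distribution.
stumps = []
for _ in range(100):
    idx = rng.integers(0, len(depth), len(depth))
    stumps.append(fit_stump(depth[idx], loss[idx]))

preds = np.array([s(1.5) for s in stumps])   # predicted loss at 1.5 m depth
median_loss = np.median(preds)
interval_90 = (np.quantile(preds, 0.05), np.quantile(preds, 0.95))
```

The spread of `preds` is the quantitative uncertainty information the abstract highlights as the key advantage over a single deterministic stage-damage function.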

  13. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)

  14. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  15. Global Infrasound Association Based on Probabilistic Clutter Categorization

    Science.gov (United States)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long-term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011 [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  16. A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.

    Science.gov (United States)

    Tatsuoka, Kikumi K.

    A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…

  17. PROBABILISTIC APPROACH OF STABILIZED ELECTROMAGNETIC FIELD EFFECTS

    Directory of Open Access Journals (Sweden)

    FELEA. I.

    2017-09-01

    Full Text Available The effects of the omnipresence of the electromagnetic field are certain and recognized. Assessing these effects as accurately as possible requires the use of statistical-probabilistic calculation, since they characterize random phenomena. This paper aims at assessing the probability of exceeding the admissible values of the characteristic quantities of the electromagnetic field - magnetic induction and electric field strength. The first part justifies the need for concern and specifies how to approach it. The mathematical model of approach and treatment is presented in the second part of the paper, and the results obtained with reference to 14 power stations are synthesized in the third part. The conclusions of the evaluations are formulated in the last part.

  18. Information fusion in signal and image processing major probabilistic and non-probabilistic numerical approaches

    CERN Document Server

    Bloch, Isabelle

    2010-01-01

    The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this book offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory and belief functions).

  19. A probabilistic approach to Radiological Environmental Impact Assessment

    International Nuclear Information System (INIS)

    Avila, Rodolfo; Larsson, Carl-Magnus

    2001-01-01

    Since a radiological environmental impact assessment typically relies on limited data and poorly based extrapolation methods, point estimates, as implied by a deterministic approach, do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks of radioactive contamination to non-human biota. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined

  20. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  1. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  2. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
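    The kind of Monte Carlo implementation referred to can be sketched in a few lines: sample crack size and fracture toughness from assumed distributions and estimate the failure probability P(K_I > K_Ic) for a through-crack under tension. All parameter values and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

stress = 120.0                                           # applied stress, MPa (fixed)
a = rng.lognormal(mean=np.log(0.02), sigma=0.8, size=n)  # crack size, m (assumed PDF)
k_ic = rng.normal(60.0, 6.0, size=n)                     # toughness, MPa*sqrt(m) (assumed)

k_i = stress * np.sqrt(np.pi * a)                        # stress intensity factor
p_fail = np.mean(k_i > k_ic)                             # estimated failure probability
```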

  3. Universal Generating Function Based Probabilistic Production Simulation Approach Considering Wind Speed Correlation

    Directory of Open Access Journals (Sweden)

    Yan Li

    2017-11-01

    Full Text Available Due to the volatile and correlated nature of wind speed, a high share of wind power penetration poses challenges to power system production simulation. Existing power system probabilistic production simulation approaches fall short of considering the time-varying characteristics of wind power and load, as well as the correlation between wind speeds at the same time, which causes problems in planning and analysis for power systems with high wind power penetration. Based on the universal generating function (UGF), this paper proposes a novel probabilistic production simulation approach considering wind speed correlation. UGF is utilized to develop chronological models of wind power that characterize wind speed correlation simultaneously, as well as chronological models of conventional generation sources and load. The supply and demand are matched chronologically to obtain not only generation schedules but also reliability indices, both at each simulation interval and over the whole period. The proposed approach has been tested on the improved IEEE-RTS 79 test system and is compared with the Monte Carlo approach and the sequence operation theory approach. The results verify that the proposed approach offers both computational simplicity and accuracy.
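    The UGF representation itself is compact enough to sketch: a discrete output distribution is a "polynomial" sum_i p_i z^{x_i}, stored below as a dict, and system composition reduces to a convolution-like operator. The two-state units and the 40 MW demand are toy assumptions (and, unlike the paper's chronological models, the units here are independent).

```python
from collections import defaultdict

def ugf_combine(u, v, op=lambda a, b: a + b):
    """Combine two UGFs with a composition operator
    (default: sum of outputs, e.g. total generated power)."""
    out = defaultdict(float)
    for x, p in u.items():
        for y, q in v.items():
            out[op(x, y)] += p * q
    return dict(out)

# Two units with two-state outputs, as {output in MW: probability}.
unit1 = {0: 0.3, 50: 0.7}
unit2 = {0: 0.4, 30: 0.6}
total = ugf_combine(unit1, unit2)

# Loss-of-load probability against a 40 MW demand.
lolp = sum(p for x, p in total.items() if x < 40)
```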

  4. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    Abstract The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie

  5. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    Science.gov (United States)

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.

  6. Future trends in flood risk in Indonesia - A probabilistic approach

    Science.gov (United States)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to
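    The Monte-Carlo exposure step can be illustrated with a toy sketch: draw 2030 population and GDP from assumed probability density functions and map each draw to an expected annual damage (EAD) via an assumed damage relation. None of the numbers below are GLOFRIS outputs.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Assumed PDFs for projected 2030 exposure drivers (toy values).
pop_2030 = rng.normal(280e6, 15e6, size=n)          # population
gdp_2030 = rng.lognormal(np.log(1.5e12), 0.2, n)    # GDP, US$

exposed_pop = pop_2030 * 0.05       # assumed share living in flood zones
ead = gdp_2030 * 0.02 * 0.01        # assumed exposed-asset share x damage ratio

# The ensemble of draws gives a probabilistic EAD estimate instead of a point value.
ead_med = np.median(ead)
ead_5, ead_95 = np.quantile(ead, [0.05, 0.95])
```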

  7. Dietary Exposure Assessment of Danish Consumers to Dithiocarbamate Residues in Food: a Comparison of the Deterministic and Probabilistic Approach

    DEFF Research Database (Denmark)

    Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette

    2008-01-01

    Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish populations to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from...... the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using...... the probabilistic approach, the daily acute intakes at the 99.9% percentile for adults and children were 11.2 and 28.2 mu g kg(-1) body weight day(-1), representing 5.6% and 14.1% of the ARfD for maneb, respectively. When comparing the point estimate approach with the probabilistic approach, the outcome...

  8. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  9. Use of adjoint methods in the probabilistic finite element approach to fracture mechanics

    Science.gov (United States)

    Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted

    1988-01-01

    The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.

  10. Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures

    International Nuclear Information System (INIS)

    Caracciolo, Paola

    2014-01-01

    In this work a reliability-based approach is developed and its feasibility demonstrated for the design and analysis of composite structures. Robust and reliability-based designs are compared against the traditional design to demonstrate the gain that can be achieved with a probabilistic approach. The main objective of this paper is the use of a stochastic description of the uncertain parameters in combination with multi-scale analysis. The work is dedicated to analyzing the uncertainties in the design, tests, manufacturing process, and key gates such as material characteristics

  11. Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures

    Energy Technology Data Exchange (ETDEWEB)

    Caracciolo, Paola, E-mail: paola.caracciolo@airbus.com [AIRBUS INDUSTRIES Germany, Department of Airframe Architecture and Integration-Research and Technology-Kreetslag, 10, D-21129, Hamburg (Germany)

    2014-05-15

    In this work a reliability-based approach is developed and its feasibility demonstrated for the design and analysis of composite structures. Robust and reliability-based designs are compared against the traditional design to demonstrate the gain that can be achieved with a probabilistic approach. The main objective of this paper is the use of a stochastic description of the uncertain parameters in combination with multi-scale analysis. The work is dedicated to analyzing the uncertainties in the design, tests, manufacturing process, and key gates such as material characteristics.

  12. Probabilistic interpretation of data a physicist's approach

    CERN Document Server

    Miller, Guthrie

    2013-01-01

    This book is a physicist's approach to interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).
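    The core MCMC machinery such a book builds up to can be sketched in a few lines of Python (rather than the book's Fortran): a Metropolis random walk sampling the posterior mean of Gaussian data, under the assumptions of a flat prior and a known standard deviation.

```python
import math, random

random.seed(3)
data = [4.9, 5.2, 4.7, 5.1, 5.3, 4.8]   # toy measurements

def log_post(mu, sigma=0.25):
    """Log posterior for the mean (flat prior, known sigma)."""
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

samples, mu = [], 0.0
for step in range(20_000):
    prop = mu + random.gauss(0, 0.3)             # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                                # Metropolis accept
    if step >= 5_000:                            # discard burn-in
        samples.append(mu)

post_mean = sum(samples) / len(samples)
```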

  13. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Full Text Available Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass-stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and semi-probabilistic approach for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even though the latter is certainly very site-sensitive.

  14. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  15. An empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
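    Stripped to its essentials, such an empirical system can be sketched with synthetic stand-in data: a multiple linear regression on a CO2-like trend and an ENSO-like index, whose residual spread defines a Gaussian probabilistic forecast. All series and coefficients below are toy assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(11)
years = np.arange(1961, 2014)
co2 = 320 + 1.6 * (years - 1961)                   # ppm-like trend (toy)
enso = rng.normal(0, 1, size=years.size)           # ENSO-like index (toy)
temp = 0.01 * co2 + 0.3 * enso + rng.normal(0, 0.2, size=years.size)

# Least-squares fit: temp ~ intercept + co2 + enso.
X = np.column_stack([np.ones_like(co2), co2, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta)

# Probabilistic forecast for a new season: Gaussian(mean_forecast, resid_sd).
x_new = np.array([1.0, co2[-1] + 1.6, 0.5])
mean_forecast = x_new @ beta
```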

  16. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and to characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, and it is intended to be accessible to postgraduate students in Computer Science and Mathematics.

  17. Probabilistic Approaches to Video Retrieval

    NARCIS (Netherlands)

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models” to video retrieval. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples” results.

  18. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat-plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose finite element code. The analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  19. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    Science.gov (United States)

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.
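
    The Monte Carlo step behind such projections can be illustrated with a toy model. The growth-rate distributions and the demand equation below are assumptions for the sketch, not the study's calibrated drivers; the point is the workflow: draw uncertain drivers per run, project forward, and report a percentile band rather than a single number.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 1000

# Hypothetical sketch: each run draws growth drivers (population, GDP)
# and accumulates normalized urban extent from 2000 to 2030.
pop_growth = rng.normal(0.012, 0.004, n_runs)   # assumed annual rates
gdp_growth = rng.normal(0.05, 0.015, n_runs)
urban_2000 = 1.0                                 # normalized extent
growth = 0.5 * pop_growth + 0.3 * gdp_growth     # toy urban-demand model
urban_2030 = urban_2000 * (1 + growth) ** 30

# Report the same 5th-95th percentile summary the authors use.
lo, hi = np.percentile(100 * (urban_2030 - 1), [5, 95])
print(f"urban extent increase: {lo:.0f}%-{hi:.0f}%")
```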

  20. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  1. Convex models and probabilistic approach of nonlinear fatigue failure

    International Nuclear Information System (INIS)

    Qiu Zhiping; Lin Qiang; Wang Xiaojun

    2008-01-01

    This paper is concerned with the nonlinear fatigue failure problem with uncertainties in structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra is applied with the help of interval analysis. In terms of the inclusion-monotonic property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of a 25-bar truss structure is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach.

  2. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model and its use in probabilistic risk assessments (PRAs), depending on the available data, is given. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantifying dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for the human reliability aspect of PRAs.

  3. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    Science.gov (United States)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models usually applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K.; Kreibich, H.; Vogel, K.; Riggelsen, C.; Scherbaum, F.; Merz, B. (2014): How useful are complex flood damage models? Water Resources Research, 50(4), 3378-3395.

  4. An approach to handle Real Time and Probabilistic behaviors in e-commerce

    DEFF Research Database (Denmark)

    Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.

    2005-01-01

    In this work we describe an approach to deal with systems that exhibit both probabilistic and real-time behaviors. The main goal of the paper is to show the automatic translation from a real-time model based on the UPPAAL tool, which performs automatic verification of real-time systems, to the R

  5. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust framework for these tasks.

  6. A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints

    Science.gov (United States)

    Wei, Helin; Wang, Kuisheng

    2011-11-01

    Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of plastic ball grid array (PBGA) packages is demonstrated. The key solder joint parameters and the cyclic temperature range relevant to reliability are considered. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.
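
    The probabilistic step can be sketched by propagating scatter through an Engelmaier-type strain/life relation. All distributions and material constants below are illustrative assumptions (including a fixed fatigue exponent, where the full Engelmaier model makes it temperature- and frequency-dependent), not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Manufacturing and environmental scatter (assumed, for illustration only).
L_D = rng.normal(10e-3, 0.2e-3, n)      # distance from neutral point (m)
h   = rng.normal(0.6e-3, 0.03e-3, n)    # solder joint height (m)
dT  = rng.normal(80.0, 8.0, n)          # cyclic temperature range (K)
d_alpha = 14e-6                          # CTE mismatch (1/K), assumed

# Cyclic shear strain range and Coffin-Manson-type life (Engelmaier form).
gamma = (L_D / h) * d_alpha * dT
eps_f = 0.325                            # fatigue ductility coefficient
c = -0.442                               # simplified constant fatigue exponent
N_f = 0.5 * (gamma / (2 * eps_f)) ** (1 / c)

# Probabilistic output: a spread of lives instead of one deterministic value.
p5, p50, p95 = np.percentile(N_f, [5, 50, 95])
print(f"cycles to failure: 5%={p5:.0f}, median={p50:.0f}, 95%={p95:.0f}")
```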

  7. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. The approach consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. It has been applied to Pu-induced internal dose assessment, with a multi-compartment dosimetric model used for internal transport. In this approach, surrogate models of the original system are constructed using a response surface and a neural network, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. The uncertainty and sensitivity analyses of the model parameters are evaluated in this process. Dominant contributors to each organ dose are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure.
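
    Steps (2)-(4) can be sketched with a response-surface surrogate. The stand-in dose model and the lognormal parameter uncertainties below are assumptions for illustration; the workflow (fit a cheap surrogate on a small design, then propagate a large sample through it) mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def dose_model(k_blood, k_liver):
    # Hypothetical stand-in for the multi-compartment dosimetric model.
    return 50.0 * k_blood / (k_blood + k_liver)

# Step 2: lognormal uncertainty on transfer coefficients (assumed).
k1 = rng.lognormal(np.log(0.3), 0.2, 500)
k2 = rng.lognormal(np.log(0.1), 0.3, 500)

# Fit a quadratic response surface on the small training design.
X = np.column_stack([np.ones_like(k1), k1, k2, k1**2, k2**2, k1 * k2])
coef, *_ = np.linalg.lstsq(X, dose_model(k1, k2), rcond=None)

# Step 3: propagate a large sample through the cheap surrogate.
s1 = rng.lognormal(np.log(0.3), 0.2, 50_000)
s2 = rng.lognormal(np.log(0.1), 0.3, 50_000)
S = np.column_stack([np.ones_like(s1), s1, s2, s1**2, s2**2, s1 * s2])
dose = S @ coef

# Step 4: analyze the output distribution.
print(f"mean dose {dose.mean():.1f}, 95th percentile {np.percentile(dose, 95):.1f}")
```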

  8. Identification of failure type in corroded pipelines: a bayesian probabilistic approach.

    Science.gov (United States)

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events, and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. Predictive models can therefore provide a suitable framework for risk evaluation, maintenance policies, and substitution procedure design oriented to reducing these hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry.
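
    The discrimination step near the boundary can be sketched with Bayes' rule over two class-conditional densities. The synthetic "burst-test" populations below (defect depth/thickness ratios) and the Gaussian model are assumptions; the paper's mechanical boundary model is richer.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic burst-test data: defect depth/wall-thickness ratio per failure type.
depth_leak = rng.normal(0.45, 0.08, 200)      # observed leaks (assumed)
depth_rupt = rng.normal(0.70, 0.08, 200)      # observed ruptures (assumed)

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

def p_rupture(x, prior=0.5):
    """Posterior probability of rupture given a measured depth ratio."""
    lr = gauss(x, depth_rupt.mean(), depth_rupt.std())
    ll = gauss(x, depth_leak.mean(), depth_leak.std())
    return prior * lr / (prior * lr + (1 - prior) * ll)

# Shallow defects -> almost surely leakage; deep defects -> almost surely rupture;
# between the two populations the failure type is genuinely uncertain.
print(round(float(p_rupture(0.40)), 3), round(float(p_rupture(0.75)), 3))
```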

  9. Blind RRT: A probabilistically complete distributed RRT

    KAUST Repository

    Rodriguez, Cesar; Denny, Jory; Jacobs, Sam Ade; Thomas, Shawna; Amato, Nancy M.

    2013-01-01

    Rapidly-Exploring Random Trees (RRTs) have been successful at finding feasible solutions for many types of problems. With motion planning becoming more computationally demanding, we turn to parallel motion planning for efficient solutions. Existing work on distributed RRTs has been limited by the overhead that global communication requires. A recent approach, Radial RRT, demonstrated a scalable algorithm that subdivides the space into regions to increase the computation locality. However, if an obstacle completely blocks RRT growth in a region, the planning space is not covered and is thus not probabilistically complete. We present a new algorithm, Blind RRT, which ignores obstacles during initial growth to efficiently explore the entire space. Because obstacles are ignored, free components of the tree become disconnected and fragmented. Blind RRT merges parts of the tree that have become disconnected from the root. We show how this algorithm can be applied to the Radial RRT framework allowing both scalability and effectiveness in motion planning. This method is a probabilistically complete approach to parallel RRTs. We show that our method not only scales but also overcomes the motion planning limitations that Radial RRT has in a series of difficult motion planning tasks. © 2013 IEEE.
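
    A minimal single-process sketch of the Blind RRT idea (the paper's parallel, region-based implementation is much richer): grow the tree while ignoring obstacles, then invalidate nodes inside obstacles and identify the valid fragments that became disconnected from the root and would need merging. The obstacle and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def in_obstacle(p):
    # A wall with a gap at the top of the unit square (assumed environment).
    return 0.4 <= p[0] <= 0.6 and p[1] <= 0.8

nodes = [np.array([0.1, 0.1])]                  # root configuration
parent = {0: 0}
step = 0.05
for _ in range(800):
    q = rng.random(2)                           # uniform random sample
    arr = np.asarray(nodes)
    near = int(np.argmin(((arr - q) ** 2).sum(axis=1)))
    d = q - nodes[near]
    dist = float(np.linalg.norm(d))
    if dist < 1e-9:
        continue
    nodes.append(nodes[near] + step * d / dist)  # "blind": no collision check
    parent[len(nodes) - 1] = near

# Validation pass: nodes inside the obstacle are invalid; valid nodes whose
# path to the root passes through an invalid node form disconnected fragments.
valid_set = {i for i, p in enumerate(nodes) if not in_obstacle(p)}
children = {}
for c, par in parent.items():
    if c != 0:
        children.setdefault(par, []).append(c)

reach, stack = {0}, [0]                          # DFS over valid edges only
while stack:
    u = stack.pop()
    for v in children.get(u, []):
        if v in valid_set and v not in reach:
            reach.add(v)
            stack.append(v)

fragments = [i for i in valid_set if i not in reach]
print(len(nodes), len(valid_set), len(fragments))
```

A full Blind RRT would then attempt connections between each fragment and the root component, which is where the "merging" described above comes in.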

  11. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    Science.gov (United States)

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
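
    The core idea, a risk distribution with a confidence interval instead of a single value, can be illustrated in a toy form. The lognormal uncertainty distributions for exposure and potency below are assumptions, not the IPRA framework's characterized uncertainties.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# Toy IPRA-style propagation: uncertain exposure times uncertain potency
# yields a risk distribution rather than a point estimate (values assumed).
exposure = rng.lognormal(np.log(1e-3), 0.5, n)   # mg/kg bw/day
potency  = rng.lognormal(np.log(2.0), 1.0, n)    # extra risk per mg/kg bw/day
risk = exposure * potency

lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
print(f"risk: median {med:.2e}, 95% CI {lo:.2e} - {hi:.2e}")
```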

  12. A probabilistic approach to controlling crevice chemistry

    International Nuclear Information System (INIS)

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-01-01

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved

  13. A probabilistic approach for debris impact risk with numerical simulations of debris behaviors

    International Nuclear Information System (INIS)

    Kihara, Naoto; Matsuyama, Masafumi; Fujii, Naoki

    2013-01-01

    We propose a probabilistic approach for evaluating the impact risk of tsunami debris through Monte Carlo simulations with a combined system comprising a depth-averaged two-dimensional shallow water model and a discrete element model customized to simulate the motions of floating objects such as vessels. In the proposed method, first, probabilistic tsunami hazard analysis is carried out, and the exceedance probability of tsunami height and numerous tsunami time series for various hazard levels on the offshore side of a target site are estimated. Second, a characteristic tsunami time series for each hazard level is created by cluster analysis. Third, using the Monte Carlo simulation model the debris impact probability with the buildings of interest and the exceedance probability of debris impact speed are evaluated. (author)
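
    A heavily simplified stand-in for the Monte Carlo step: for one hazard level, draw an uncertain flow speed and debris release position, advect the floating object with a relaxation-toward-flow response, and estimate impact probability and an impact-speed exceedance. All numbers and the kinematic model are assumptions replacing the paper's coupled shallow-water/discrete-element simulations.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 5000

flow = rng.normal(4.0, 1.0, n)             # assumed depth-averaged speed (m/s)
x0 = rng.uniform(-50.0, 50.0, n)           # assumed initial offset (m)

# Debris relaxes toward the flow speed over a 60 s run-up (illustrative).
debris_speed = flow * (1 - np.exp(-60.0 / 20.0))
hit = (x0 + debris_speed * 60.0) >= 250.0  # reaches the building face at 250 m

p_impact = hit.mean()
p_fast_given_hit = (debris_speed[hit] > 4.0).mean()
print(f"impact probability {p_impact:.2f}, "
      f"P(speed > 4 m/s | impact) {p_fast_given_hit:.2f}")
```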

  14. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    In this paper, two probabilistic prognosis updating schemes are compared: one based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. The Markov chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness, and convergence are rigorously evaluated in addition to the qualitative visual comparison. Following this, potential developments and improvements for the prognostics-based metrics are discussed in detail.
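
    The classical Bayesian branch of the comparison can be sketched with a Metropolis-Hastings update of a single growth-rate parameter from noisy crack-size measurements. The linear damage model, priors, and noise levels are assumptions for illustration; the MRE variant would change the updating objective, not this MCMC machinery.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic measurements: crack size grows linearly with cycles (assumed).
true_rate, noise = 0.10, 0.02
cycles = np.arange(1, 11, dtype=float)
obs = 1.0 + true_rate * cycles + rng.normal(0, noise, cycles.size)

def log_post(rate):
    if rate <= 0:
        return -np.inf
    pred = 1.0 + rate * cycles
    loglik = -0.5 * np.sum(((obs - pred) / noise) ** 2)
    logprior = -0.5 * ((rate - 0.08) / 0.05) ** 2   # vague Gaussian prior
    return loglik + logprior

# Metropolis-Hastings: random-walk proposals, accept by posterior ratio.
samples, cur = [], 0.08
for _ in range(5000):
    prop = cur + rng.normal(0, 0.01)
    if np.log(rng.random()) < log_post(prop) - log_post(cur):
        cur = prop
    samples.append(cur)

post = np.array(samples[1000:])                     # drop burn-in
print(f"posterior rate: {post.mean():.3f} +/- {post.std():.3f}")
```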

  15. A Probabilistic Approach to Fitting Period–luminosity Relations and Validating Gaia Parallaxes

    Energy Technology Data Exchange (ETDEWEB)

    Sesar, Branimir; Fouesneau, Morgan; Bailer-Jones, Coryn A. L.; Gould, Andy; Rix, Hans-Walter [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Price-Whelan, Adrian M., E-mail: bsesar@mpia.de [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2017-04-01

    Pulsating stars, such as Cepheids, Miras, and RR Lyrae stars, are important distance indicators and calibrators of the “cosmic distance ladder,” and yet their period–luminosity–metallicity (PLZ) relations are still constrained using simple statistical methods that cannot take full advantage of available data. To enable optimal usage of data provided by the Gaia mission, we present a probabilistic approach that simultaneously constrains parameters of PLZ relations and uncertainties in Gaia parallax measurements. We demonstrate this approach by constraining PLZ relations of type ab RR Lyrae stars in near-infrared W1 and W2 bands, using Tycho-Gaia Astrometric Solution (TGAS) parallax measurements for a sample of ≈100 type ab RR Lyrae stars located within 2.5 kpc of the Sun. The fitted PLZ relations are consistent with previous studies, and in combination with other data, deliver distances precise to 6% (once various sources of uncertainty are taken into account). To a precision of 0.05 mas (1σ), we do not find a statistically significant offset in TGAS parallaxes for this sample of distant RR Lyrae stars (median parallax of 0.8 mas and distance of 1.4 kpc). With only minor modifications, our probabilistic approach can be used to constrain PLZ relations of other pulsating stars, and we intend to apply it to Cepheid and Mira stars in the near future.
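
    The core trick of such fits is to evaluate the likelihood in parallax space, where even noisy parallaxes keep a simple Gaussian error model. The sketch below uses an assumed PL slope and zero point, synthetic magnitudes, optimistically small parallax errors, and a crude grid search in place of full posterior sampling.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 100
logP = rng.uniform(-0.35, -0.15, n)              # log10 period (days), RRab-like
a_true, b_true = -0.45, -2.4                      # assumed zero point and slope
M = a_true + b_true * (logP + 0.25)               # PL relation, pivoted at -0.25
m = rng.uniform(9.0, 12.0, n)                     # assumed apparent magnitudes
plx_true = 10 ** (-0.2 * (m - M - 10.0))          # parallax in mas
sig = 0.03                                        # assumed parallax error (mas)
plx_obs = plx_true + rng.normal(0, sig, n)

def neg_log_like(a, b):
    # Predicted parallax from the candidate PL relation; Gaussian errors.
    pred = 10 ** (-0.2 * (m - (a + b * (logP + 0.25)) - 10.0))
    return 0.5 * np.sum(((plx_obs - pred) / sig) ** 2)

# Crude grid search standing in for the paper's full posterior exploration.
grid_a = np.linspace(-0.7, -0.2, 101)
grid_b = np.linspace(-3.4, -1.4, 101)
best = min(((neg_log_like(a, b), a, b) for a in grid_a for b in grid_b))
print(f"zero point {best[1]:.2f}, slope {best[2]:.2f}")
```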

  16. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic causal failure mode model development, and statistical data model development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  17. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors, using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed, and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
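
    The final step, discounting each source by a weighting factor and combining with Dempster's rule, can be sketched over a two-element frame. The weighting factors below are fixed by hand; in the paper they come from the proposed dissimilarity measure.

```python
# Minimal Dempster-Shafer sketch over the frame {A, B} (weights assumed).
def dempster(m1, m2):
    """Combine two BBAs with focal sets 'A', 'B', 'AB' (the whole frame)."""
    sets = {'A': {'A'}, 'B': {'B'}, 'AB': {'A', 'B'}}
    out, conflict = {'A': 0.0, 'B': 0.0, 'AB': 0.0}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = sets[f1] & sets[f2]
            if not inter:
                conflict += v1 * v2          # mass on empty intersection
            else:
                key = 'AB' if inter == {'A', 'B'} else inter.pop()
                out[key] += v1 * v2
    return {k: v / (1.0 - conflict) for k, v in out.items()}

def discount(m, alpha):
    """Shafer discounting: move mass (1 - alpha) to the whole frame."""
    d = {k: alpha * v for k, v in m.items()}
    d['AB'] = d.get('AB', 0.0) + (1.0 - alpha)
    return d

m1 = {'A': 0.9, 'B': 0.1, 'AB': 0.0}      # reliable source
m2 = {'A': 0.0, 'B': 0.9, 'AB': 0.1}      # conflicting, assumed less reliable
fused = dempster(discount(m1, 0.9), discount(m2, 0.5))
print({k: round(v, 3) for k, v in fused.items()})
```

Discounting the unreliable source before combination keeps the highly conflicting mass from producing the counter-intuitive results DS is known for.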

  18. Probabilistic flood damage modelling at the meso-scale

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models, and on the other hand against official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. A significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
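
    The bagging idea behind BT-FLEMO can be illustrated with hand-rolled regression stumps (the real model uses full multi-variate decision trees on many predictors): each bootstrap tree yields one damage estimate, and the ensemble of estimates is read as a probability distribution of loss. The data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic damage records: loss ratio rises with water depth, saturating at 2 m.
water_depth = rng.uniform(0.0, 3.0, 300)
damage = 0.2 * np.minimum(water_depth, 2.0) + rng.normal(0, 0.03, 300)

def fit_stump(x, y):
    """Best single-split regression stump on one variable."""
    best_sse, best_split = np.inf, None
    for t in np.quantile(x, np.linspace(0.1, 0.9, 17)):
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_split = sse, (t, left.mean(), right.mean())
    return best_split

def predict(stump, x):
    t, lo_mean, hi_mean = stump
    return lo_mean if x <= t else hi_mean

# Bag 200 bootstrap stumps; their spread is the probabilistic damage estimate.
preds = []
for _ in range(200):
    idx = rng.integers(0, 300, 300)
    preds.append(predict(fit_stump(water_depth[idx], damage[idx]), 1.5))
preds = np.array(preds)
print(f"damage ratio at 1.5 m depth: {preds.mean():.3f} "
      f"(90% band {np.percentile(preds, 5):.3f}-{np.percentile(preds, 95):.3f})")
```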

  19. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages; ideally, one should combine the strong points of each. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, the implementation of upgrades on a system-by-system basis is not necessarily cost-effective. (author)

  20. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  1. Transmission capacity assessment by probabilistic planning. An approach

    International Nuclear Information System (INIS)

    Lammintausta, M.

    2002-01-01

The Finnish electricity markets participate in the Scandinavian market, Nord Pool. The Finnish market is free for marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid, which has full responsibility for the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach for transfer limit and risk assessment. The method is based on economic benefit and risk predictions. It also uses the existing results of deterministic analyses and could be used side by side with the deterministic method. The basic concept and necessary equations for the expected risks of various market players have been derived for further development. The outage costs, and thereby the risks of the market participants, depend on how the system operator reacts to faults. In the Finnish power system, consumers will usually experience no costs due to faults because of the meshed network and the counter-trade method preferred by the system operator. The costs to producers and dealers are also low because of the counter-trade method. The network company will bear the cost of repairs, additional losses and the cost of regulation power bought for counter trades. If power flows are rearranged drastically because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is for the network operator to buy regulation power for short-term problems and to reinforce the network for long-term situations. The reinforcement is done if the network cannot be
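The report's expected-risk concept reduces, in its simplest form, to summing probability-weighted outage costs per market player. A minimal sketch under assumed fault probabilities and costs (all figures hypothetical, not Fingrid data):

```python
def expected_risk(faults):
    """Expected risk of one market player: sum over possible faults of
    fault probability times the cost that player bears if it occurs."""
    return sum(p * cost for p, cost in faults)

# Network company: repair costs, additional losses and counter-trade
# regulation power (probabilities and costs per period are invented).
network_faults = [(0.02, 50_000.0),    # fault handled by counter trade
                  (0.001, 400_000.0)]  # rarer fault requiring repairs
total = expected_risk(network_faults)  # about 1400 (EUR per period)
```

The same function applies to consumers, producers and dealers by changing the cost each one bears per fault, which the report notes is near zero under the counter-trade strategy.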

  2. Estimated Quality of Multistage Process on the Basis of Probabilistic Approach with Continuous Response Functions

    Directory of Open Access Journals (Sweden)

    Yuri B. Tebekin

    2011-11-01

The article is devoted to the problem of quality management for multistage processes on the basis of a probabilistic approach. A method with continuous response functions, based on the application of the method of Lagrange multipliers, is proposed.

  3. Global Mindset: An Entrepreneur's Perspective on the Born-Global Approach

    Directory of Open Access Journals (Sweden)

    Robert Poole

    2012-10-01

The born-global approach calls for a startup to address the needs of a global market from inception. This approach provides an attractive alternative to the conventional staged approach to internationalization whereby a startup first operates in its home market and then enters one or more foreign markets sequentially. This article highlights the mindset change that an entrepreneur must make to move from the conventional staged approach to the born-global approach. The author of this article is an experienced entrepreneur and the article describes his own mindset change that occurred when enacting the born-global approach. The author uses his own experience and company as a case study to develop recommendations for other entrepreneurs who are evaluating the born-global approach to launch and grow a technology company.

  4. Metabolic level recognition of progesterone in dairy Holstein cows using probabilistic models

    Directory of Open Access Journals (Sweden)

    Ludmila N. Turino

    2014-05-01

Administration of exogenous progesterone is widely used in hormonal protocols for estrous (re)synchronization of dairy cattle, without regard to pharmacological issues for dose calculation. This happens because it is difficult to estimate the metabolic level of progesterone for each individual cow before administration. In the present contribution, progesterone pharmacokinetics has been determined in lactating Holstein cows with different milk production yields. A Bayesian approach has been implemented to build two probabilistic progesterone pharmacokinetic models for high- and low-yield dairy cows. Such models are based on a one-compartment Hill structure. Posterior probabilistic models have been structurally set up and parametric probability density functions have been empirically estimated. Moreover, a global sensitivity analysis has been done to determine the sensitivity profile of each model. Finally, the posterior probabilistic models adequately recognized a cow's progesterone metabolic level in a validation set when Kullback-Leibler-based indices were used. These results suggest that milk yield may be a good index for estimating the pharmacokinetic level of progesterone.

  5. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    International Nuclear Information System (INIS)

    Pecha, P.; Hofman, R.; Kuca, P.

    2009-01-01

We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulation of the propagation of uncertainties through the model is a basic and inevitable task, providing data for advanced techniques of probabilistic consequence assessment and for further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the information systems section of the institute NRPI. The subject of investigation is the evaluation of the consequences of radioactivity propagation after an accidental radioactivity release from a nuclear facility. Transport of activity is studied from initial atmospheric propagation, through deposition of radionuclides on terrain, to spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food-chain activity transport are modelled. In the final step, a hazard estimation based on doses to the population is integrated into the software system HARP. The extension to a probabilistic approach has increased the complexity substantially, but it offers a much more informative background for modern methods of estimation that account for the inherently stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on uncertainty analysis of the input parameters of the SGPM model. Predicted background fields of Cs-137 deposition are labelled with the index p, as X_p^SGPM. The final goal is estimation of a certain unknown true background vector χ_true, which also accounts for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must keep in mind that even if we knew the true values of all input parameters θ_m^true (m = 1, ..., M) of the SGPM model, χ_true would still remain uncertain. One possibility how to approach reality insists

  6. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    International Nuclear Information System (INIS)

    Pecha, P.; Hofman, R.; Kuca, P.

    2008-01-01

We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulation of the propagation of uncertainties through the model is a basic and inevitable task, providing data for advanced techniques of probabilistic consequence assessment and for further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the information systems section of the institute NRPI. The subject of investigation is the evaluation of the consequences of radioactivity propagation after an accidental radioactivity release from a nuclear facility. Transport of activity is studied from initial atmospheric propagation, through deposition of radionuclides on terrain, to spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food-chain activity transport are modelled. In the final step, a hazard estimation based on doses to the population is integrated into the software system HARP. The extension to a probabilistic approach has increased the complexity substantially, but it offers a much more informative background for modern methods of estimation that account for the inherently stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on uncertainty analysis of the input parameters of the SGPM model. Predicted background fields of Cs-137 deposition are labelled with the index p, as X_p^SGPM. The final goal is estimation of a certain unknown true background vector χ_true, which also accounts for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must keep in mind that even if we knew the true values of all input parameters θ_m^true (m = 1, ..., M) of the SGPM model, χ_true would still remain uncertain. One possibility how to approach reality insists

  7. Whither probabilistic security management for real-time operation of power systems ?

    OpenAIRE

    Karangelos, Efthymios; Panciatici, Patrick; Wehenkel, Louis

    2016-01-01

    This paper investigates the stakes of introducing probabilistic approaches for the management of power system’s security. In real-time operation, the aim is to arbitrate in a rational way between preventive and corrective control, while taking into account i) the prior probabilities of contingencies, ii) the possible failure modes of corrective control actions, iii) the socio-economic consequences of service interruptions. This work is a first step towards the construction of a globally co...

  8. Simplified probabilistic approach to determine safety factors in deterministic flaw acceptance criteria

    International Nuclear Information System (INIS)

    Barthelet, B.; Ardillon, E.

    1997-01-01

The flaw acceptance rules for nuclear components rely on deterministic criteria supposed to ensure the safe operation of plants. The interest in having a reliable method for evaluating the safety margins and the integrity of components led Electricite de France to launch a study linking safety factors with requested reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in fracture mechanics. Assuming lognormal distributions of the main random variables, it is possible, considering a simple linear elastic fracture mechanics model, to determine the failure probability as a function of mean values and logarithmic standard deviations. The 'design' failure point can be calculated analytically. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation to reliability target values. The approach is generalized to elastic-plastic fracture mechanics (piping) by fitting J as a power-law function of stress, crack size and yield strength. The simplified approach is validated by detailed probabilistic computations with the PROBAN computer program. Assuming reasonable coefficients of variation (logarithmic standard deviations), the method helps to calibrate safety factors for different components, taking into account reliability target values in normal, emergency and faulted conditions. Statistical data for the mechanical properties of the main basic materials complement the study. The work involves laboratory results and manufacturing data. The results of this study are discussed within a working group of the French in-service inspection code RSE-M. (authors)
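The simplified analytical approach described above can be sketched as follows. The failure criterion, the lognormal assumption and the reliability index follow the abstract, while the numerical values and the geometry factor Y are illustrative assumptions, not EDF data:

```python
import math

def failure_probability(mu_lnK, sd_lnK, mu_lnS, sd_lnS, mu_lna, sd_lna, Y=1.12):
    """P_f = Phi(-beta) for the criterion Y*S*sqrt(pi*a) >= K_Ic, with stress S,
    crack size a and toughness K_Ic lognormal (mean/sd of the log of each)."""
    mu_M = mu_lnK - (math.log(Y) + mu_lnS + 0.5 * math.log(math.pi) + 0.5 * mu_lna)
    sd_M = math.sqrt(sd_lnK**2 + sd_lnS**2 + 0.25 * sd_lna**2)
    beta = mu_M / sd_M                                   # reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2.0)), beta  # Phi(-beta)

# e.g. median toughness 150 MPa*sqrt(m), median stress 200 MPa, median crack 5 mm
pf, beta = failure_probability(math.log(150), 0.15, math.log(200), 0.10,
                               math.log(0.005), 0.30)
```

Because the margin M = ln K_Ic - ln K_I is normal under the lognormal assumption, the failure probability is available in closed form, which is what makes the analytical calibration of partial safety factors tractable.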

  9. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described.

  10. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  11. A dynamic probabilistic safety margin characterization approach in support of Integrated Deterministic and Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    Di Maio, Francesco; Rai, Ajit; Zio, Enrico

    2016-01-01

The challenge of Risk-Informed Safety Margin Characterization (RISMC) is to develop a methodology for estimating system safety margins in the presence of stochastic and epistemic uncertainties affecting the system dynamic behavior. This is useful to support decision-making for licensing purposes. In the present work, safety margin uncertainties are handled by Order Statistics (OS) (with both Bracketing and Coverage approaches) to jointly estimate percentiles of the distributions of the safety parameter and of the time required for it to reach these percentile values during its dynamic evolution. The novelty of the proposed approach consists in the integration of dynamic aspects (i.e., timing of events) into the definition of a dynamic safety margin for a probabilistic Quantification of Margin and Uncertainties (QMU). The system considered here for demonstration purposes is the Lead-Bismuth Eutectic eXperimental Accelerator Driven System (LBE-XADS). - Highlights: • We integrate dynamic aspects into the definition of a safety margin. • We consider stochastic and epistemic uncertainties affecting the system dynamics. • Uncertainties are handled by Order Statistics (OS). • We estimate the system grace time during accidental scenarios. • We apply the approach to an LBE-XADS accidental scenario.
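The Coverage flavour of Order Statistics rests on Wilks' formula for distribution-free tolerance limits; a minimal sketch (the function name and the 95%/95% example are generic, not taken from the paper):

```python
import math

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest n such that the maximum of n i.i.d. code runs bounds the
    gamma-quantile of the output with confidence beta (first-order,
    one-sided Wilks criterion): 1 - gamma**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

print(wilks_sample_size())  # 59 runs for the classical 95%/95% statement
```

The same counting argument extends to joint percentiles of several outputs (here, the safety parameter and its timing), at the price of a larger number of runs.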

  12. Variational approach to probabilistic finite elements

    Science.gov (United States)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
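The second-moment idea can be illustrated with a first-order propagation sketch (a generic FOSM routine with an invented cantilever example, not the PFEM discretization itself):

```python
def fosm(g, means, variances):
    """First-order second-moment estimate of the mean and variance of g(x)
    for independent inputs, using relative-step finite differences."""
    mu = g(means)
    var = 0.0
    for i, (m, v) in enumerate(zip(means, variances)):
        h = abs(m) * 1e-6 or 1e-6          # relative step, fallback for m = 0
        xp = list(means)
        xp[i] = m + h
        var += ((g(xp) - mu) / h) ** 2 * v  # (dg/dx_i)^2 * Var(x_i)
    return mu, var

# Tip deflection u = P L^3 / (3 E I) of a cantilever with random load P [N]
# and Young's modulus E [Pa]; L = 2 m and I = 1e-4 m^4 are deterministic.
u = lambda x: x[0] * 2.0**3 / (3.0 * x[1] * 1e-4)
mean_u, var_u = fosm(u, [1000.0, 2.1e11], [100.0**2, (1e10)**2])
```

As the abstract notes for PFEM, such second-moment estimates are cheap but only trustworthy when the randomness is moderate and the response is smooth around the mean point.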

  13. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

We develop an advanced mean field method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP, where knowledge of the distribution of couplings between the random variables is required, our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with nonglassy behavior, by replica calculations for a wide class of models as well as by simulations for a real data set.

  14. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility between performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders, showing the performance and energy trade-off.
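A behavioural sketch of a probabilistic full adder, assuming a simple per-bit error model rather than the thesis' actual memristor circuit:

```python
import random

def probabilistic_add(a, b, n_bits=8, p_err=0.01, rng=random):
    """Ripple-carry adder whose sum bits flip with probability p_err,
    mimicking stochastic switching of the underlying devices."""
    carry, result = 0, 0
    for i in range(n_bits):
        x, y = (a >> i) & 1, (b >> i) & 1
        s = x ^ y ^ carry
        carry = (x & y) | (carry & (x | y))
        if rng.random() < p_err:   # stochastic device error on this sum bit
            s ^= 1
        result |= s << i
    return result

# With p_err = 0 the adder is exact (modulo 2**n_bits):
assert probabilistic_add(100, 55, p_err=0.0) == 155
```

Such a behavioural model is how one would estimate, by simulation, the accuracy/energy trade-off the abstract reports for the image-compression workload.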

  15. A probabilistic approach to the evaluation of the PTS issue

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Selby, D.L.

    1991-01-01

An integrated probabilistic approach for the evaluation of the pressurized-thermal-shock (PTS) issue was developed at the Oak Ridge National Laboratory (ORNL) at the request of the Nuclear Regulatory Commission (NRC). The purpose was to provide a method for identifying dominant plant design and operating features, evaluating possible remedial measures and the validity of the NRC PTS screening criteria, and to provide an additional tool for estimating vessel life expectancy. The approach was to be integrated in the sense that it would include the postulation of transients; estimates of their frequencies of occurrence; systems analyses to obtain the corresponding primary-system pressure, downcomer coolant temperature, and fluid-film heat-transfer coefficient adjacent to the vessel wall; and a probabilistic fracture-mechanics analysis using the latter data as input. A summation of the products of the frequency of each transient and its conditional probability of failure, over all postulated transients, provides an estimate of the frequency of vessel failure. In the process of developing the integrated pressurized-thermal-shock (IPTS) methodology, three specific plant analyses were conducted. The results indicate that the NRC screening criteria may not be appropriate for all US pressurized water reactor (PWR) plants; that is, for some PWRs, the calculated mean frequency of vessel failure corresponding to the screening criteria may be greater than the maximum permissible value in Regulatory Guide 1.154. A recent review of the ORNL IPTS study, which was completed in 1985, indicates that there are a number of areas in which the methodology can and should be updated, but it is not clear whether the update will increase or decrease the calculated probabilities. 31 refs., 2 tabs

  16. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

A probabilistic robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and on a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code, the timber structure has to be evaluated with respect to a set of criteria, of which at least one shall be fulfilled. With respect to criteria a) and b), the timber frame structure has one column with a reliability index slightly lower than the assumed target level. When three columns are removed one by one, no significant extensive failure of the entire structure or of significant parts of it is obtained. Therefore the structure can be considered robust.

  17. Determining reserve requirements in DK1 area of Nord Pool using a probabilistic approach

    DEFF Research Database (Denmark)

    Saez Gallego, Javier; Morales González, Juan Miguel; Madsen, Henrik

    2014-01-01

This paper proposes a probabilistic framework in which the reserve requirements are computed based on scenarios of wind power forecast errors, load forecast errors and power plant outages. Our approach is motivated by the increasing wind power penetration in power systems worldwide, as well as by the current market design of the DK1 area of Nord Pool (managed by the Danish Transmission System Operator).
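The scenario-based computation can be sketched as follows (the scenario distributions, unit size and outage rate are synthetic assumptions, not DK1 data):

```python
import random

def reserve_requirement(n_scenarios=10_000, coverage=0.99, seed=1):
    """Reserve sized as a high quantile of the simulated system imbalance:
    wind forecast error + load forecast error + lost generation."""
    rng = random.Random(seed)
    imbalances = []
    for _ in range(n_scenarios):
        wind_err = rng.gauss(0.0, 150.0)                 # MW, forecast error
        load_err = rng.gauss(0.0, 80.0)                  # MW, forecast error
        outage = 400.0 if rng.random() < 0.01 else 0.0   # one unit trips
        imbalances.append(wind_err + load_err + outage)
    imbalances.sort()
    return imbalances[int(coverage * n_scenarios) - 1]

r99 = reserve_requirement()   # MW of upward reserve covering ~99% of scenarios
```

Raising the coverage level trades higher reserve procurement cost against a lower probability of an uncovered imbalance, which is the arbitration the probabilistic framework makes explicit.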

  18. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Science.gov (United States)

    2012-05-17

NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis. AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...

  19. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

The paper studies the concepts of hedging and arbitrage in a non-probabilistic framework. It provides conditions for non-probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate non-probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  20. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely range, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent in the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency. This work is available in Open Access and licensed under a CC BY-NC 3.0 PL license.
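The Monte Carlo propagation of assessment factors can be sketched as follows (the distribution families, parameters and the NOAEL are illustrative assumptions, not the paper's defaults):

```python
import math
import random

def oel_distribution(noael_mg_m3=50.0, n=20_000, seed=42):
    """Sample an OEL distribution as NOAEL divided by a product of random
    assessment factors, each drawn from an assumed default distribution."""
    rng = random.Random(seed)
    oels = []
    for _ in range(n):
        af_inter = rng.lognormvariate(math.log(3.0), 0.4)  # inter-species
        af_intra = rng.lognormvariate(math.log(3.0), 0.4)  # intra-species
        af_time = rng.uniform(1.0, 3.0)                    # duration extrapolation
        oels.append(noael_mg_m3 / (af_inter * af_intra * af_time))
    oels.sort()
    return oels

oels = oel_distribution()
median_oel = oels[len(oels) // 2]
p05_oel = oels[len(oels) // 20]   # conservative 5th-percentile candidate OEL
```

Reporting the whole distribution (or several percentiles of it) rather than a single number is precisely what lets experts see how much of the final OEL is driven by each uncertainty.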

  1. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive relative to the finite element approach.

  2. Survey on application of probabilistic fracture mechanics approach to nuclear piping

    International Nuclear Information System (INIS)

    Kashima, Koichi

    1987-01-01

The probabilistic fracture mechanics (PFM) approach is newly developed as one of the tools to evaluate the structural integrity of nuclear components. This report describes the current status of PFM studies for pressure vessel and piping systems in light water reactors and focuses on the investigations of piping failure probability which have been undertaken by the USNRC. The USNRC reevaluates the double-ended guillotine break (DEGB) of reactor coolant piping as a design basis event for nuclear power plants by using the PFM approach. For PWR piping systems designed by Westinghouse, two causes of pipe break are considered: pipe failure due to crack growth, and pipe failure indirectly caused by failure of component supports due to an earthquake. The PFM approach shows that the probability of DEGB from either cause is very low and that the effect of earthquakes on pipe failure can be neglected. (author)

  3. Improving PERSIANN-CCS rain estimation using probabilistic approach and multi-sensors information

    Science.gov (United States)

    Karbalaee, N.; Hsu, K. L.; Sorooshian, S.; Kirstetter, P.; Hong, Y.

    2016-12-01

This presentation discusses recently implemented approaches to improve the rainfall estimation from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System (PERSIANN-CCS). PERSIANN-CCS is an infrared (IR) based algorithm being integrated in IMERG (Integrated Multi-satellitE Retrievals for the Global Precipitation Measurement (GPM) mission) to create a precipitation product at 0.1 x 0.1 degree resolution over the domain 50N to 50S every 30 minutes. Although PERSIANN-CCS has a high spatial and temporal resolution, it overestimates or underestimates rainfall due to some limitations. PERSIANN-CCS estimates rainfall based on information extracted from IR channels at three different temperature threshold levels (220, 235, and 253 K). Because the algorithm relies only on infrared data to estimate rainfall indirectly, it misses rainfall from warm clouds and produces false estimates for non-precipitating cold clouds. In this research, the effectiveness of using other channels of the GOES satellites, such as the visible and water vapor channels, has been investigated. By using multiple sensors, precipitation can be estimated based on information extracted from multiple channels. Also, instead of using an exponential function to estimate rainfall from cloud-top temperature, a probabilistic method has been used. Using probability distributions of precipitation rates instead of deterministic values has improved the rainfall estimation for different types of clouds.
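Replacing a deterministic temperature-to-rain-rate curve by a conditional distribution can be sketched as follows (the bins and gamma parameters are invented for illustration, not PERSIANN-CCS values):

```python
import random

# hypothetical (shape, scale) of rain rate [mm/h] for each temperature bin [K]
GAMMA_BY_BIN = {220: (2.0, 4.0), 235: (1.5, 2.0), 253: (1.2, 0.8)}

def sample_rain_rate(cloud_top_temp_k, rng=random):
    """Draw a rain rate from the gamma distribution fitted to the nearest
    cloud-top temperature bin, instead of returning one deterministic value."""
    bin_k = min(GAMMA_BY_BIN, key=lambda t: abs(t - cloud_top_temp_k))
    shape, scale = GAMMA_BY_BIN[bin_k]
    return rng.gammavariate(shape, scale)

random.seed(3)
rates = [sample_rain_rate(225) for _ in range(5_000)]
mean_rate = sum(rates) / len(rates)   # close to shape * scale for cold clouds
```

The benefit of the distributional form is that colder, convective bins can carry both a higher mean and a heavier tail than warm-cloud bins, which a single exponential curve cannot express.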

  4. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    International Nuclear Information System (INIS)

    Krajnc, B.; Mavko, B.

    1996-01-01

Severe accident sequences resulting in loss of core geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences to public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or from generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Due to the fact that there is large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner due to the large uncertainty of results as a consequence of a lack of detailed knowledge. This paper discusses the approach used in many IPEs, which assures that the assigned probability for a certain question in the event tree represents the probability that the event will or will not happen and that this probability also includes its uncertainty, which is mainly a result of lack of knowledge. (author)

  5. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Science.gov (United States)

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
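The exceedance probability p(PEC/PNEC > 1) can be estimated by sampling the two distributions; a minimal sketch with illustrative lognormal parameters (not the Loire/Moselle data):

```python
import math
import random

def risk_probability(n=50_000, seed=7):
    """Monte Carlo estimate of p(PEC/PNEC > 1) with lognormal PEC and PNEC."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n):
        pec = rng.lognormvariate(math.log(2.0), 0.8)    # dissolved Cu, ug/L
        pnec = rng.lognormvariate(math.log(8.0), 0.5)   # bioavailability-based
        exceed += pec > pnec
    return exceed / n

p_risk = risk_probability()
```

For two independent lognormals the answer is also available in closed form, so a sketch like this is mainly useful once censored data, partitioning uncertainty or empirical PDFs replace the textbook distributions.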

  6. A probabilistic approach to crack instability

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  7. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
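The core of such a PDA calculation is simple to sketch: treat each driving parameter as a random variable, propagate it through the physics-based model, and count failures. The limit state and distributions below are illustrative assumptions, not the NASA model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative limit state: failure occurs when the applied load exceeds the
# capacity. Names and distributions are assumptions, not the NASA PDA model.
capacity = rng.normal(loc=100.0, scale=10.0, size=n)
load = rng.normal(loc=60.0, scale=15.0, size=n)
p_fail = float(np.mean(load > capacity))

# Crude sensitivity check: failure probability when load uncertainty grows 20%
load_wide = rng.normal(loc=60.0, scale=18.0, size=n)
p_fail_wide = float(np.mean(load_wide > capacity))
print(p_fail, p_fail_wide)
```

Comparing the two estimates shows the kind of sensitivity information the abstract mentions: widening one input distribution visibly shifts the predicted failure probability.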

  8. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides, the review mainly covers the areas of seismic, hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  9. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  10. Deterministic and probabilistic approach to determine seismic risk of nuclear power plants; a practical example

    International Nuclear Information System (INIS)

    Soriano Pena, A.; Lopez Arroyo, A.; Roesset, J.M.

    1976-01-01

The probabilistic and deterministic approaches for calculating the seismic risk of nuclear power plants are both applied to a particular case in Southern Spain. The results obtained by both methods when varying the input data are presented, and some conclusions are drawn in relation to the applicability of the methods, their reliability and their sensitivity to change.

  11. Probabilistic energy forecasting: Global Energy Forecasting Competition 2014 and beyond

    DEFF Research Database (Denmark)

    Hong, Tao; Pinson, Pierre; Fan, Shu

    2016-01-01

The energy industry has been going through a significant modernization process over the last decade. Its infrastructure is being upgraded rapidly. The supply, demand and prices are becoming more volatile and less predictable than ever before. Even its business model is being challenged fundamentally. In this competitive and dynamic environment, many decision-making processes rely on probabilistic forecasts to quantify the uncertain future. Although most of the papers in the energy forecasting literature focus on point or single-valued forecasts, the research interest in probabilistic energy...

  12. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Directory of Open Access Journals (Sweden)

    Gannouni Asmae

    2017-01-01

closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.

  13. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  14. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  15. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and multi-area representation of the transmission network. (UK)
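The convolution-based second approach can be illustrated compactly: the probability mass function of the sum of independent discrete area loads is the numerical convolution of their individual pmfs. The load levels and probabilities below are illustrative assumptions:

```python
import numpy as np

# Illustrative discrete load pmfs for two areas of the network
# (loads of 0, 100 and 200 MW with the given probabilities).
pmf_area1 = np.array([0.2, 0.5, 0.3])
pmf_area2 = np.array([0.1, 0.6, 0.3])

# Numerical convolution yields the pmf of the total load (0..400 MW),
# from which expected use-of-system costs can be evaluated state by state.
pmf_total = np.convolve(pmf_area1, pmf_area2)
print(pmf_total)
```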

  16. Assessing dynamic postural control during exergaming in older adults: A probabilistic approach.

    Science.gov (United States)

    Soancatl Aguilar, V; Lamoth, C J C; Maurits, N M; Roerdink, J B T M

    2018-02-01

    Digital games controlled by body movements (exergames) have been proposed as a way to improve postural control among older adults. Exergames are meant to be played at home in an unsupervised way. However, only few studies have investigated the effect of unsupervised home-exergaming on postural control. Moreover, suitable methods to dynamically assess postural control during exergaming are still scarce. Dynamic postural control (DPC) assessment could be used to provide both meaningful feedback and automatic adjustment of exergame difficulty. These features could potentially foster unsupervised exergaming at home and improve the effectiveness of exergames as tools to improve balance control. The main aim of this study is to investigate the effect of six weeks of unsupervised home-exergaming on DPC as assessed by a recently developed probabilistic model. High probability values suggest 'deteriorated' postural control, whereas low probability values suggest 'good' postural control. In a pilot study, ten healthy older adults (average 77.9, SD 7.2 years) played an ice-skating exergame at home half an hour per day, three times a week during six weeks. The intervention effect on DPC was assessed using exergaming trials recorded by Kinect at baseline and every other week. Visualization of the results suggests that the probabilistic model is suitable for real-time DPC assessment. Moreover, linear mixed model analysis and parametric bootstrapping suggest a significant intervention effect on DPC. In conclusion, these results suggest that unsupervised exergaming for improving DPC among older adults is indeed feasible and that probabilistic models could be a new approach to assess DPC. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications.
The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
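The Bayesian combination step described in this record can be sketched generically as naive-Bayes fusion of per-station conditional probabilities. This is an illustration of the idea only, not the published ProbDet formulation:

```python
import numpy as np

def fuse_station_probabilities(p_station, prior=0.5):
    """Combine per-station conditional event probabilities into a single
    network-level probability with a naive-Bayes (independent-evidence)
    formulation. A generic sketch, not the published ProbDet algorithm."""
    p = np.asarray(p_station, dtype=float)
    prior_log_odds = np.log(prior / (1.0 - prior))
    # posterior log-odds = prior log-odds + sum of per-station evidence
    log_odds = prior_log_odds + np.sum(np.log(p / (1.0 - p)) - prior_log_odds)
    return float(1.0 / (1.0 + np.exp(-log_odds)))

print(fuse_station_probabilities([0.9, 0.8, 0.7]))
```

Working in log-odds makes the evidence from many stations additive, so stations with no detection (probabilities below the prior) correctly pull the network-level probability down instead of being ignored.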

  18. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance

  19. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

Feb 7, 2018 ... consists of three processes: map learning (ML), localization and PP [73–76]. (i) ML ... [83] Thrun S 2001 A probabilistic online mapping algorithm for teams of ... for target tracking using fuzzy logic controller in game theoretic ...

  20. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing them against Support Vector Machines (SVM) and Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...

  1. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, environment changes or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
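The mapping from qualitative expert judgements to triangular fuzzy numbers, together with a simple centroid defuzzification, can be sketched as follows. The word-to-number scale is an illustrative assumption, not the scale used in the cited study:

```python
# Illustrative word-to-number scale: each qualitative failure possibility is
# represented by a triangular fuzzy number (a, b, c) on [0, 1].
FUZZY_SCALE = {
    "very low":  (0.0, 0.04, 0.08),
    "low":       (0.07, 0.13, 0.19),
    "medium":    (0.35, 0.50, 0.65),
    "high":      (0.81, 0.87, 0.93),
    "very high": (0.92, 0.96, 1.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

score = centroid(FUZZY_SCALE["medium"])
print(score)
```

In a full analysis, the defuzzified possibility score would still have to be mapped to a failure probability (e.g. through a possibility-to-probability transformation) before entering the fault tree quantification.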

  2. Probabilistic models for neural populations that naturally capture global coupling and criticality.

    Science.gov (United States)

    Humplik, Jan; Tkačik, Gašper

    2017-09-01

    Advances in multi-unit recordings pave the way for statistical modeling of activity patterns in large neural populations. Recent studies have shown that the summed activity of all neurons strongly shapes the population response. A separate recent finding has been that neural populations also exhibit criticality, an anomalously large dynamic range for the probabilities of different population activity patterns. Motivated by these two observations, we introduce a class of probabilistic models which takes into account the prior knowledge that the neural population could be globally coupled and close to critical. These models consist of an energy function which parametrizes interactions between small groups of neurons, and an arbitrary positive, strictly increasing, and twice differentiable function which maps the energy of a population pattern to its probability. We show that: 1) augmenting a pairwise Ising model with a nonlinearity yields an accurate description of the activity of retinal ganglion cells which outperforms previous models based on the summed activity of neurons; 2) prior knowledge that the population is critical translates to prior expectations about the shape of the nonlinearity; 3) the nonlinearity admits an interpretation in terms of a continuous latent variable globally coupling the system whose distribution we can infer from data. Our method is independent of the underlying system's state space; hence, it can be applied to other systems such as natural scenes or amino acid sequences of proteins which are also known to exhibit criticality.

  3. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct to current approaches, the proposed probabilistic (DHP) AC method takes uncertainties of forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  4. Overview of the probabilistic risk assessment approach

    International Nuclear Information System (INIS)

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table

  5. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.

    2012-01-01

A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
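Integrating a bivariate Gaussian density over a circle that is not centered on the ellipse has no simple closed form, but the quantity is easy to approximate by Monte Carlo. A sketch of the idea only (the operational method integrates the density numerically, and the function below assumes an axis-aligned ellipse):

```python
import numpy as np

rng = np.random.default_rng(7)

def p_within_radius(dx, dy, sx, sy, radius, n=200_000):
    """Monte Carlo estimate of the probability that the true stroke location
    falls within `radius` of a facility offset by (dx, dy) from the centre of
    an axis-aligned bivariate Gaussian error ellipse with std devs (sx, sy).
    Illustrative sketch; real error ellipses may be rotated."""
    x = rng.normal(0.0, sx, n)
    y = rng.normal(0.0, sy, n)
    return float(np.mean((x - dx) ** 2 + (y - dy) ** 2 <= radius ** 2))

# Facility 2 km east of the ellipse centre, 1-sigma errors of 1.5 x 1.0 km
print(p_within_radius(2.0, 0.0, 1.5, 1.0, 1.0))
```

For the special case of a circular Gaussian centered on the facility, the estimate can be checked against the Rayleigh-distribution result 1 − exp(−R²/2σ²).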

  6. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms including common cause/common mode effects and time dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a state vector set of first order, linear, inhomogeneous, differential equations describing the time dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems
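A minimal instance of such a state-vector model is the two-state (up/down) repairable component, whose probabilities obey a pair of first-order linear differential equations. The rates below are illustrative assumptions:

```python
import numpy as np

# Two-state Markov model of a repairable component (up/down), a minimal
# instance of the state-vector equations dP/dt = A P. Rates are illustrative.
lam, mu = 1e-3, 1e-1      # failure and repair rates (per hour)
dt, t_end = 0.1, 200.0    # time step and horizon (hours)

p_up, p_down = 1.0, 0.0
for _ in range(int(t_end / dt)):
    flow = (-lam * p_up + mu * p_down) * dt  # net probability flow into "up"
    p_up, p_down = p_up + flow, p_down - flow

# Compare against the analytical steady-state unavailability
steady_unavailability = lam / (lam + mu)
print(p_down, steady_unavailability)
```

Larger fault trees lead to the same structure with more states: a vector of state probabilities driven by a constant rate matrix, solved by any standard ODE integrator.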

  7. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  8. CANDU type fuel behavior evaluation - a probabilistic approach

    International Nuclear Information System (INIS)

    Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.

    1995-01-01

    In order to realistically assess the behavior of the fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior in steady state conditions. The Response Surface Method (RSM) has been selected for the investigation of the effects of the variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A new developed version of the probabilistic code APMESRA based on RSM is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)

  9. Probabilistic structural integrity of reactor vessel under pressurized thermal shock

    International Nuclear Information System (INIS)

    Myung Jo Hhung; Young Hwan Choi; Hho Jung Kim; Changheui Jang

    2005-01-01

Performed here is a comparative assessment study of the probabilistic fracture mechanics approach to the pressurized thermal shock of the reactor pressure vessel. A round robin consisting of 1 prerequisite study and 5 cases for probabilistic approaches is proposed, and all interested organizations are invited. The problems are solved and their results are compared in order to issue recommendations on best practices in this area and to ensure an understanding of the key parameters of this type of approach, which will be useful in the justification, through a probabilistic approach, of the case of a plant exceeding the screening criteria. Six participants from 3 organizations in Korea responded to the problem, and their results are compiled in this study. (authors)

  10. Evaluation of diffusion-tensor imaging-based global search and tractography for tumor surgery close to the language system.

    Directory of Open Access Journals (Sweden)

    Mirco Richter

    Full Text Available Pre-operative planning and intra-operative guidance in neurosurgery require detailed information about the location of functional areas and their anatomo-functional connectivity. In particular, regarding the language system, post-operative deficits such as aphasia can be avoided. By combining functional magnetic resonance imaging and diffusion tensor imaging, the connectivity between functional areas can be reconstructed by tractography techniques that need to cope with limitations such as limited resolution and low anisotropic diffusion close to functional areas. Tumors pose particular challenges because of edema, displacement effects on brain tissue and infiltration of white matter. Under these conditions, standard fiber tracking methods reconstruct pathways of insufficient quality. Therefore, robust global or probabilistic approaches are required. In this study, two commonly used standard fiber tracking algorithms, streamline propagation and tensor deflection, were compared with a previously published global search, Gibbs tracking and a connection-oriented probabilistic tractography approach. All methods were applied to reconstruct neuronal pathways of the language system of patients undergoing brain tumor surgery, and control subjects. Connections between Broca and Wernicke areas via the arcuate fasciculus (AF and the inferior fronto-occipital fasciculus (IFOF were validated by a clinical expert to ensure anatomical feasibility, and compared using distance- and diffusion-based similarity metrics to evaluate their agreement on pathway locations. For both patients and controls, a strong agreement between all methods was observed regarding the location of the AF. In case of the IFOF however, standard fiber tracking and Gibbs tracking predominantly identified the inferior longitudinal fasciculus that plays a secondary role in semantic language processing. In contrast, global search resolved connections in almost every case via the IFOF which

  11. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed, with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
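A one-zone version of such a stochastic heat balance can be sketched as an Ornstein–Uhlenbeck-type SDE integrated with the Euler–Maruyama scheme; all parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# One-zone stochastic heat balance, dT = (-(T - T_out)/tau + q) dt + sigma dW,
# integrated with the Euler-Maruyama scheme. All parameters are illustrative.
tau, t_out, q, sigma = 2.0, 10.0, 6.0, 0.3   # h, deg C, K/h, K/sqrt(h)
dt, steps = 0.01, 10_000                     # 100 h of simulated time

temp = 20.0
history = np.empty(steps)
for i in range(steps):
    temp += (-(temp - t_out) / tau + q) * dt + sigma * np.sqrt(dt) * rng.normal()
    history[i] = temp

# The long-run mean settles near T_out + q * tau (22 deg C here), while the
# fluctuations around it reflect the random loads.
print(history[-1000:].mean())
```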

  12. Various Approaches to Globalization of Women's Rights

    Directory of Open Access Journals (Sweden)

    محمد تقی رفیعی

    2017-03-01

Full Text Available Globalization is an undeniable fact; however, given its complex dimensions, assessing the concept is greatly difficult. A boundless world has always been discussed in globalization; thus, any discussion on globalizing women's rights, which is associated with the culture, tradition and moral values of a society, is controversial and momentous. First, regarding the terminology of globalization, the concept of globalization will be clarified. Then, three approaches (traditional, reformational and religious modernist), which approach the globalization of women's rights differently, will be defined and examined. These approaches recognize the globalization of women's rights as a way to achieve the same rules that are enshrined in the Convention on the Elimination of All Forms of Discrimination against Women (CEDAW). Of course, all these approaches have religious attitudes toward the discussion, and atheist opinions are not subject to this study. Finally, it seems that, in conformity with the viewpoint derived from the new religious modernists' perspective and the great concern of the reformative approach with respect to protecting religious values, new mechanisms can be designed which are not harmful to religious foundations on the one hand, and pave the way for globalizing women's rights on the other.

  13. Probabilistic approach to requalification of existing NPPs under aircraft crash loading

    International Nuclear Information System (INIS)

    Birbraer, A.N.; Roleder, A.J.; Shulman, G.S.

    1993-01-01

A probabilistic approach to the analysis of NPP safety under aircraft impact is discussed. It may be used both for the requalification of existing NPPs and in the process of NPP design. The NPP is considered as a system of components: structures, pipes, different kinds of equipment, soil, and foundation. Exceeding the limit probability of the release of radioactive products out of containment (i.e., non-fulfilment of the NPP safety requirements) is taken as the system failure criterion. An example of an event tree representing the sequence of events causing the failure is given. Methods for estimating the probabilities of the elementary events, from which the composite probability of failure is evaluated, are described. (author)

  14. Combinations of probabilistic and approximate quantum cloning and deleting

    International Nuclear Information System (INIS)

    Qiu Daowen

    2002-01-01

    We first construct a probabilistic and approximate quantum cloning machine (PACM) and then clarify the relation between the PACM and other cloning machines. After that, we estimate the global fidelity of the approximate cloning that improves the previous estimation for the deterministic cloning machine; and also derive a bound on the success probability of producing perfect multiple clones. Afterwards, we further establish a more generalized probabilistic and approximate cloning and deleting machine (PACDM) and discuss the connections of the PACDM to some of the existing quantum cloning and deleting machines. Finally the global fidelity and a bound on the success probability of the PACDM are obtained. Summarily, the quantum devices established in this paper improve and also greatly generalize some of the existing machines

  15. A probabilistic approach to rock mechanical property characterization for nuclear waste repository design

    International Nuclear Information System (INIS)

    Kim, Kunsoo; Gao, Hang

    1996-01-01

    A probabilistic approach is proposed for the characterization of host rock mechanical properties at the Yucca Mountain site. This approach helps define the probability distribution of rock properties by utilizing extreme value statistics and Monte Carlo simulation. We analyze mechanical property data of tuff obtained by the NNWSI Project to assess the utility of the methodology. The analysis indicates that laboratory-measured strength and deformation data of Calico Hills and Bullfrog tuffs follow an extremal probability distribution (the third type asymptotic distribution of the smallest values). Monte Carlo simulation is carried out to estimate rock mass deformation moduli using a one-dimensional tuff model proposed by Zimmermann and Finley. We suggest that the results of these analyses be incorporated into the repository design.
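The two steps of the analysis, fitting an extreme value distribution to laboratory data and propagating it by Monte Carlo simulation, can be sketched as follows. The third asymptotic distribution of smallest values corresponds to a Weibull law; the strength data below are synthetic stand-ins, not the NNWSI measurements:

```python
import math, random

random.seed(0)

# Synthetic "measured" tuff strengths (MPa); illustrative stand-ins only.
strengths = sorted(random.weibullvariate(120.0, 4.0) for _ in range(50))

# Linearised Weibull fit: ln(-ln(1 - F_i)) = k*ln(x_i) - k*ln(lam),
# with median-rank plotting positions F_i = (i - 0.3)/(n + 0.4).
n = len(strengths)
xs = [math.log(x) for x in strengths]
ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
mx, my = sum(xs) / n, sum(ys) / n
k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
lam = math.exp(mx - my / k)

# Monte Carlo propagation: sample strengths from the fitted distribution
# and estimate a 5th-percentile design value.
samples = sorted(random.weibullvariate(lam, k) for _ in range(10_000))
p5 = samples[int(0.05 * len(samples))]
print(f"shape k={k:.2f}, scale lam={lam:.1f} MPa, 5th percentile={p5:.1f} MPa")
```

In the paper the Monte Carlo step feeds a one-dimensional tuff model rather than resampling strengths directly, but the propagation idea is the same.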

  16. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  17. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    Full Text Available The extraction of the breast boundary is crucial for further analysis of mammograms. Methods to extract the breast boundary can be classified into two categories: those based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to find the optimal threshold value automatically using histogram information. On the other hand, active contour models require a starting point close to the actual boundary to be able to extract it successfully. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method achieves 38% and 50% improvement over the results obtained by the active contour model and threshold-based methods, respectively, and it increases the stability of the boundary extraction process by up to 86%.
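Local binary patterns, which the method uses to describe the texture around each pixel, can be computed in a few lines. This is a generic 3x3 LBP sketch; the paper's exact LBP variant, radius and neighbour count are not specified in the abstract:

```python
# Minimal 3x3 local binary pattern: neighbours brighter than or equal to the
# centre contribute a 1 bit, clockwise from the top-left neighbour.
def lbp_code(image, r, c):
    """8-bit LBP code for the pixel at row r, column c."""
    centre = image[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if image[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

# Tiny example: one neighbour (top-left) is brighter than the centre.
patch = [[200, 10, 10],
         [ 10, 90, 10],
         [ 10, 10, 10]]
print(lbp_code(patch, 1, 1))  # only bit 0 set -> 1
```

Histograms of such codes over a neighbourhood give the per-pixel texture descriptor that the probability model then operates on.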

  18. Registration of indoor TLS data: in favor of a probabilistic approach initialized by geo-location

    International Nuclear Information System (INIS)

    Hullo, Jean-Francois

    2013-01-01

    Many pre-maintenance operations in industrial facilities currently rely on three-dimensional CAD models. These models are acquired from point clouds measured by Terrestrial Laser Scanning (TLS). When the scenes are complex, several scanning viewpoints, also known as stations, are necessary to ensure the completeness and density of the survey data. The generation of a global point cloud, i.e. the expression of all the acquired data in a common reference frame, is a crucial step called registration, during which the pose parameters are estimated. While GNSS systems are now a solution for many outdoor scenes, the registration of indoor TLS data remains a challenge. The goal of this thesis is to improve the acquisition process of TLS data in industrial environments. The aim is to guarantee the precision and accuracy of the acquired data while optimizing on-site acquisition time and protocols by, as often as possible, freeing the operator from the constraints inherent in conventional topographic surveys. In the first part, we review the state of the art of the means and methods used during the acquisition of dense point clouds of complex interior scenes (Part I). In the second part, we study and evaluate the data available for registration: terrestrial laser data, primitive reconstruction algorithms for point clouds, and indoor geo-location systems (Part II). In the third part, we formalize and test a registration algorithm based on the use of matched primitives reconstructed from per-station point clouds (Part III). We finally propose a probabilistic approach for matching primitives, allowing the integration of a priori information and uncertainty into the constraint system used for calculating poses (Part IV). The contributions of our work are as follows: - to take a critical look at current methods of TLS data acquisition in industrial environments, - to evaluate, through experiments, the information

  19. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  20. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  1. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    Science.gov (United States)

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, and the moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune (1970) source model, with direct P- and S-waves propagating in a layered velocity model characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum then depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized by adopting a probabilistic approach to parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and then explore the joint a posteriori probability density function associated with the cost function around this minimum to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (the basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost function. The numerical integration of the pdf finally provides the mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
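The global exploration step, alternating random jumps with deterministic local minimization (basin hopping) over an L2 misfit between observed and modelled log spectra, can be sketched as follows. The spectral model, noise level and crude coordinate-descent minimizer are simplified stand-ins for the authors' implementation:

```python
import math, random

random.seed(1)

# Generalized Brune-type displacement spectrum (log10 amplitude):
# log u(f) = log Omega0 - log10(1 + (f/fc)**gamma); parameters hypothetical.
def model(f, log_omega0, fc, gamma):
    return log_omega0 - math.log10(1.0 + (f / fc) ** gamma)

freqs = [0.5 * i for i in range(1, 60)]
true = (2.0, 4.0, 2.0)  # log Omega0, corner frequency (Hz), high-f decay
data = [model(f, *true) + random.gauss(0.0, 0.02) for f in freqs]

def misfit(p):
    """L2-norm cost between observed and predicted log spectra."""
    return sum((d - model(f, *p)) ** 2 for f, d in zip(freqs, data))

def local_descent(p, step=0.05, iters=200):
    """Crude coordinate descent standing in for the deterministic minimiser."""
    p = list(p)
    for _ in range(iters):
        for i in range(len(p)):
            for delta in (step, -step):
                q = list(p); q[i] += delta
                if q[1] > 0 and misfit(q) < misfit(p):
                    p = q
    return p

def basin_hopping(p0, hops=20, jump=1.0):
    """Alternate random jumps with local descent, keeping the best minimum."""
    best = local_descent(p0)
    for _ in range(hops):
        start = [x + random.gauss(0.0, jump) for x in best]
        start[1] = max(start[1], 0.1)          # keep the corner frequency positive
        cand = local_descent(start)
        if misfit(cand) < misfit(best):
            best = cand
    return best

fit = basin_hopping([0.0, 1.0, 1.0])
print([round(x, 2) for x in fit])  # close to the true (2.0, 4.0, 2.0)
```

In the paper the local step is a proper deterministic minimization and the pdf around the minimum is then mapped on a grid; the sketch only shows the hopping structure.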

  2. Validation of the probabilistic approach for the analysis of PWR transients

    International Nuclear Information System (INIS)

    Amesz, J.; Francocci, G.F.; Clarotti, C.

    1978-01-01

    This paper reviews the pilot study at present being carried out on the validation of probabilistic methodology with real data coming from the operational records of the PWR power station at Obrigheim (KWO, Germany), operating since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology against the a posteriori analysis of transients that actually occurred at the power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above-mentioned initiating events. The paper describes the a priori calculations performed using fault-tree analysis by means of a probabilistic code (SALP 3) and event trees coupled with a PWR system deterministic computer code (LOOP 7). Finally, the principal results of these analyses are presented and critically reviewed.

  3. A probabilistic approach to the management of multi-stage multicriteria process

    Directory of Open Access Journals (Sweden)

    Yu. V. Bugaev

    2017-01-01

    Full Text Available Currently, any production process is viewed as the primary means of profit and competitiveness; in other words, the process approach has become dominant. In this approach, production of the final product appears as a network of interconnected processing steps during which inputs are converted into outputs, and a stable, accurately executed process most efficiently and cost-effectively provides the planned quality. An example is the organization of bread production, where classical technology is being revived to improve the palatability of bread, enhance its flavor, and prolong its freshness. Baking is a process that must be controlled in order to obtain the required quality parameters of the final product. One of the new and promising methods of quality management for such processes is a probabilistic method that determines the increase in the probability of releasing quality products within the resources allocated for measures to improve the quality level. The paper applies a quality management concept based on a probabilistic approach to the multi-step process, in which the probability of releasing high-quality products is adopted as one of the main criteria. However, it is obvious that the implementation of measures for its improvement requires certain resources and, first of all, is inevitably associated with certain cash costs. Thus, we arrive at an optimal control problem with at least two criteria: the probability of a qualitatively completed process, which should be maximized, and the total cost of the corrective measures, which should be minimized. The authors developed an idealized model of optimal control for the case when a single measure affects only a single step. A special case of the vector Warshall-Floyd algorithm was used to optimize the structure of the multi-step process. The use of vector optimization on graphs allowed the authors to
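The vector optimization on graphs mentioned above can be illustrated with a Pareto-label variant of the Floyd-Warshall algorithm that keeps, for every pair of nodes, all routes that are nondominated in (success probability, cost). This is a sketch of the general technique, not the authors' specific algorithm, and the process graph is invented:

```python
# Each edge carries (success probability, cost); along a path, probabilities
# multiply and costs add. Only Pareto-nondominated labels (higher probability,
# lower cost) are kept per node pair.
def dominates(a, b):
    return a[0] >= b[0] and a[1] <= b[1] and a != b

def prune(labels):
    return [x for x in labels if not any(dominates(y, x) for y in labels)]

def vector_floyd_warshall(n, edges):
    # labels[i][j] is a list of (probability, cost) pairs for i -> j paths
    labels = [[[] for _ in range(n)] for _ in range(n)]
    for i, j, p, c in edges:
        labels[i][j].append((p, c))
    for k in range(n):
        for i in range(n):
            for j in range(n):
                combined = [(p1 * p2, c1 + c2)
                            for p1, c1 in labels[i][k]
                            for p2, c2 in labels[k][j]]
                labels[i][j] = prune(labels[i][j] + combined)
    return labels

# Two routes through a three-step process: cheap but risky vs. costly but safe.
edges = [(0, 1, 0.9, 10.0), (1, 2, 0.8, 5.0),
         (0, 2, 0.6, 8.0)]
result = vector_floyd_warshall(3, edges)
print([(round(p, 2), c) for p, c in sorted(result[0][2])])  # [(0.6, 8.0), (0.72, 15.0)]
```

Both labels survive because neither route dominates the other, which is exactly the trade-off between completion probability and corrective-measure cost described in the abstract.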

  4. On the Efficiency of Local and Global Communication in Modular Robots

    DEFF Research Database (Denmark)

    Garcia, Ricardo Franco Mendoza; Schultz, Ulrik Pagh; Støy, Kasper

    2009-01-01

    As exchange of information is essential to modular robots, deciding between local and global communication is a common design choice. This choice, however, still lacks theoretical support. In this paper we analyse the efficiency of local and global communication in modular robots. To this end, we use parameters to describe the topology of modular robots, develop a probabilistic model of local communication using these parameters and, using a model of global communication from the literature, compare the transmission times of local and global communication in different robots. Based on our results, we conclude that global communication is convenient for centralized control approaches and local communication is convenient for distributed control approaches. In addition, we conclude that global communication is in general convenient for low-connectivity configurations, such as chains, trees or limbs...

  5. An integrated approach to the probabilistic assessments of aircraft strikes and structural mode of damages to nuclear power plants

    International Nuclear Information System (INIS)

    Godbout, P.; Brais, A.

    1975-01-01

    The possibilities of an aircraft striking a Canadian nuclear power plant in the vicinity of an airport and of inducing structural failure modes have been evaluated. This evaluation, together with other studies, may support decisions in the development of general criteria for the siting of reactors near airports. The study used a probabilistic approach for the assessment and drew judiciously on the finite Canadian, French, German, American and English resources that were available. The tools, techniques and methods used form what may be called an integrated approach. This approach requires that the study be made in six consecutive steps: the qualitative evaluation of an aircraft strike on a site near an airport using the logic model technique; the gathering of statistical data on aircraft movements and accidents; the evaluation of the probability distributions and calculation of the basic event probabilities; the evaluation of the probability of an aircraft strike and the application of a sensitivity approach; the generation of the probability density distribution versus strike impact energy, that is, the evaluation of the energy envelope; and the probabilistic evaluation of the structural failure modes induced.

  6. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    Science.gov (United States)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial to reducing property damage and loss of life in future earthquakes. For this purpose, a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and Zagros, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate
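The Newmark-displacement step can be illustrated with the widely used global regression of Jibson and co-workers (1998); the region-specific formulae derived for Alborz - Central Iran and Zagros are not reproduced in the abstract, so the coefficients below are the published global ones and should be treated as an illustration:

```python
import math

# Illustrative Newmark-displacement estimate from the Jibson et al. (1998)
# global regression: log10 Dn = 1.521*log10(Ia) - 1.993*log10(ac) - 1.546,
# with Dn in cm, Arias intensity Ia in m/s and critical acceleration ac in g.
def newmark_displacement(ia_ms, ac_g):
    return 10.0 ** (1.521 * math.log10(ia_ms) - 1.993 * math.log10(ac_g) - 1.546)

# A weak slope (ac = 0.1 g) shaken at Ia = 2 m/s:
dn = newmark_displacement(2.0, 0.1)
print(f"Dn = {dn:.1f} cm")  # roughly 8 cm
```

In the hazard procedure, such displacements are compared with a failure threshold to turn the shaking exceedance probabilities into probabilities of slope failure.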

  7. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
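A Monte Carlo LCOE calculation with correlated inputs, of the general kind described above, can be sketched as follows. All cost figures, distributions and the correlation coefficient are invented for illustration; the effect shown is that positive correlation between inputs widens the LCOE distribution (mean shifts, as reported in the paper, arise once the model is nonlinear in the correlated inputs):

```python
import math, random

random.seed(42)

# Invented plant data: lifetime (years), discount rate, annual output (MWh).
LIFETIME, RATE, OUTPUT = 30, 0.05, 500_000.0

def lcoe(capex, annual_om):
    """Levelized cost of electricity in $/MWh for constant annual O&M."""
    disc = sum(1.0 / (1.0 + RATE) ** t for t in range(1, LIFETIME + 1))
    return (capex + annual_om * disc) / (OUTPUT * disc)

def draw(correlated):
    """One LCOE sample with capex and O&M either independent or correlated."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    rho = 0.8 if correlated else 0.0
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2   # 2-variable Cholesky step
    capex = 1.5e9 * math.exp(0.2 * z1)             # lognormal around $1.5 bn
    om = 30e6 * math.exp(0.15 * z2)                # lognormal around $30 m/yr
    return lcoe(capex, om)

for label, corr in (("independent", False), ("correlated", True)):
    vals = [draw(corr) for _ in range(20_000)]
    mean = sum(vals) / len(vals)
    sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
    print(f"{label}: mean {mean:.1f}, sd {sd:.2f} $/MWh")
```

The correlated case shows a visibly larger standard deviation, which is the simplest form of the endogeneity effect the paper controls for.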

  8. All-possible-couplings approach to measuring probabilistic context.

    Directory of Open Access Journals (Sweden)

    Ehtibar N Dzhafarov

    Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

  9. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the existing uncertainties in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although an analytical approach can reduce the amount of computation in probabilistic SSSA, the different degrees of approximation that are adopted lead to deceptive results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)
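The two methodologies compared in the paper can be sketched for a toy response function: a symmetric two-point estimate scheme evaluates the model at 2n concentration points (here mu +/- sqrt(n)*sigma with equal weights, one common form of Hong's method for inputs without skewness), while Monte Carlo samples the inputs directly. The response function and input statistics below are invented stand-ins for a damping-ratio-like quantity:

```python
import math, random

# Invented nonlinear response of two uncertain inputs (e.g. load levels).
def response(x1, x2):
    return 0.05 - 0.01 * x1 ** 2 + 0.02 * x2

MU, SIGMA = (1.0, 0.5), (0.2, 0.1)
N = 2  # number of uncertain inputs

def two_point_mean():
    """2n-point estimate: shift one input at a time to mu +/- sqrt(n)*sigma."""
    total = 0.0
    for k in range(N):
        for sign in (+1, -1):
            point = list(MU)
            point[k] = MU[k] + sign * math.sqrt(N) * SIGMA[k]
            total += response(*point) / (2 * N)
    return total

def monte_carlo_mean(n=200_000, seed=3):
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        acc += response(rng.gauss(MU[0], SIGMA[0]), rng.gauss(MU[1], SIGMA[1]))
    return acc / n

print(f"2PEM: {two_point_mean():.5f}  MC: {monte_carlo_mean():.5f}")
```

For this quadratic response both estimates agree with the analytical mean (0.0496); the discrepancies the paper reports appear for responses whose higher-order behaviour the point-estimate scheme cannot capture.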

  10. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book gives readers insight into evaluation in probabilistic terms and into the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management, such as failure modes and effects analysis and productivity analysis, together with explanations of the application of the concepts involved, this book provides numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  11. A probabilistic approach of sum rules for heat polynomials

    International Nuclear Information System (INIS)

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)

  12. Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students

    Science.gov (United States)

    Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.

    2011-01-01

    This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…

  13. Basic Ideas to Approach Metastability in Probabilistic Cellular Automata

    NARCIS (Netherlands)

    Cirillo, Emilio N. M.; Nardi, Francesca R.; Spitoni, Cristian

    2016-01-01

    Cellular Automata are discrete-time dynamical systems on a spatially extended discrete space which provide paradigmatic examples of nonlinear phenomena. Their stochastic generalizations, i.e., Probabilistic Cellular Automata, are discrete-time Markov chains on a lattice with finite single-cell

  14. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances, which are then processed with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
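The Monte Carlo decomposition step, sampling concrete graph instances from per-edge existence probabilities, can be sketched as follows (the three-edge graph is invented; the real pipeline feeds each sampled instance to the layout algorithm rather than just tallying edges):

```python
import random

# Probabilistic graph: each edge exists independently with its own probability.
EDGE_PROBS = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.2}

def sample_instance(rng):
    """One possible instance of the probabilistic graph."""
    return [e for e, p in EDGE_PROBS.items() if rng.random() < p]

rng = random.Random(0)
samples = [sample_instance(rng) for _ in range(10_000)]

# Empirical edge frequencies across instances approach the edge probabilities.
freq = {e: sum(e in s for s in samples) / len(samples) for e in EDGE_PROBS}
print(freq)
```

Aggregating the layouts of many such instances is what lets the node positions spread out in proportion to the underlying uncertainty.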

  15. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey.

  16. Intermediate probabilistic safety assessment approach for safety critical digital systems

    International Nuclear Information System (INIS)

    Taeyong, Sung; Hyun Gook, Kang

    2001-01-01

    Even though conventional probabilistic safety assessment methods are immature for application to microprocessor-based digital systems, practical needs force their application. In Korea, UCN units 5 and 6 are being constructed and the Korean Next Generation Reactor is being designed using digital instrumentation and control equipment for safety-related functions, and the Korean regulatory body requires probabilistic safety assessment. This paper analyzes the difficulties in the assessment of digital systems and suggests an intermediate framework for evaluating their safety using fault tree models. The framework deals with several important characteristics of digital systems, including software modules and fault-tolerant features. We expect that the analysis results will provide valuable design feedback. (authors)

  17. Setting Ambitious yet Achievable Targets Using Probabilistic Projections: Meeting Demand for Family Planning.

    Science.gov (United States)

    Kantorová, Vladimíra; New, Jin Rou; Biddlecom, Ann; Alkema, Leontine

    2017-09-01

    In 2015, governments adopted 17 internationally agreed goals to ensure progress and well-being in the economic, social, and environmental dimensions of sustainable development. These new goals present a challenge for countries to set empirical targets that are ambitious yet achievable and that can account for different starting points and rates of progress. We used probabilistic projections of family planning indicators, based on a global data set and Bayesian hierarchical modeling, to generate illustrative targets at the country level. Targets were defined as the percentage of demand for family planning satisfied with modern contraceptive methods where a country has at least a 10 percent chance of reaching the target by 2030. National targets for 2030 ranged from below 50 percent of demand satisfied with modern contraceptives (for three countries in Africa) to above 90 percent (for 41 countries from all major areas of the world). The probabilistic approach also identified countries for which a global fixed target value of 75 percent demand satisfied was either unambitious or has little chance of achievement. We present the web-based Family Planning Estimation Tool (FPET) enabling national decision makers to compute and assess targets for meeting family planning demand. © 2017 The Population Council, Inc.
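A target with "at least a 10 percent chance" of being reached corresponds to the 90th percentile of a country's projected distribution. A minimal sketch with stand-in projections (the real ones come from the Bayesian hierarchical model, not a normal distribution):

```python
import random

random.seed(5)

# Stand-in projected 2030 values: percent of demand for family planning
# satisfied with modern methods, capped at 100.
projections = sorted(min(100.0, random.gauss(70.0, 8.0)) for _ in range(4000))

def target(projs, chance=0.10):
    """Largest value that at least `chance` of the sorted trajectories reach."""
    return projs[int((1.0 - chance) * len(projs))]

t = target(projections)
print(f"illustrative 2030 target: {t:.1f}% of demand satisfied")
```

Raising `chance` lowers the target (more achievable); lowering it raises the target (more ambitious), which is exactly the ambition-versus-achievability dial described above.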

  18. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  19. Probabilistic approach to the prediction of radioactive contamination of agricultural production

    International Nuclear Information System (INIS)

    Fesenko, S.F.; Chernyaeva, L.G.; Sanzharova, N.I.; Aleksakhin, R.M.

    1993-01-01

    The organization of agricultural production on the territory contaminated as a result of the Chernobyl reactor disaster requires prediction of the content of radionuclides in agro-industrial products. Traditional methods of predicting the contamination of products do not give sufficient agreement with actual data, and as a result it is difficult to make the necessary decisions about eliminating the consequences of the disaster in the agro-industrial complex. In many ways this is because the available methods are based on data on the radionuclide content in soils, plants, and plant and animal products. The parameters of the models used in the prediction are also evaluated on the basis of these results. Even if obtained from a single field or herd of livestock, however, such indicators have substantial variation coefficients due to various factors such as the spatial structure of the fallout, the variability of the soil properties, the sampling error, the errors of processing and measuring the samples, as well as the data-averaging error. Consequently, the parameters of radionuclide transfer along the agricultural chains are very variable, considerably reducing the reliability of predicted values. The reliability of the prediction of radioactive contamination of agricultural products can be increased substantially by taking a probabilistic approach involving information about the random laws of contamination of farming land and the statistical features of the parameters of radionuclide migration along food chains. Considering the above, a comparative analysis is made of the results obtained on the basis of the traditional treatment (deterministic in the simplest form) and its probabilistic analog.

  20. A probabilistic approach for optimal sensor allocation in structural health monitoring

    International Nuclear Information System (INIS)

    Azarbayejani, M; Reda Taha, M M; El-Osery, A I; Choi, K K

    2008-01-01

Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial, which makes it important to determine the optimal number and locations of sensors. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network, trained from simulations using a priori knowledge about damage locations and damage severities, to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure

  1. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

The examples discussed in this paper show that reliability analysis methods can be applied fairly well in order to interpret deterministic safety criteria in quantitative terms. For a further improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  2. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

Soil-structure interaction (SSI) is a major step in the seismic design of massive, stiff structures typical of nuclear facilities and of civil infrastructure such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic-based approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  3. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    Energy Technology Data Exchange (ETDEWEB)

    Elkhoraibi, T., E-mail: telkhora@bechtel.com; Hashemi, A.; Ostadan, F.

    2014-04-01

Soil-structure interaction (SSI) is a major step in the seismic design of massive, stiff structures typical of nuclear facilities and of civil infrastructure such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic-based approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  4. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch.

    Science.gov (United States)

    Yurtkuran, Alkın; Emel, Erdal

    2016-01-01

The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
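The acceptance rule described above can be sketched in a few lines of Python. The abstract states only that the probability of accepting a worse candidate decreases nonlinearly during the search; the quadratic schedule and the endpoint values below are illustrative assumptions, not taken from the paper.

```python
import random

def acceptance_probability(t, t_max, p_start=0.4, p_end=0.0):
    # Probability of accepting a WORSE candidate at iteration t.
    # The paper only states that this probability decreases nonlinearly;
    # the quadratic schedule and the endpoints 0.4 -> 0.0 are assumed here.
    frac = t / t_max
    return p_start + (p_end - p_start) * frac ** 2

def accept(old_fitness, new_fitness, t, t_max, rng=random.random):
    # Solution acceptance rule: better candidates are always kept;
    # worse candidates survive with the decaying probability above.
    if new_fitness >= old_fitness:  # maximization convention
        return True
    return rng() < acceptance_probability(t, t_max)
```

Early in the search this keeps diversification high (worse moves are sometimes kept); late in the search it converges toward the greedy selection of the standard ABC.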

  5. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2016-01-01

Full Text Available The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.

  6. Specifying design conservatism: Worst case versus probabilistic analysis

    Science.gov (United States)

    Miles, Ralph F., Jr.

    1993-01-01

Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples are presented that illustrate problems that can arise with a worst-case analysis.
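The contrast can be made concrete with a toy Monte Carlo comparison. The model below, a normally distributed delivered performance checked against a fixed requirement, with all numbers assumed for illustration, is not from the paper: the worst-case margin yields only pass/fail, while the sampled estimate quantifies how likely failure actually is.

```python
import random

def compare_margins(n=100_000, seed=1):
    # Toy model (all numbers assumed): delivered performance X ~ N(mu, sigma),
    # fixed requirement R.  Worst-case analysis checks the 3-sigma point and
    # yields only pass/fail; Monte Carlo estimates P(X < R) directly.
    rng = random.Random(seed)
    mu, sigma, required = 100.0, 5.0, 90.0
    worst_case = mu - 3.0 * sigma                  # 85.0: "fails" worst case
    failures = sum(rng.gauss(mu, sigma) < required for _ in range(n))
    return worst_case, failures / n
```

Here the 3-sigma worst case falls below the requirement, yet the estimated failure probability is only a few percent, which is exactly the information a worst-case analysis cannot provide.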

  7. Specification of test criteria and probabilistic approach: the case of plutonium air transport

    International Nuclear Information System (INIS)

    Hubert, P.; Pages, P.; Ringot, C.; Tomachewsky, E.

    1989-03-01

The safety of international transportation relies on compliance with IAEA regulations, which specify a series of tests that the package is supposed to withstand. For plutonium air transport, some national regulations, notably those of the US, are more stringent than the IAEA ones. For example, the drop test is to be performed at 129 m/s instead of 13.4 m/s. The development of international plutonium exchanges has raised the question of the adequacy of both standards. The purpose of this paper is to show how a probabilistic approach helps in assessing the efficiency of a move towards more stringent tests

  8. Globalization - Different approaches

    Directory of Open Access Journals (Sweden)

    Viorica Puscaciu

    2014-11-01

Full Text Available In this paper we investigate different approaches to the globalization phenomenon. Despite geographical distances, the links between people grow ever stronger along many dimensions and planes: technological, political, economic and cultural. We examine the link between globalization and democracy, and its impact on the most important social and economic matters. We also consider the impact of the internet revolution and its corollary, e-commerce, with its sometimes unpredictable consequences. Another problem analysed is that of governments trying, and sometimes succeeding, to control the money, products, people and ideas that move freely across national frontiers, thereby slowing or halting progress. Nevertheless, this global interaction between people also creates insecurity in different forms: terrorism, trafficking in arms and drugs, and economic aggressions harming the environment, among other inconvenient facts and situations.

  9. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O` Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
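As a generic illustration of the Monte Carlo method mentioned above, the sketch below propagates uncertain inputs through a deliberately simplified steady-state fuel temperature relation. The model form and every number are assumptions made for illustration only, not TRANSURANUS models or data.

```python
import random
import statistics

def propagate_uncertainty(n=50_000, seed=42):
    # Deliberately simplified steady-state centerline temperature relation:
    #   T_center = T_coolant + q' * (R_gap + R_fuel)
    # with the linear power q' and the gap resistance R_gap treated as
    # random inputs.  All values below are assumed, not TRANSURANUS data.
    rng = random.Random(seed)
    t_cool, r_fuel = 560.0, 0.9            # K and K/(W/cm), assumed
    samples = []
    for _ in range(n):
        q = rng.gauss(200.0, 10.0)         # linear power, W/cm
        r_gap = rng.gauss(0.5, 0.05)       # gap thermal resistance, K/(W/cm)
        samples.append(t_cool + q * (r_gap + r_fuel))
    return statistics.mean(samples), statistics.stdev(samples)
```

The output is a distribution of the response rather than a single value, which is what allows uncertainties to be compared against experimental scatter.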

  10. A probabilistic safety assessment of the standard French 900MWe pressurized water reactor. Main report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1990-04-15

To situate the probabilistic safety assessment of standardized 900 MWe units made by the Institute for Nuclear Safety and Protection (IPSN), it is necessary to consider the importance and possible utilization of a study of this type. At the present time, the safety of nuclear installations essentially depends on the application of the defence in-depth approach. The design arrangements adopted are justified by the operating organization on the basis of deterministic studies of a limited number of conventional situations with corresponding safety margins. These conventional situations are grouped in categories by frequency, it being accepted that the greater the consequences the lesser the frequency must be. However in the framework of the analysis performed under the control of the French safety authority, the importance was rapidly recognized of setting an overall reference objective. By 1977, on the occasion of appraisal of the fundamental safety options of the standardized 1300 MWe units, the Central Service for the Safety of Nuclear Installations (SCSIN) set the following global probabilistic objective: 'Generally speaking, the design of installations including a pressurized water nuclear reactor must be such that the global probability of the nuclear unit being the origin of unacceptable consequences does not exceed 10⁻⁶ per year...' Probabilistic analyses making reference to this global objective gradually began to supplement the deterministic approach, both for examining external hazards to be considered in the design basis and for examining the possible need for additional means of countering the failure of doubled systems in application of the deterministic single-failure criterion. A new step has been taken in France by carrying out two level 1 probabilistic safety assessments (calculation of the annual probability of core meltdown), one for the 900 MWe series by the IPSN and the other for the 1300 MWe series by Electricite de France. The objective

  11. A probabilistic safety assessment of the standard French 900MWe pressurized water reactor. Main report

    International Nuclear Information System (INIS)

    1990-04-01

To situate the probabilistic safety assessment of standardized 900 MWe units made by the Institute for Nuclear Safety and Protection (IPSN), it is necessary to consider the importance and possible utilization of a study of this type. At the present time, the safety of nuclear installations essentially depends on the application of the defence in-depth approach. The design arrangements adopted are justified by the operating organization on the basis of deterministic studies of a limited number of conventional situations with corresponding safety margins. These conventional situations are grouped in categories by frequency, it being accepted that the greater the consequences the lesser the frequency must be. However in the framework of the analysis performed under the control of the French safety authority, the importance was rapidly recognized of setting an overall reference objective. By 1977, on the occasion of appraisal of the fundamental safety options of the standardized 1300 MWe units, the Central Service for the Safety of Nuclear Installations (SCSIN) set the following global probabilistic objective: 'Generally speaking, the design of installations including a pressurized water nuclear reactor must be such that the global probability of the nuclear unit being the origin of unacceptable consequences does not exceed 10⁻⁶ per year...' Probabilistic analyses making reference to this global objective gradually began to supplement the deterministic approach, both for examining external hazards to be considered in the design basis and for examining the possible need for additional means of countering the failure of doubled systems in application of the deterministic single-failure criterion. A new step has been taken in France by carrying out two level 1 probabilistic safety assessments (calculation of the annual probability of core meltdown), one for the 900 MWe series by the IPSN and the other for the 1300 MWe series by Electricite de France. The objective of
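For intuition on what an annual probabilistic objective such as 10⁻⁶ per unit-year implies over a plant lifetime, a constant annual probability can be aggregated across years. The independence assumption and the 40-year lifetime below are simplifications for illustration, not part of the assessment itself.

```python
def prob_at_least_one(annual_prob, years):
    # Probability of at least one occurrence over `years`, assuming
    # independent years at a constant annual probability.
    return 1.0 - (1.0 - annual_prob) ** years
```

For the 10⁻⁶ per year objective, a 40-year plant life gives prob_at_least_one(1e-6, 40), about 4 × 10⁻⁵.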

  12. A policy synthesis approach for slowing global warming

    International Nuclear Information System (INIS)

    Timilsina, G.R.

    1996-01-01

Global warming is a pressing environmental issue today, but it is beset by subjective as well as policy conflicts. The findings of various studies indicate that the developed countries that are capable of affording effective measures towards global warming mitigation have fewer incentives for doing so, because they will suffer minimal damage from global warming. The developing countries, although they will suffer greater damage, are unlikely to divert their development budgets to preventive actions against global warming. The only solution in this situation is to design a policy that encourages all the nations of the world to participate in programs for slowing global warming. Without the active participation of all nations, it seems unlikely that the global warming problem can be reduced in an effective way. This study presents a qualitative policy recommendation extracted from a comprehensive analysis of the findings of several studies conducted so far in this field. This study categorizes the policy approaches for mitigating global warming into three groups: an engineering approach, a forestry approach and an economic approach

  13. The Global Tsunami Model (GTM)

    Science.gov (United States)

    Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.

    2017-12-01

The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework for Disaster Risk Reduction (SFDRR), we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.

  14. A probabilistic method for streamflow projection and associated uncertainty analysis in a data sparse alpine region

    Science.gov (United States)

    Ren, Weiwei; Yang, Tao; Shi, Pengfei; Xu, Chong-yu; Zhang, Ke; Zhou, Xudong; Shao, Quanxi; Ciais, Philippe

    2018-06-01

Climate change exerts a profound influence on the regional hydrological cycle and water security in many alpine regions worldwide. Investigating regional climate impacts using watershed-scale hydrological models requires a large amount of input data, such as topography and meteorological and hydrological data. However, data scarcity in alpine regions seriously restricts evaluation of climate change impacts on the water cycle using conventional approaches based on global or regional climate models, statistical downscaling methods and hydrological models. Therefore, this study is dedicated to the development of a probabilistic model to replace the conventional approaches for streamflow projection. The probabilistic model was built upon an advanced Bayesian Neural Network (BNN) approach directly fed by large-scale climate predictor variables and tested in a typical data-sparse alpine region, the Kaidu River basin in Central Asia. Results show that the BNN model performs better than the conventional methods across a number of statistical measures. The BNN method, with flexible model structures given by active indicator functions, which reduce the dependence on the initial specification of the input variables and the number of hidden units, can work well in a data-limited region. Moreover, it can provide more reliable streamflow projections with a robust generalization ability. Forced by the latest bias-corrected GCM scenarios, streamflow projections for the 21st century under three RCP emission pathways were constructed and analyzed. Briefly, the proposed probabilistic projection approach could improve runoff predictive ability over conventional methods, provide better support to water resources planning and management under data-limited conditions, and facilitate climate change impact analysis on runoff and water resources in alpine regions worldwide.

  15. Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis

    2003-01-01

Probabilistic fault diagnosis, which represents the cause-and-consequence relationships between events for troubleshooting, is developed in this research based on Bayesian networks. The contribution of on-line data from sensors and of system/component reliability to the cause nodes is expected to increase the belief level of the Bayesian networks. (author)
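A single cause-to-consequence edge of such a diagnostic Bayesian network reduces to a Bayes update. The sketch below, with made-up numbers, shows how an on-line sensor alarm raises the belief in a fault at a cause node.

```python
def fault_posterior(prior_fault, p_alarm_given_fault, p_alarm_given_ok):
    # One cause -> consequence edge of a diagnostic Bayesian network:
    # updated belief in the fault after the alarm fires.
    # All probabilities here are illustrative, not from the paper.
    p_alarm = (prior_fault * p_alarm_given_fault
               + (1.0 - prior_fault) * p_alarm_given_ok)
    return prior_fault * p_alarm_given_fault / p_alarm
```

With a 1% prior fault probability, a 95% true-alarm rate and a 5% false-alarm rate, the posterior belief rises to roughly 16%, illustrating how on-line evidence sharpens the diagnosis.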

  16. A novel approach for voltage secure operation using Probabilistic Neural Network in transmission network

    Directory of Open Access Journals (Sweden)

    Santi Behera

    2016-05-01

Full Text Available This work proposes a unique approach for improving the voltage stability limit using a Probabilistic Neural Network (PNN) classifier that gives the corrective controls available in the system under contingency scenarios. The sensitivity of the system is analyzed to identify weak buses, with the ENVCI evaluation approaching zero. The inputs to the classifier, termed the voltage stability enhancing neural network (VSENN) classifier, used for training are line flows and bus voltages near the notch point of the P–V curve, and the output of the VSENN is a control variable. For various contingencies the control action that improves the voltage profile as well as the stability index is identified and trained accordingly. The trained VSENN is finally tested for its robustness in improving the load margin and ENVCI, beyond the trained set of operating conditions of the system along with contingencies. The proposed approach is verified on the IEEE 39-bus test system.

  17. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables
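The idea of checking an assessment rectangle against a failure assessment curve can be sketched as follows. The paper derives its own interval FAC (IFAC); as a stand-in, this sketch uses the generic R6 Option-1 curve and exploits the fact that the curve decreases with Lr, so only two corners of the rectangle need checking.

```python
from math import exp

def fac(lr):
    # Generic R6 Option-1 failure assessment curve, used here only as a
    # stand-in for the paper's own interval FAC (IFAC).
    return (1.0 - 0.14 * lr ** 2) * (0.3 + 0.7 * exp(-0.65 * lr ** 6))

def assess_rectangle(lr_interval, kr_interval):
    # Interval assessment: because the curve decreases with Lr, the
    # rectangle is certainly safe if its worst corner (max Lr, max Kr)
    # lies on or below the curve, certainly unsafe if its best corner
    # (min Lr, min Kr) lies above it, and indeterminate otherwise.
    lr_lo, lr_hi = lr_interval
    kr_lo, kr_hi = kr_interval
    if kr_hi <= fac(lr_hi):
        return "safe"
    if kr_lo > fac(lr_lo):
        return "unsafe"
    return "indeterminate"
```

The "indeterminate" outcome is exactly the case the paper resolves quantitatively with its non-probabilistic reliability degree.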

  18. Non-probabilistic defect assessment for structures with cracks based on interval model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiao; Zhou, Changyu, E-mail: changyu_zhou@163.com; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-09-15

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables.

  19. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Kazimierz Banasik

    2014-04-01

Full Text Available The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service (SCS)-CN method in a small agricultural, lowland watershed (23.4 km2) to the gauging station in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of the goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV), normal and general logistic (GLO) distributions for 100-CN and the GLO, lognormal and GEV distributions for S. The characteristics estimated from the theoretical distributions (median, quantiles) were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median, and between the antecedent runoff conditions (ARCs) of Hawkins and Hjelmfelt, which certified a good calibration of the model. However, the division of the CN sample by heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
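The back-calculation of empirical S and CN from rainfall-runoff pairs used above has a standard closed form for the usual initial abstraction Ia = 0.2S. The helper below is a generic sketch of that inversion, not code from the study.

```python
from math import sqrt

def empirical_s_cn(p_mm, q_mm):
    # Closed-form inversion of the SCS-CN runoff equation
    #   Q = (P - 0.2*S)**2 / (P + 0.8*S)
    # for S, valid for the standard initial abstraction Ia = 0.2*S,
    # followed by CN = 25400 / (S + 254) with S in millimetres.
    s = 5.0 * (p_mm + 2.0 * q_mm - sqrt(4.0 * q_mm ** 2 + 5.0 * p_mm * q_mm))
    cn = 25400.0 / (s + 254.0)
    return s, cn
```

Applying this to each recorded event yields the empirical CN sample whose distribution is fitted in the study.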

  20. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost competitive and reliable solutions. But the sector has already demonstrated several proofs of concepts. The design of wave energy...... devices is a new and expanding technical area where there is no tradition for probabilistic design—in fact very little full scale devices has been build to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  1. Mesh Independent Probabilistic Residual Life Prediction of Metallic Airframe Structures, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Engineering and Materials, Inc. (GEM) along with its team members, Clarkson University and LM Aero, propose to develop a mesh independent probabilistic...

  2. xLPR - a probabilistic approach to piping integrity analysis

    International Nuclear Information System (INIS)

    Harrington, C.; Rudland, D.; Fyfitch, S.

    2015-01-01

    The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure, with well-characterized uncertainties, for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as the coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs the necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code, in a fully documented, releasable condition, is presently planned for spring 2015.

  3. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....

  4. Application of the probabilistic method at the E.D.F

    International Nuclear Information System (INIS)

    Gachot, Bernard

    1976-01-01

    Having first evoked the problems arising from the definition of a so-called 'acceptable risk', the probabilistic study programme on safety carried out at the E.D.F. is described. The different aspects of the probabilistic estimation of a hazard are presented, as well as the different steps, i.e. collecting the information and carrying out a quantitative and qualitative analysis, which characterize the probabilistic study of safety problems. The problem of determining data on equipment reliability is considered, noting as a conclusion that, in spite of the limited accuracy of the present data, probabilistic methods already appear to be a highly valuable tool favouring a homogeneous and coherent approach to nuclear plant safety [fr]

  5. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing, they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  6. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  7. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  8. Fully probabilistic design: the way for optimizing of concrete structures

    Directory of Open Access Journals (Sweden)

    I. Laníková

    Full Text Available Some standards for the design of concrete structures (e.g. EC2 and the original ČSN 73 1201-86) allow a structure to be designed by several methods. This contribution documents the fact that even if a structure does not comply with the partial reliability factor method according to EC2, it can satisfy the conditions of the fully probabilistic approach when using the same standard. From an example of the reliability of a prestressed spun concrete pole designed by the partial factor method and the fully probabilistic approach according to the Eurocode, it is evident that an expert should apply a more precise (though unfortunately more complicated) method in the limiting cases. The Monte Carlo method, modified by Latin Hypercube Sampling (LHS), has been used for the calculation of reliability. Ultimate and serviceability limit states were checked for the partial factor method and the fully probabilistic design. As a result of fully probabilistic design it is possible to obtain a more efficient design for a structure.
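The Monte Carlo / LHS reliability calculation mentioned in this record can be illustrated on a toy resistance-load problem; a sketch with assumed normal distributions (the variable names and numbers are illustrative, not values from the paper):

```python
import random
from statistics import NormalDist

def lhs_normal(n, mu, sigma, rng):
    """Latin Hypercube sample of size n from N(mu, sigma): one uniform draw per
    equal-probability stratum, shuffled, then mapped through the inverse CDF.
    The max() guard keeps every point strictly inside (0, 1)."""
    nd = NormalDist(mu, sigma)
    u = [(i + max(rng.random(), 1e-12)) / n for i in range(n)]
    rng.shuffle(u)
    return [nd.inv_cdf(p) for p in u]

rng = random.Random(42)
n = 10_000
R = lhs_normal(n, 30.0, 3.0, rng)  # resistance, e.g. sectional capacity
L = lhs_normal(n, 20.0, 2.0, rng)  # load effect
pf = sum(r < l for r, l in zip(R, L)) / n  # estimated failure probability
```

The exact answer for this toy case is Phi(-10/sqrt(13)), roughly 0.003; stratifying the marginals lets a given sample size resolve small failure probabilities more stably than plain Monte Carlo.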

  9. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    Science.gov (United States)

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.

  10. A Heuristic Probabilistic Approach to Estimating Size-Dependent Mobility of Nonuniform Sediment

    Science.gov (United States)

    Woldegiorgis, B. T.; Wu, F. C.; van Griensven, A.; Bauwens, W.

    2017-12-01

    Simulating the mechanism of bed sediment mobility is essential for modelling sediment dynamics. Although many studies have been carried out on this subject, they use complex mathematical formulations that are computationally expensive and often not easy to implement. In order to present a simple and computationally efficient complement to detailed sediment mobility models, we developed a heuristic probabilistic approach to estimating the size-dependent mobilities of nonuniform sediment based on the pre- and post-entrainment particle size distributions (PSDs), assuming that the PSDs are lognormally distributed. The approach fits a lognormal probability density function (PDF) to the pre-entrainment PSD of bed sediment and uses the threshold particle size of incipient motion and the concept of a sediment mixture to estimate the PSDs of the entrained sediment and the post-entrainment bed sediment. The new approach is simple in a physical sense and significantly reduces the complexity, computation time and resources required by detailed sediment mobility models. It is calibrated and validated with laboratory and field data by comparing to the size-dependent mobilities predicted with the existing empirical lognormal cumulative distribution function (CDF) approach. The novel features of the current approach are: (1) separating the entrained and non-entrained sediments by a threshold particle size, which is a critical particle size of incipient motion modified to account for mixed-size effects, and (2) using the mixture-based pre- and post-entrainment PSDs to provide a continuous estimate of the size-dependent sediment mobility.
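The abstract gives no formulas, but the core idea (a lognormal pre-entrainment PSD split at a threshold size of incipient motion) can be sketched; the function below is a hypothetical illustration of that split, not the authors' calibrated model:

```python
from math import log
from statistics import NormalDist

def mobile_fraction(d, mu, sigma, d_c):
    """CDF of the entrained-sediment PSD at grain size d, for a bed PSD that is
    lognormal (ln-diameter ~ N(mu, sigma)) and a threshold size d_c of incipient
    motion. Simplifying assumption: grains finer than d_c are treated as mobile."""
    nd = NormalDist(mu, sigma)
    F_c = nd.cdf(log(d_c))           # mobile fraction of the whole bed
    if F_c == 0.0:
        return 0.0
    return min(nd.cdf(log(d)), F_c) / F_c
```

The complementary mass, scaled by 1 - F_c, would give the post-entrainment bed PSD, so the pre- and post-entrainment distributions remain consistent as a mixture, which is the mixture concept the record refers to.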

  11. A probabilistic approach for estimating the spatial extent of pesticide agricultural use sites and potential co-occurrence with listed species for use in ecological risk assessments.

    Science.gov (United States)

    Budreski, Katherine; Winchell, Michael; Padilla, Lauren; Bang, JiSu; Brain, Richard A

    2016-04-01

    A crop footprint refers to the estimated spatial extent of growing areas for a specific crop, and is commonly used to represent the potential "use site" footprint for a pesticide labeled for use on that crop. A methodology for developing probabilistic crop footprints to estimate the likelihood of pesticide use and the potential co-occurrence of pesticide use and listed species locations was tested at the national scale and compared to alternative methods. The probabilistic aspect of the approach accounts for annual crop rotations and the uncertainty in remotely sensed crop and land cover data sets. The crop footprints used historically are derived exclusively from the National Land Cover Database (NLCD) Cultivated Crops and/or Pasture/Hay classes. This approach broadly aggregates agriculture into 2 classes, which grossly overestimates the spatial extent of individual crops that are labeled for pesticide use. The approach also does not use all the available crop data, represents a single point in time, and does not account for the uncertainty in land cover data set classifications. The probabilistic crop footprint approach described herein incorporates the best available information at the time of analysis from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) for 5 y (2008-2012 at the time of analysis), the 2006 NLCD, the 2007 NASS Census of Agriculture, and 5 y of NASS Quick Stats (2008-2012). The approach accounts for misclassification of crop classes in the CDL by incorporating accuracy assessment information by state, year, and crop. The NLCD provides additional information to improve the CDL crop probability through an adjustment based on the NLCD accuracy assessment data using the principles of Bayes' Theorem. Finally, crop probabilities are scaled at the state level by comparing against NASS surveys (Census of Agriculture and Quick Stats) of reported planted acres by crop. In an example application of the new method, the probabilistic
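The Bayes'-theorem adjustment this record describes can be sketched generically; the numbers below are illustrative placeholders, not values from the CDL or NLCD accuracy assessments:

```python
def bayes_update(prior, p_obs_given_crop, p_obs_given_other):
    """Posterior probability that a pixel is the labeled crop after observing an
    independent land-cover classification (e.g. an NLCD cultivated-crops label).
    prior: P(crop) implied by the first product's accuracy assessment;
    p_obs_given_*: likelihoods of the observation given crop / not crop, taken
    from the second product's accuracy assessment. All inputs are hypothetical."""
    num = p_obs_given_crop * prior
    return num / (num + p_obs_given_other * (1.0 - prior))

# A 0.80 prior combined with supporting evidence (0.9 vs 0.3 likelihoods)
# rises to roughly 0.92.
posterior = bayes_update(0.80, 0.90, 0.30)
```

Repeating such an update per pixel, then scaling state totals against the NASS planted-acre surveys, mirrors the two adjustment steps described above.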

  12. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  13. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    Full Text Available As an effective aggregation tool, the power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we deeply investigate new power aggregation (PA) operators for fusing probabilistic linguistic term sets (PLTSs). In this paper, we firstly develop the probabilistic linguistic power average (PLPA) and weighted probabilistic linguistic power average (WPLPA) operators, as well as the probabilistic linguistic power geometric (PLPG) and weighted probabilistic linguistic power geometric (WPLPG) operators. At the same time, we carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we further design approaches for the application of multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrative example to expound our proposed methods and verify their performance.
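The power-average idea underlying these operators is easiest to see on crisp numbers; a minimal sketch using the common support measure Sup(a, b) = 1 - |a - b| for values in [0, 1] (the paper's operators generalize this to probabilistic linguistic term sets):

```python
def power_average(values):
    """Power average of crisp values in [0, 1]: each argument's weight grows
    with its total support T(a_i) = sum over j != i of Sup(a_i, a_j), so
    arguments that agree with the others count more and outliers count less."""
    weights = []
    for i, a in enumerate(values):
        T = sum(1.0 - abs(a - b) for j, b in enumerate(values) if j != i)
        weights.append(1.0 + T)
    return sum(w * a for w, a in zip(weights, values)) / sum(weights)
```

With two arguments the supports are symmetric, so power_average([0.2, 0.8]) reduces to the plain mean 0.5, while an outlier added to a tight cluster is down-weighted relative to the arithmetic mean.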

  14. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach.

    Directory of Open Access Journals (Sweden)

    Goker Erdogan

    2015-11-01

    Full Text Available People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models, that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model's percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects' ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly the emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.

  15. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework, and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews the decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility-analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems if optimization of safety is to be successful.

  16. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    Science.gov (United States)

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  17. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke

    2016-04-11

    Nowadays, an increasing number of parts and sub-assemblies are publicly available, and they can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely used parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  18. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke; Hu, Kai-Mo; Yin, Li-Cheng; Yan, Dongming; Wang, Bin

    2016-01-01

    Nowadays, an increasing number of parts and sub-assemblies are publicly available, and they can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely used parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  19. On the evaluation of the efficacy of a smart damper: a new equivalent energy-based probabilistic approach

    International Nuclear Information System (INIS)

    Aly, A M; Christenson, R E

    2008-01-01

    Smart damping technology has been proposed to protect civil structures from dynamic loads. Each application of smart damping control provides varying levels of performance relative to active and passive control strategies. Currently, researchers compare the relative efficacy of smart damping control to active and passive strategies by running numerous simulations. These simulations can require significant computation time and resources. Because of this, it is desirable to develop an approach for assessing the applicability of smart damping technology that requires less computation time. This paper discusses and verifies a probabilistic approach to determining the efficacy of smart damping technology based on clipped optimal state feedback control theory.

  20. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    Science.gov (United States)

    David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera

    2017-04-01

    This study provides guidance that enables hydrological researchers to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of the residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises the Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of the empirical results highlight the importance of capturing the skew/kurtosis of the raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
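The Box-Cox schemes compared in this record share one transform; a minimal sketch of computing a residual in transformed space (variable names are illustrative):

```python
import math

def boxcox(y, lam):
    """Box-Cox transform z = (y**lam - 1)/lam, with the log limit at lam = 0.
    Applied to flows so that larger flows no longer carry larger errors."""
    return math.log(y) if lam == 0.0 else (y ** lam - 1.0) / lam

def transformed_residual(q_obs, q_sim, lam):
    """Residual of a daily flow prediction in Box-Cox space; a heteroscedastic
    raw error becomes roughly homoscedastic for a well-chosen lam (the Pareto
    optimal schemes above use lam = 0.2 and 0.5, or the log scheme lam = 0)."""
    return boxcox(q_obs, lam) - boxcox(q_sim, lam)
```

Characterizing these transformed residuals with a simple error model, then back-transforming, is what turns a deterministic streamflow simulation into a probabilistic prediction.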

  1. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    Science.gov (United States)

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk

  2. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  3. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    International Nuclear Information System (INIS)

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences of most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions of the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.

  4. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
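
The core idea of such a structure-independent check can be sketched as follows (the reference means and standard deviations below are illustrative placeholders, not PANAV's actual residue-specific tables):

```python
# Hypothetical residue-specific CA chemical shift statistics (ppm):
# (mean, standard deviation). Illustrative values only.
REF = {"ALA": (53.1, 2.0), "GLY": (45.4, 1.3), "LEU": (55.7, 2.1)}

def flag_suspect(assignments, z_cut=3.0):
    """Return residue numbers whose CA shift lies > z_cut std devs
    from the residue-specific reference mean."""
    suspects = []
    for res_num, (res_name, shift) in assignments.items():
        mu, sd = REF[res_name]
        if abs(shift - mu) / sd > z_cut:
            suspects.append(res_num)
    return suspects

# Residue 2 carries a GLY CA shift far outside the GLY distribution.
demo = {1: ("ALA", 52.8), 2: ("GLY", 59.0), 3: ("LEU", 55.0)}
hits = flag_suspect(demo)
```

A real validator would also use secondary-structure-specific distributions over short fragments and estimate a global referencing offset, but the z-score screen above is the basic probabilistic test.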

  5. Alternative legal and institutional approaches to global change

    International Nuclear Information System (INIS)

    Thacher, P.S.

    1991-01-01

    The processes of global change currently under way cannot be dealt with in isolation. Factors linked to environmental quality such as demographic growth, economic interdependence and indebtedness, sociopolitical changes, and others must be managed collectively. In looking at the problems of global change, a central question before us is: how comprehensive should a legal regime be in a world of considerable uncertainty, in which everything is interrelated with everything else, and what we do may, or may not, have irreversible consequences for future generations? This article focuses on the problem of global warming to provide a model approach to the larger issues of global change. This reduces the scope of global change to a manageable but representative class of the problems at issue. The author suggests an approach to stabilize global climate by the end of the next century. However, even within this relatively narrow context of stabilizing the climate, a comprehensive approach is needed to address all heat-trapping gases - not just CO2 - to ensure that all human activities generating these gases are managed properly, without causing other problems.

  6. Numerical probabilistic analysis for slope stability in fractured rock masses using DFN-DEM approach

    Directory of Open Access Journals (Sweden)

    Alireza Baghbanan

    2017-06-01

    Full Text Available Due to the existence of uncertainties in the input geometrical properties of fractures, there is no unique solution for assessing the stability of slopes in jointed rock masses. The necessity of applying probabilistic analysis in these cases is therefore inevitable. In this study a probabilistic analysis procedure, together with the relevant algorithms, is developed using the Discrete Fracture Network-Distinct Element Method (DFN-DEM) approach. In the right abutment of Karun 4 dam and downstream of the dam body, five joint sets and one major joint have been identified. According to the geometrical properties of fractures in the Karun river valley, instability situations are probable in this abutment. In order to evaluate the stability of the rock slope, different combinations of joint set geometrical parameters are selected, and a series of numerical DEM simulations are performed on generated and validated DFN models to measure the minimum required support patterns in dry and saturated conditions. Results indicate that the distribution of required bolt length is well fitted by a lognormal distribution in both circumstances. In dry conditions, the calculated mean value is 1125.3 m, and more than 80 percent of models need only 1614.99 m of bolts, which corresponds to a bolt pattern with 2 m spacing and 12 m length. For slopes in saturated conditions, the calculated mean value is 1821.8 m, and more than 80 percent of models need only 2653.49 m of bolts, which is equivalent to a bolt pattern with 15 m length and 1.5 m spacing. Comparison of the numerical results with empirical methods shows that investigating slope stability with different DFN realizations, conducted over different block patterns, is more efficient than the empirical methods.
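
The reporting step (fitting a lognormal to the required-bolt-length samples and reading off the roughly 80th-percentile design demand) can be sketched with synthetic data; the parameters below are invented, not the Karun 4 results:

```python
import math
import random

random.seed(2)

# Synthetic "required bolt length" samples (m), lognormal with
# log-mean 7.0 and log-std 0.4 (illustrative values only).
samples = [random.lognormvariate(7.0, 0.4) for _ in range(5000)]

# Fit the lognormal by moments of the log-transformed data.
logs = [math.log(x) for x in samples]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))

# 80th percentile of the fitted lognormal: exp(mu + z_0.8 * sigma),
# with z_0.8 = 0.8416 the standard normal 0.8 quantile.
z80 = 0.8416
p80 = math.exp(mu + z80 * sigma)

# Fraction of realizations covered by the 80th-percentile design demand.
coverage = sum(x <= p80 for x in samples) / len(samples)
```

The design support pattern is then chosen so its total bolt length exceeds `p80`, covering about 80 percent of DFN realizations.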

  7. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic and damage properties under uniaxial loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fitted to the fatigue lives predicted using the viscoelastic continuum damage approach. It was observed that fatigue life was better described by the Weibull distribution than by the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
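
A 2-parameter Weibull fit of fatigue lives by maximum likelihood can be sketched as follows (synthetic lives with invented parameters, not the paper's test data; the shape equation is solved by bisection):

```python
import math
import random

random.seed(3)

# Synthetic fatigue lives (cycles): Weibull with shape 2.0, scale 1e5.
lives = [random.weibullvariate(1e5, 2.0) for _ in range(2000)]

def weibull_mle(x, tol=1e-8):
    """2-parameter Weibull fit by maximum likelihood.

    The shape k solves g(k) = 0, where g is monotone increasing, so a
    simple bisection suffices; the scale then follows in closed form.
    """
    lx = [math.log(v) for v in x]
    mean_lx = sum(lx) / len(lx)

    def g(k):
        xk = [v ** k for v in x]
        return sum(a * b for a, b in zip(xk, lx)) / sum(xk) - 1.0 / k - mean_lx

    lo, hi = 0.1, 20.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return k, lam

shape, scale = weibull_mle(lives)
```

With the fitted parameters, a probabilistic fatigue curve follows by evaluating Weibull quantiles of fatigue life at each strain level rather than a single deterministic value.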

  8. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of load-carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, whether homogeneous, sandwich or composite, as well as some kinds of equipment. The analysis models can be applied in other engineering fields.

  9. A global approach to risk management: lessons from the nuclear industry

    International Nuclear Information System (INIS)

    Lazo, T.; Kaufer, B.

    2003-01-01

    The industry's nuclear safety experts are continuously striving to minimise the possible risk and extent of a nuclear accident, while nuclear regulatory authorities work to ensure that all safety requirements are met. Relying on a combination of deterministic and probabilistic approaches, they are obtaining positive results in terms of both risk-informed regulation and nuclear safety management. This article addresses this aspect of risk management, as well as the management of radiation exposure risk. It looks into nuclear emergency planning, preparedness and management, and stresses the importance of coordinating potential protection approaches and providing effective communication should a nuclear accident occur. (authors)

  10. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in the field. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in intelligent information processing, while also serving information technology professionals faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  11. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    Full Text Available A new type of fuzzy inference system is proposed: a Probabilistic Fuzzy Inference System, which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out in both fuzzy sets and fuzzy rules, giving rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. By introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for the uncertainties present in the system, such as rule uncertainties and measurement uncertainties, and leads to a design that performs optimally after training. In this paper a Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.
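
A minimal sketch of the combination (toy rule base with invented memberships and probabilities, not the paper's controller): each rule carries both a degree of truth, via a fuzzy membership function, and a probability of truth, via a distribution over crisp consequents.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Each rule: (antecedent membership fn, {consequent: probability}).
# Values are illustrative only.
RULES = [
    (lambda e: tri(e, -2.0, -1.0, 0.0), {-1.0: 0.8, 0.0: 0.2}),
    (lambda e: tri(e, -1.0,  0.0, 1.0), { 0.0: 0.9, 0.5: 0.1}),
    (lambda e: tri(e,  0.0,  1.0, 2.0), { 1.0: 0.8, 0.0: 0.2}),
]

def infer(e):
    """Weighted average of each rule's probability-expected consequent."""
    num = den = 0.0
    for mf, dist in RULES:
        w = mf(e)                                      # degree of truth
        expected_u = sum(u * p for u, p in dist.items())  # probability of truth
        num += w * expected_u
        den += w
    return num / den if den else 0.0
```

The defuzzified output blends fuzzy firing strengths with probabilistic consequents, which is the essence of the probabilistic fuzzy rule.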

  12. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  13. Probabilistic simulation applications to reliability assessments

    International Nuclear Information System (INIS)

    Miller, Ian; Nutt, Mark W.; Hill, Ralph S. III

    2003-01-01

    Probabilistic risk/reliability (PRA) analyses for engineered systems are conventionally based on fault-tree methods. These methods are mature and efficient, and are well suited to systems consisting of interacting components with known, low probabilities of failure. Even complex systems, such as nuclear power plants or aircraft, are modeled by the careful application of these approaches. However, for systems that may evolve in complex and nonlinear ways, and where the performance of components may be a sensitive function of the history of their working environments, fault-tree methods can be very demanding. This paper proposes an alternative method of evaluating such systems, based on probabilistic simulation using intelligent software objects to represent the components of such systems. Using a Monte Carlo approach, simulation models can be constructed from relatively simple interacting objects that capture the essential behavior of the components that they represent. Such models are capable of reflecting the complex behaviors of the systems that they represent in a natural and realistic way. (author)
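
The proposal can be sketched with a toy pair of redundant components whose failure hazard depends on their accumulated load history, something a static fault tree cannot easily express (all numbers invented for illustration, not the NESSUS models):

```python
import random

random.seed(4)

class Component:
    """A software object whose failure hazard grows with load history."""

    def __init__(self, base_hazard, wear_rate):
        self.base = base_hazard
        self.wear = wear_rate
        self.history = 0.0      # accumulated exposure to load
        self.failed = False

    def step(self, load):
        self.history += load
        hazard = self.base + self.wear * self.history  # history-dependent
        if random.random() < hazard:
            self.failed = True

def simulate(n_trials=20000, n_steps=50):
    """Monte Carlo estimate of failure of a redundant pump/valve pair."""
    failures = 0
    for _ in range(n_trials):
        pump = Component(1e-4, 1e-5)
        valve = Component(2e-4, 5e-6)
        for t in range(n_steps):
            load = 1.0 + 0.1 * t        # duty ramps up over the mission
            pump.step(load)
            valve.step(load)
            if pump.failed and valve.failed:   # system needs one of two
                failures += 1
                break
    return failures / n_trials

p_fail = simulate()
```

Because each object carries its own state, nonlinear and history-dependent behaviors emerge from the interaction of simple components rather than from a hand-built event logic.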

  14. Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays.

    Science.gov (United States)

    Liu, Hongjian; Wang, Zidong; Shen, Bo; Huang, Tingwen; Alsaadi, Fuad E

    2018-06-01

    This paper is concerned with the globally exponential stability problem for a class of discrete-time stochastic memristive neural networks (DSMNNs) with both leakage delays as well as probabilistic time-varying delays. For the probabilistic delays, a sequence of Bernoulli distributed random variables is utilized to determine within which intervals the time-varying delays fall at certain time instant. The sector-bounded activation function is considered in the addressed DSMNN. By taking into account the state-dependent characteristics of the network parameters and choosing an appropriate Lyapunov-Krasovskii functional, some sufficient conditions are established under which the underlying DSMNN is globally exponentially stable in the mean square. The derived conditions are made dependent on both the leakage and the probabilistic delays, and are therefore less conservative than the traditional delay-independent criteria. A simulation example is given to show the effectiveness of the proposed stability criterion. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  16. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  17. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.

  18. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our method of proof is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
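
The linear algebra view can be made concrete on a toy weighted automaton (not the TKF91 model): the normalizer over all strings is Z = start' (I - M)^(-1) stop, where M sums the per-symbol transition matrices, and a brute-force sum over short strings approaches Z from below.

```python
from itertools import product

# Toy 2-state automaton over the alphabet {a, b}; weights are invented
# and chosen small enough that the geometric series converges.
Ta = [[0.2, 0.1], [0.0, 0.3]]
Tb = [[0.1, 0.0], [0.2, 0.1]]
start = [1.0, 0.0]
stop = [0.1, 0.4]

M = [[Ta[i][j] + Tb[i][j] for j in range(2)] for i in range(2)]

def inv2(A):
    """Inverse of a 2x2 matrix."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det, A[0][0] / det]]

# Closed form: Z = start' (I - M)^(-1) stop.
IM = [[(1.0 if i == j else 0.0) - M[i][j] for j in range(2)] for i in range(2)]
inv = inv2(IM)
Z = sum(start[i] * inv[i][j] * stop[j] for i in range(2) for j in range(2))

def weight(s):
    """Weight of one string: start' T_{s1} ... T_{sn} stop."""
    vec = start[:]
    for ch in s:
        T = Ta if ch == "a" else Tb
        vec = [sum(vec[i] * T[i][j] for i in range(2)) for j in range(2)]
    return sum(vec[j] * stop[j] for j in range(2))

# Brute force: sum weights of all strings of length < 12.
Z_brute = sum(weight(s) for L in range(12) for s in product("ab", repeat=L))
```

The matrix inverse replaces an infinite sum over strings with a single linear solve, which is the computational payoff of the linear algebra formulation.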

  19. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
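
The "simple estimation" baseline mentioned above is easy to state: after coregistration, the probability at each voxel is just the fraction of training masks that cover it (toy one-dimensional masks for illustration):

```python
# Four coregistered binary segmentation masks (toy 1-D "volumes" of 5 voxels).
masks = [
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
]

# Voxel-wise relative frequency: the simplest probabilistic atlas estimate.
n = len(masks)
atlas = [sum(m[v] for m in masks) / n for v in range(len(masks[0]))]
```

The generalized linear model proposed in the record replaces this purely local relative frequency with a model that can borrow strength across locations.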

  20. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a
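
The kernel-smoothing step described above can be sketched in one spatial dimension (hypothetical catalog and arbitrary bandwidth, not the Californian or European data): each past epicenter contributes a Gaussian kernel, so clustered events produce a high continuous rate.

```python
import math

# Hypothetical epicenter coordinates along a 1-D transect (km).
epicenters = [1.0, 1.2, 1.1, 4.0]
BANDWIDTH = 0.5   # kernel width, chosen arbitrarily for illustration

def rate(x):
    """Kernel-smoothed relative earthquake rate at location x."""
    def k(d):
        return math.exp(-0.5 * (d / BANDWIDTH) ** 2) / (
            BANDWIDTH * math.sqrt(2.0 * math.pi))
    return sum(k(x - e) for e in epicenters)

r_cluster = rate(1.1)   # inside the cluster of three events
r_gap = rate(2.5)       # between the cluster and the lone event
r_lone = rate(4.0)      # at the isolated event
```

The record's forecast extends this idea by also smoothing slip rates on mapped faults, so the continuous rate field reflects fault geometry as well as past epicenters.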

  1. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a

  2. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    Directory of Open Access Journals (Sweden)

    Leyla V. Kaufman

    2017-07-01

    Full Text Available The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative and based on physiological host ranges determined under captive rearing conditions, without consideration of ecological factors that may influence the realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from the places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.
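
The distinction between apparent mortality and marginal attack rate can be sketched with a standard competing-risks conversion for contemporaneous mortality factors (hypothetical numbers; the formula is a common assumption in insect population ecology, not taken from this record's data):

```python
def marginal_attack_rates(apparent):
    """Convert apparent mortalities of contemporaneous factors to
    marginal attack rates, assuming the factors compete for hosts:
    m_i = 1 - (1 - d)^(d_i / d), with d the total apparent mortality."""
    d = sum(apparent.values())
    return {k: 1.0 - (1.0 - d) ** (v / d) for k, v in apparent.items()}

# Hypothetical apparent mortalities from two parasitoids.
apparent = {"parasitoid_A": 0.30, "parasitoid_B": 0.20}
marginal = marginal_attack_rates(apparent)
```

Because competing factors remove hosts from each other, each factor's marginal rate exceeds its apparent mortality; using one quantity in place of the other therefore biases the predicted impact.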

  3. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad range of reasoning types is considered.

  4. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  5. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  6. Probabilistic hypergraph based hash codes for social image search

    Institute of Scientific and Technical Information of China (English)

    Yi XIE; Hui-min YU; Roland HU

    2014-01-01

    With the rapid development of the Internet, recent years have seen the explosive growth of social media. This brings great challenges in performing efficient and accurate image retrieval on a large scale. Recent work shows that using hashing methods to embed high-dimensional image features and tag information into Hamming space provides a powerful way to index large collections of social images. By learning hash codes through a spectral graph partitioning algorithm, spectral hashing (SH) has shown promising performance among various hashing approaches. However, it is incomplete to model the relations among images only by pairwise simple graphs, which ignore higher-order relationships. In this paper, we utilize a probabilistic hypergraph model to learn hash codes for social image retrieval. A probabilistic hypergraph model offers a higher-order representation among social images by connecting more than two images in one hyperedge. Unlike a normal hypergraph model, a probabilistic hypergraph model considers not only the grouping information, but also the similarities between vertices in hyperedges. Experiments on Flickr image datasets verify the performance of our proposed approach.
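
The probabilistic incidence construction can be sketched as follows (toy 2-D features in place of image features; the neighbourhood size K and similarity scale are invented): each image's nearest neighbours form a hyperedge, and vertex membership is a soft similarity weight rather than 0/1.

```python
import math

# Toy feature vectors for five "images": two clusters in 2-D.
feats = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.2), (2.0, 2.0), (2.1, 1.9)]
K = 2          # neighbours per hyperedge (plus the centroid itself)
SIGMA = 0.5    # similarity scale

def incidence(feats):
    """Probabilistic incidence matrix H: H[v][e] is the soft membership
    of vertex v in the hyperedge centred on item e."""
    n = len(feats)
    H = [[0.0] * n for _ in range(n)]
    for e in range(n):
        # the K nearest neighbours of e (the sort puts e itself first)
        nn = sorted(range(n), key=lambda v: math.dist(feats[v], feats[e]))[:K + 1]
        for v in nn:
            d = math.dist(feats[v], feats[e])
            H[v][e] = math.exp(-d ** 2 / (2.0 * SIGMA ** 2))  # soft, not 0/1
    return H

H = incidence(feats)
```

In a full pipeline this soft incidence matrix feeds the hypergraph Laplacian used for spectral hash-code learning; here it only illustrates the probabilistic (soft-membership) aspect.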

  7. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  8. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  9. Seismic and wind vulnerability assessment for the GAR-13 global risk assessment

    OpenAIRE

    Yamín Lacouture, Luis Eduardo; Hurtado Chaparro, Alvaro Ivan; Barbat Barbat, Horia Alejandro; Cardona Arboleda, Omar Dario

    2014-01-01

    A general methodology to evaluate vulnerability functions suitable for a probabilistic global risk assessment is proposed. The methodology is partially based on the methodological approach of the Multi-hazard Loss Estimation Methodology (Hazus) developed by the Federal Emergency Management Agency (FEMA). The vulnerability assessment process considers the resolution, information and limitations established for both the hazard and exposure models adopted. Seismic and wind vulnerability function...

  10. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  11. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  12. Probabilistic theory of mean field games with applications

    CERN Document Server

    Carmona, René

    2018-01-01

    This two-volume book offers a comprehensive treatment of the probabilistic approach to mean field game models and their applications. The book is self-contained in nature and includes original material and applications with explicit examples throughout, including numerical solutions. Volume I of the book is entirely devoted to the theory of mean field games without a common noise. The first half of the volume provides a self-contained introduction to mean field games, starting from concrete illustrations of games with a finite number of players, and ending with ready-for-use solvability results. Readers are provided with the tools necessary for the solution of forward-backward stochastic differential equations of the McKean-Vlasov type at the core of the probabilistic approach. The second half of this volume focuses on the main principles of analysis on the Wasserstein space. It includes Lions' approach to the Wasserstein differential calculus, and the applications of its results to the analysis of stochastic...

  13. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  14. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  15. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Weilin Zang (Inspecta Technology AB, Stockholm (SE))

    2007-11-15

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiling Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional

  16. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    International Nuclear Information System (INIS)

    Dillstroem, Peter; Weilin Zang

    2007-11-01

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiling Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional

  17. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
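    The core calculation the article describes, converting per-field agreement into match weights and then into a posterior match probability via Bayes' theorem in odds form, can be sketched as follows. All m-probabilities, u-probabilities and the prior below are invented for illustration:

```python
import math

def match_weight(agree, m, u):
    """log2 likelihood ratio from one field comparison:
    m = P(field agrees | records truly match),
    u = P(field agrees | records do not match)."""
    return math.log2(m / u) if agree else math.log2((1.0 - m) / (1.0 - u))

def posterior_match_prob(weights, prior):
    """Turn a total match weight into a posterior probability of a
    match via Bayes' theorem in odds form."""
    likelihood_ratio = 2.0 ** sum(weights)
    prior_odds = prior / (1.0 - prior)
    post_odds = likelihood_ratio * prior_odds
    return post_odds / (1.0 + post_odds)

# surname and birth year agree, postcode disagrees; illustrative values only
w = [match_weight(True, m=0.95, u=0.01),
     match_weight(True, m=0.99, u=0.05),
     match_weight(False, m=0.90, u=0.20)]
p = posterior_match_prob(w, prior=0.001)   # 1 in 1000 candidate pairs truly match
```

    Agreement on a discriminating field contributes a positive weight, disagreement a negative one; the prior odds encode how rare true matches are among candidate pairs.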

  18. Probabilistic and Fuzzy Arithmetic Approaches for the Treatment of Uncertainties in the Installation of Torpedo Piles

    Directory of Open Access Journals (Sweden)

    Denise Margareth Kazue Nishimura Kunitaki

    2008-01-01

    The “torpedo” pile is a foundation system that has been recently considered to anchor mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of analysis methods to evaluate its final penetration depth. Therefore, this paper deals with methodologies for the assessment of the sensitivity of the response to the variation of the uncertain parameters and mainly to incorporate into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and “possibilistic” approaches are considered, involving, respectively, the Monte Carlo method (MC) and concepts of fuzzy arithmetic (FA). The results and performance of both approaches are compared, stressing the ability of the latter approach to efficiently deal with the uncertainties of the model, with outstanding computational efficiency, and therefore, to comprise an effective design tool.
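    A Monte Carlo treatment of an uncertain penetration model can be sketched as below. The closed-form "model" and every number in it are placeholder assumptions, not the torpedo-pile penetration model of the paper; the point is only the MC propagation of a soil-parameter uncertainty into a distribution of final depth:

```python
import math
import random
import statistics

def penetration_depth(energy, soil_strength, diameter):
    """Toy model (an illustrative assumption): the pile stops once the
    impact energy is absorbed by a soil resistance growing linearly
    with depth, so energy = 0.5 * (strength * diameter) * depth**2."""
    return math.sqrt(2.0 * energy / (soil_strength * diameter))

random.seed(1)
samples = []
for _ in range(10_000):
    strength = max(random.gauss(40e3, 8e3), 1e3)  # uncertain soil resistance (Pa)
    samples.append(penetration_depth(energy=5.0e6, soil_strength=strength, diameter=1.0))
mean_depth = statistics.mean(samples)
spread = statistics.stdev(samples)
```

    The spread of the sampled depths is the quantity a designer would compare against the fuzzy-arithmetic interval produced by the possibilistic alternative.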

  19. Combination of the deterministic and probabilistic approaches for risk-informed decision-making in US NRC regulatory guides

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2001-06-01

    The report responds to the trend where probabilistic safety analyses are attached, on a voluntary basis (as yet), to the mandatory deterministic assessment of modifications of NPP systems or operating procedures, resulting in risk-informed type documents. It contains a nearly complete Czech translation of US NRC Regulatory Guide 1.177 and presents some suggestions for improving a) PSA study applications; b) the development of NPP documents for the regulatory body; and c) the interconnection between PSA and traditional deterministic analyses as contained in the risk-informed approach. (P.A.)

  20. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of the probabilistic methodology, in order to see how this approach may affect the risk management of industrial hazardous facilities. To this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though the public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide a sound support to the decision making and authorization process for any industrial activity implying risk for the environment and the public health. It is the opinion of the authors that the probabilistic techniques have been developed to a great level of sophistication in the nuclear industry, which has provided much more experience in this field than others. For some particular areas of nuclear applications, such as plant reliability and plant response to accidents, these techniques have reached a sufficient level of maturity, and some results have been usefully taken as a measure of the safety level of the plant itself. The use of some limited safety goals is regarded as a relevant item of the nuclear licensing process. The paper claims that it is now time for these methods to be applied with equal success to other hazardous facilities, and makes some comparative considerations on the differences between these plants and nuclear power plants, in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  1. A robust probabilistic approach for variational inversion in shallow water acoustic tomography

    International Nuclear Information System (INIS)

    Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P

    2009-01-01

    This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance–covariance matrix associated with the above-weighted quadratic background, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimum reduced number of non-correlated latent variables η, which determine a new control vector and a new regularization term, expressed as η^T η. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method.

  2. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

    We present an asymptotic method giving the probability of presence of the iterated spots of R^d under a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems interesting for computing situations where the deterministic methods do not work. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.
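    The notion of a "probability of presence" of iterates can be illustrated numerically by histogramming a long orbit, i.e. approximating the Perron-Frobenius invariant measure empirically. A minimal Python sketch for the logistic map on [0, 1], an assumed example rather than one of the systems treated in the paper:

```python
def invariant_histogram(f, x0, n_iter, n_bins, burn_in=1000):
    """Estimate the probability of presence of the iterates of f on
    [0, 1] by histogramming a long orbit (an empirical stand-in for
    the Perron-Frobenius invariant measure)."""
    x = x0
    counts = [0] * n_bins
    for i in range(n_iter + burn_in):
        x = f(x)
        if i >= burn_in:
            counts[min(int(x * n_bins), n_bins - 1)] += 1
    return [c / n_iter for c in counts]

logistic = lambda x: 4.0 * x * (1.0 - x)   # a classic chaotic polynomial map
hist = invariant_histogram(logistic, x0=0.2, n_iter=100_000, n_bins=10)
```

    For the logistic map at parameter 4 the invariant density concentrates near the endpoints of [0, 1], which the edge bins of the histogram reflect.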

  3. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    Science.gov (United States)

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    The definition of plausible worst-case tsunamigenic scenarios plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17ºS to 24ºS. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model and published interseismic coupling (ISC) distributions. As a result, we find four zones of higher slip deficit, interpreted as the major seismic asperities of the gap, and use them in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes ranging from Mw 8.4 to Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All the scenarios are simulated through the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five coastal major cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high resolution data of inundation depth, runup, coastal currents and sea level elevation. The

  4. Probabilistic vs linear blending approaches to shared control for wheelchair driving.

    Science.gov (United States)

    Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom

    2017-07-01

    Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
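    The contrast between the two strategies can be sketched in a few lines. The Gaussian-product fusion below is one simple instance of probabilistic fusion and stands in for the paper's PSC formulation; all commands and variances are made up:

```python
def linear_blend(u_user, u_auto, alpha):
    """Classical linear blending: a fixed arbitration weight alpha."""
    return (1.0 - alpha) * u_user + alpha * u_auto

def probabilistic_blend(u_user, var_user, u_auto, var_auto):
    """Gaussian-product fusion, one simple instance of probabilistic
    shared control: each command is treated as a Gaussian belief and
    the fused command is the precision-weighted mean, so the more
    certain source automatically dominates."""
    w_user = 1.0 / var_user
    w_auto = 1.0 / var_auto
    return (w_user * u_user + w_auto * u_auto) / (w_user + w_auto)

# user input is noisy near an obstacle (large variance); autonomy is confident
v_lin = linear_blend(0.8, 0.2, alpha=0.5)
v_psc = probabilistic_blend(u_user=0.8, var_user=0.5, u_auto=0.2, var_auto=0.05)
```

    Linear blending applies the same proportion regardless of how reliable each source currently is; the probabilistic blend shifts toward the autonomous command exactly when the user's input is uncertain, which is the behaviour the paper exploits for safety.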

  5. Decision making by hybrid probabilistic: Possibilistic utility theory

    Directory of Open Access Journals (Sweden)

    Pap Endre

    2009-01-01

    An approach to decision theory based upon non-probabilistic uncertainty is presented. There is an axiomatization of the hybrid probabilistic-possibilistic mixtures based on a pair of triangular conorm and triangular norm satisfying a restricted distributivity law, and the corresponding non-additive S-measure. This is characterized by the families of operations involved in generalized mixtures, based upon a previous result on the characterization of the pair of continuous t-norm and t-conorm such that the former is restrictedly distributive over the latter. The obtained family of mixtures combines probabilistic and idempotent (possibilistic) mixtures via a threshold.

  6. A probabilistic approach to emission-line galaxy classification

    Science.gov (United States)

    de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.

    2017-12-01

    We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and W_Hα versus [N II]/Hα (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/Hβ, log [N II]/Hα and log EW(Hα) optical parameters. The best-fitting GMM, based on several statistical criteria, suggests a solution around four Gaussian components (GCs), which are capable of explaining up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to their respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence - based on four GCs - for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. The GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not contribute significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, without lacking the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
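    The soft, probabilistic assignment at the heart of a GMM can be illustrated with a minimal one-dimensional, two-component EM fit on synthetic data (the actual analysis above is three-dimensional with four or five components):

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def em_gmm_1d(data, n_iter=50):
    """Minimal EM for a two-component 1-D Gaussian mixture."""
    mu = [min(data), max(data)]       # crude initialization
    sigma = [1.0, 1.0]
    weight = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibilities, i.e. soft probabilistic class assignments
        resp = []
        for x in data:
            p = [weight[k] * gauss_pdf(x, mu[k], sigma[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate mixture weights, means and spreads
        for k in range(2):
            nk = sum(r[k] for r in resp)
            weight[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = max(math.sqrt(var), 1e-3)
    return mu, sigma, weight

random.seed(0)
data = ([random.gauss(-2.0, 0.5) for _ in range(200)] +
        [random.gauss(3.0, 0.8) for _ in range(200)])
mu, sigma, weight = em_gmm_1d(data)
```

    The responsibilities computed in the E-step are exactly the kind of probabilistic classification the catalogue above publishes, rather than a hard class label per galaxy.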

  7. Probabilistic risk assessment for six vapour intrusion algorithms

    NARCIS (Netherlands)

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition a deterministic approach with default parameter sets is evaluated against

  8. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  9. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
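    The flavour of a mean-value probabilistic algorithm can be sketched as below. This is a bare first-order mean value approximation for independent normal inputs, not the full advanced mean value method with its most-probable-point correction:

```python
import math

def mean_value_pof(g, means, sigmas, h=1e-6):
    """First-order mean value method: linearize the limit state g at
    the mean point, then Pf ≈ Φ(-β) with β = g(μ)/σ_g, assuming
    independent normal inputs (failure when g < 0)."""
    g0 = g(means)
    grad = []
    for i in range(len(means)):
        shifted = list(means)
        shifted[i] += h
        grad.append((g(shifted) - g0) / h)   # numerical partial derivatives
    sigma_g = math.sqrt(sum((gi * si) ** 2 for gi, si in zip(grad, sigmas)))
    beta = g0 / sigma_g
    std_normal_cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return std_normal_cdf(-beta)

# hypothetical limit state g = R - S (strength minus load)
pof = mean_value_pof(lambda x: x[0] - x[1], means=[500.0, 350.0], sigmas=[30.0, 40.0])
```

    For the linear strength-minus-load limit state above, this reduces to the textbook result beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2) = 3, so Pf is about 1.35e-3.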

  10. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    Science.gov (United States)

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.

  11. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...

  12. A probabilistic approach for the interpretation of RNA profiles as cell type evidence.

    Science.gov (United States)

    de Zoete, Jacob; Curran, James; Sjerps, Marjan

    2016-01-01

    DNA profiles can be used as evidence to distinguish between possible donors of a crime stain. In some cases, both the prosecution and the defence claim that the cell material was left by the suspect but they dispute which cell type was left behind. For example, in sexual offense cases the prosecution could claim that the sample contains semen cells where the defence argues that the sample contains skin cells. In these cases, traditional methods (e.g. a phosphatase test) can be used to examine the cell type contained in the sample. However, there are some drawbacks when using these methods. For instance, many of these techniques need to be carried out separately for each cell type and each of them requires part of the available sample, which reduces the amount that can be used for DNA analysis. Another option is messenger RNA (mRNA) evidence. mRNA expression levels vary among cell types and can be used to make (probability) statements about the cell type(s) present in a sample. Existing methods for the interpretation of RNA profiles as evidence for the presence of certain cell types aim at making categorical statements. Such statements limit the possibility to report the associated uncertainty. Some of these existing methods will be discussed. Most notably, a method based on a 'n/2' scoring rule (Lindenbergh et al.) and a method using marker values and cell type scoring thresholds (Roeder et al.). From a statistical point of view, a probabilistic approach is the most obvious choice. Two approaches (multinomial logistic regression and naïve Bayes') are suggested. All methods are compared, using two different datasets and several criteria regarding their ability to assess the evidential value of RNA profiles. We conclude that both the naïve Bayes' method and a method based on multinomial logistic regression, that produces a probabilistic statement as measure of the evidential value, are an important improvement of the existing methods. Besides a better performance
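    The naive Bayes' calculation of the kind discussed can be sketched as follows. Every marker name and probability below is an invented placeholder, not a calibrated forensic value; the point is the posterior computation under the conditional-independence assumption:

```python
import math

# hypothetical marker panel: P(marker detected | cell type)
LIKELIHOOD = {
    "cell_type_A": {"MARKER_1": 0.95, "MARKER_2": 0.10, "MARKER_3": 0.05},
    "cell_type_B": {"MARKER_1": 0.02, "MARKER_2": 0.05, "MARKER_3": 0.80},
}

def naive_bayes_posterior(observed, prior):
    """Posterior over cell types given the set of detected markers,
    assuming markers are conditionally independent (naive Bayes')."""
    log_post = {}
    for cell, liks in LIKELIHOOD.items():
        lp = math.log(prior[cell])
        for marker, p in liks.items():
            lp += math.log(p if marker in observed else 1.0 - p)
        log_post[cell] = lp
    shift = max(log_post.values())           # for numerical stability
    norm = sum(math.exp(v - shift) for v in log_post.values())
    return {c: math.exp(v - shift) / norm for c, v in log_post.items()}

prior = {"cell_type_A": 0.5, "cell_type_B": 0.5}
post_a = naive_bayes_posterior({"MARKER_1"}, prior)   # favours cell type A
post_b = naive_bayes_posterior({"MARKER_3"}, prior)   # favours cell type B
```

    Unlike a threshold-based categorical call, the output is a probability over cell types, which is exactly the kind of evidential-value statement the article argues for.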

  13. A probabilistic bridge safety evaluation against floods.

    Science.gov (United States)

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
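    The response-surface step aside, the system-reliability idea, a series system over several limit states evaluated by Monte Carlo simulation, can be sketched directly. All distributions and thresholds below are invented placeholders, not the HEC-RAS-derived quantities of the study:

```python
import random

def bridge_failure_probability(n_samples=100_000, seed=7):
    """Series-system MCS: the bridge 'fails' if ANY limit state is
    violated. Distributions and thresholds are simplified placeholders."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        scour = rng.gauss(3.0, 0.8)       # local scour depth (m)
        velocity = rng.gauss(2.5, 0.5)    # flow velocity (m/s)
        level = rng.gauss(1.5, 0.6)       # water surface above datum (m)
        if scour > 5.0 or velocity > 4.0 or level > 3.2:
            failures += 1
    return failures / n_samples

pf = bridge_failure_probability()
```

    In the study itself, direct sampling like this is too costly because each sample needs a hydraulic simulation, which is why a surrogate (Bayesian least squares support vector machine) is trained first and the MCS is run on the surrogate.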

  14. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
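    The expected-running-time flavour of the method can be reproduced for the simplest possible program, "flip a coin until heads", whose pre-expectation satisfies T = 1 + (1 - p) * T; iterating this equation backwards from 0 converges to the closed form 1 / p:

```python
def expected_flips(p_heads, n_iter=200):
    """Backwards fixed-point iteration for the expected number of coin
    flips until the first heads: the pre-expectation satisfies
    T = 1 + (1 - p) * T, and iterating from 0 converges to 1 / p."""
    t = 0.0
    for _ in range(n_iter):
        t = 1.0 + (1.0 - p_heads) * t
    return t

t_fair = expected_flips(0.5)     # converges to 2 flips on average
t_biased = expected_flips(0.25)  # converges to 4
```

    This is only the scalar core of the idea; the paper's contribution is abstracting the (possibly infinite) state space through random variables so that such iterations stay computable and, under the stated condition, exact.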

  15. Simple probabilistic approach to evaluate radioiodine behavior at severe accidents: application to Phebus test FPT1

    International Nuclear Information System (INIS)

    Rydl, A.

    2007-01-01

    The contribution of radioiodine to risk from a severe accident is recognized to be one of the highest among all the fission products. In the long term (e.g. several days), volatile species of iodine are the most important forms of iodine from the safety point of view. These volatile forms ('volatile iodine') are mainly molecular iodine, I2, and various types of organic iodides, RI. A certain controversy exists today among the international research community about the relative importance of the processes leading to volatile iodine formation in containment under severe accident conditions. The amount of knowledge, coming from experiments, of the phenomenology of iodine behavior is enormous and it is embedded in specialized mechanistic or empirical codes. An exhaustive description of the processes governing the iodine behavior in containment is given in reference 1. Yet, all this knowledge is still not enough to resolve some important questions. Moreover, the results of different codes - when applied to relatively simple experiments, such as RTF or CAIMAN - vary widely. Thus, as a complement (or maybe even as an alternative in some instances) to deterministic analyses of iodine behavior, a simple probabilistic approach is proposed in this work which could help to see the whole problem from a different perspective. The final goal of using this approach should be the characterization of uncertainties in the description of the various processes in question. This would allow for identification of the processes which contribute most significantly to the overall uncertainty of the predictions of iodine volatility in containment. In this work we made a dedicated, small event tree to describe iodine behavior during an accident and we used that tree for a simple sensitivity study. For the evaluation of the tree, the US NRC code EVNTRE was used. To test the proposed probabilistic approach we analyzed results of the integral PHEBUS FPT1 experiment which comprises most of the important
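    The mechanics of evaluating such a small event tree reduce to multiplying conditional branch probabilities along each path and summing the paths that end in a given outcome. A toy Python version with invented branch probabilities (not PHEBUS FPT1 data or EVNTRE values):

```python
def p_outcome(paths, outcome):
    """Sum path probabilities (products of conditional branch
    probabilities) over all event-tree paths ending in `outcome`."""
    return sum(p for p, o in paths if o == outcome)

# toy iodine event tree; every number is an invented placeholder
paths = [
    (0.9 * 0.7, "aerosol"),    # released from circuit, stays particulate
    (0.9 * 0.2, "volatile"),   # released, converted to molecular I2
    (0.9 * 0.1, "volatile"),   # released, forms organic iodides
    (0.1 * 1.0, "retained"),   # never released to the containment
]
p_volatile = p_outcome(paths, "volatile")  # 0.9*0.2 + 0.9*0.1 = 0.27
```

    A sensitivity study of the kind described then perturbs individual branch probabilities and observes how the outcome probability shifts, flagging the branches that dominate the overall uncertainty.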

  16. Solving probabilistic inverse problems rapidly with prior samples

    NARCIS (Netherlands)

    Käufl, Paul; Valentine, Andrew P.; de Wit, Ralph W.; Trampert, Jeannot

    2016-01-01

    Owing to the increasing availability of computational resources, in recent years the probabilistic solution of non-linear, geophysical inverse problems by means of sampling methods has become increasingly feasible. Nevertheless, we still face situations in which a Monte Carlo approach is not

  17. Application of probabilistic fracture mechanics to reactor pressure vessel safety assessment

    International Nuclear Information System (INIS)

    Venturini, V.; Pitner, P.

    1995-06-01

    Among all the components of a PWR (Pressurized Water Reactor) nuclear power plant, the reactor vessel is of major importance for safety. The integrity of this structure must be guaranteed in all circumstances, even in the case of the most severe accidents, and its mechanical state can be decisive for the lifetime of the plant. Brittle rupture would be the most important of all potential hazards if the irradiation effects were not consistent with predictions. The interest in having a reliable and precise method of evaluating the available safety margins and the integrity of this component led Electricite de France (EDF) to carry out a probabilistic fracture mechanics analysis. The probabilistic model, developed by integrating the uncertainties into the usual fracture mechanics equations, is presented. Special focus is placed on the problem of coupling thermo-mechanical finite element calculations and reliability analysis. The use of a finite element code can entail prohibitive computation times when it is invoked numerous times during simulation sequences or complex iterative procedures. The response surface method is therefore used: it provides an approximation of the response from a reduced number of original data. The global approach is illustrated with an example corresponding to a specific accidental transient. A validation of the obtained results is also carried out through comparison with an equivalent model without coupling. (author)
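    The response-surface shortcut described in the record above can be sketched in a few lines: a small number of runs of an "expensive" model (standing in for the finite element code) are fitted with a quadratic surrogate, which is then cheap enough to call inside a Monte Carlo loop. The toy model, the flaw-depth distribution and all numbers below are illustrative assumptions, not values from the EDF study.

```python
import random

# Hypothetical "expensive" thermo-mechanical model: stress intensity factor
# K (MPa*sqrt(m)) as a function of flaw depth a (mm). Stands in for an FE call.
def expensive_model(a):
    return 25.0 + 8.0 * a + 0.5 * a * a

# Build a quadratic response surface from a small design of experiments
# (three support points determine a quadratic exactly).
xs = [1.0, 3.0, 5.0]
ys = [expensive_model(a) for a in xs]

def surrogate(a):
    # Lagrange interpolation through the three (xs, ys) support points.
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (a - xj) / (xi - xj)
        total += term
    return total

# Monte Carlo on the cheap surrogate: P(K > K_IC) with a lognormal flaw depth.
random.seed(0)
K_IC = 80.0        # assumed toughness threshold, illustrative
n = 100_000
fails = sum(1 for _ in range(n)
            if surrogate(random.lognormvariate(0.8, 0.4)) > K_IC)
print(fails / n)   # estimated failure probability
```

    Because the surrogate costs almost nothing per call, the simulation budget can be large even when each original model run is expensive, which is exactly the motivation given in the abstract.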

  18. Impact of external events on site evaluation: a probabilistic approach

    International Nuclear Information System (INIS)

    Jaccarino, E.; Giuliani, P.; Zaffiro, C.

    1975-01-01

    A probabilistic method is proposed for definition of the reference external events of nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event it is possible to perform statistical analyses to determine the probability of occurrence on site of events of given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, the statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant which must be installed on the site. Thus the reference event could be chosen such that for the whole of the national territory the risk for safety and environmental protection is the same. (author)
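    The occurrence-probability step for site events like the tornadoes above is often modelled as a Poisson process: an annual strike rate is estimated from the historical catalogue and scaled by the fraction of the region that threatens the site. The catalogue counts and area ratio below are hypothetical placeholders, not the Italian data.

```python
import math

# Hypothetical catalogue statistics for one tornado intensity class.
events_observed = 12      # events of this class in the historical record
catalogue_years = 40      # length of the record, years
area_ratio = 1e-4         # fraction of swept area that covers the site

# Estimated annual rate of a site strike by an event of this class.
lam = events_observed / catalogue_years * area_ratio

def prob_at_least_one(rate, years):
    """P(N >= 1) over `years` for a Poisson process with annual `rate`."""
    return 1.0 - math.exp(-rate * years)

print(prob_at_least_one(lam, 40))   # chance of a strike over a 40-year life
```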

  19. Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges to ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and the scale-up necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  20. A tiered approach for probabilistic ecological risk assessment of contaminated sites; Un approccio multilivello per l'analisi probabilistica di rischio ecologico di siti contaminati

    Energy Technology Data Exchange (ETDEWEB)

    Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia

    2005-09-15

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through the comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations capable of acutely affecting more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic. [Italian abstract, translated] The aim of this study is to provide a procedure for the ecological risk analysis of contaminated sites based on successive levels of refinement. The proposed approach, starting from the simple deterministic ratio between an exposure level and an effect value that safeguards the greatest number of species in the ecosystem considered, proceeds through the comparison of the statistical distributions of exposure values and species sensitivities, to finally determine the probabilistic distribution of the risk quotient. To illustrate the proposed methodology, the concentrations of 1,2,4-trichlorobenzene measured in the soil of a contaminated industrial site were used to conduct the risk analysis for terrestrial species. The use of probabilistic distributions made it possible to associate the risk, initially

  1. Probabilistic seismic hazards: Guidelines and constraints in evaluating results

    International Nuclear Information System (INIS)

    Sadigh, R.K.; Power, M.S.

    1989-01-01

    In conducting probabilistic seismic hazard analyses, consideration of the dispersion as well as the upper bounds on ground motion is of great significance. In particular, the truncation of ground motion levels at some upper limit has a major influence on the computed hazard at low-to-very-low probability levels. Additionally, other deterministic guidelines and constraints should be considered in evaluating probabilistic seismic hazard results. In contrast to probabilistic seismic hazard evaluations, mean-plus-one-standard-deviation ground motions are typically used for deterministic estimates of ground motions from maximum events that may affect a structure. To be consistent with standard deterministic practice, such maximum estimates of ground motion should be the highest level considered for the site. These maximum values should be associated with the largest possible event occurring at the site. Furthermore, the relationship between ground motion level and probability of exceedance should reflect a transition from purely probabilistic assessments of ground motion at high probability levels, where there are multiple chances for events, to a deterministic upper-bound ground motion at very low probability levels, where there is very limited opportunity for maximum events to occur. In interplate regions, where the seismic sources may be characterized by a high-to-very-high rate of activity, the deterministic bounds will be approached or exceeded by the computed probabilistic hazard values at annual probability of exceedance levels typically as high as 10^-2 to 10^-3. Thus, at these or lower probability levels, probabilistically computed hazard values could be readily interpreted in the light of the deterministic constraints
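    The truncation effect discussed above can be illustrated with a one-scenario hazard calculation: a lognormal ground-motion model is evaluated with and without a cap at a chosen number of standard deviations, and the capped curve drops to zero beyond the cap. The median, sigma and event rate below are invented for illustration, not taken from any real ground-motion prediction equation.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical lognormal ground-motion model for one scenario earthquake:
# ln(PGA) has median ln(0.2 g) and sigma 0.6 (illustrative numbers only).
mu, sigma = math.log(0.2), 0.6
rate = 0.01          # annual rate of the scenario event

def annual_exceedance(x, n_trunc=None):
    """Annual P(PGA > x); optionally truncate the model at mu + n_trunc*sigma."""
    z = (math.log(x) - mu) / sigma
    p = 1.0 - phi(z)
    if n_trunc is not None:
        p_cap = 1.0 - phi(n_trunc)                # probability mass beyond cap
        p = max(p - p_cap, 0.0) / (1.0 - p_cap)   # renormalized, zero past cap
    return rate * p

for x in (0.5, 1.0):
    print(x, annual_exceedance(x), annual_exceedance(x, n_trunc=3.0))
```

    At high ground-motion levels the untruncated tail dominates the computed hazard, which is why the choice of upper bound matters most at very low annual probabilities.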

  2. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.

  3. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  4. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  5. Probabilistic and possibilistic approach for assessment of radiological risk due to organically bound and tissue free water tritium

    International Nuclear Information System (INIS)

    Dahiya, Sudhir; Hegde, A.G.; Joshi, M.L.; Verma, P.C.; Kushwaha, H.S.

    2006-01-01

    This study illustrates the use of two approaches, namely a probabilistic one using Monte Carlo simulation (MCS) and a possibilistic one using the fuzzy α-cut (FAC) technique, to estimate the radiological cancer risk to the population from ingestion of organically bound tritium (OBT) and tissue free water tritium (TFWT) through consumption of fish from the Rana Pratap Sagar Lake (RPSL), Kota. Using the FAC technique, the radiological cancer risk rates (year⁻¹) at the α = 1.0 level were 1.15E-08 and 1.50E-09 for OBT and TFWT, respectively, from the fish ingestion pathway. The radiological cancer risk rates (year⁻¹) using the MCS approach at the 50th percentile (median) level were 1.14E-08 and 1.49E-09 for OBT and HTO, respectively, from ingestion of fresh water fish. (author)
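    The two uncertainty treatments named in the record above can be contrasted on a toy risk chain (concentration x intake rate x risk factor). The Monte Carlo route samples the chain and reports a percentile; the fuzzy α-cut route propagates interval bounds, collapsing to a point at α = 1. All distributions and numbers are invented for illustration and have nothing to do with the tritium values in the paper.

```python
import random

random.seed(1)

# Probabilistic route: Monte Carlo with triangular distributions, report
# the median of risk = C * I * R.
def mc_median(n=50_000):
    risks = sorted(
        random.triangular(0.8, 1.2, 1.0)      # concentration factor (scaled)
        * random.triangular(0.05, 0.15, 0.1)  # intake factor (scaled)
        * 1.5e-7                              # risk factor, held fixed
        for _ in range(n))
    return risks[n // 2]

# Possibilistic route: fuzzy alpha-cut on the same triangular fuzzy numbers.
# At alpha = 1 the triangles collapse to their modes (a degenerate interval).
def alpha_cut(alpha):
    lo = (0.8 + alpha * 0.2) * (0.05 + alpha * 0.05) * 1.5e-7
    hi = (1.2 - alpha * 0.2) * (0.15 - alpha * 0.05) * 1.5e-7
    return lo, hi

print(mc_median())        # a single percentile estimate
print(alpha_cut(1.0))     # point interval at the modes
print(alpha_cut(0.0))     # widest interval (the support)
```

    This mirrors the structure of the comparison in the abstract: the FAC α = 1.0 value plays the role of a most-plausible point estimate, while the MCS median is its probabilistic counterpart.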

  6. Global Grid of Probabilities of Urban Expansion to 2030

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Grid of Probabilities of Urban Expansion to 2030 presents spatially explicit probabilistic forecasts of global urban land cover change from 2000 to 2030...

  7. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Science.gov (United States)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center for the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  8. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  9. The role of probabilistic safety assessment in the design

    International Nuclear Information System (INIS)

    Green, A.; Ingham, E.L.

    1989-01-01

    The use of probabilistic safety assessment (PSA) for Heysham 2 and Torness marked a major change in the design approach to nuclear safety within the U.K. Design Safety Guidelines incorporating probabilistic safety targets required that design justification would necessitate explicit consideration of the consequence of accidents in relation to their frequency. The paper discusses these safety targets and their implications, the integration of PSA into the design process and an outline of the methodology. The influence of PSA on the design is discussed together with its role in the overall demonstration of reactor safety. (author)

  10. A Methodology for Probabilistic Accident Management

    International Nuclear Information System (INIS)

    Munteanu, Ion; Aldemir, Tunc

    2003-01-01

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management, and none can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation, which models the system evolution in terms of probabilities of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., the data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated
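    The cell-to-cell mapping idea behind the DSD can be sketched as a Markov chain over cells: a monitored variable is binned into intervals, the system model is condensed into a transition-probability matrix per sampling interval, and a cell-probability vector is propagated to rank likely states. The three-cell matrix below is invented for illustration, not derived from a pressurizer model.

```python
# P[i][j] = probability of moving from cell i to cell j in one sampling
# interval (rows sum to 1). Hypothetical numbers for a three-cell binning.
P = [
    [0.90, 0.10, 0.00],
    [0.05, 0.85, 0.10],
    [0.00, 0.10, 0.90],
]

def step(p):
    """Propagate the cell-probability vector through one sampling interval."""
    return [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

belief = [1.0, 0.0, 0.0]          # system known to start in cell 0
for _ in range(20):
    belief = step(belief)

# The resulting vector ranks the likelihood of possible system states,
# which is the probabilistic measure point (a) in the abstract refers to.
print([round(b, 3) for b in belief])
```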

  11. Probabilistic language models in cognitive neuroscience: Promises and pitfalls.

    Science.gov (United States)

    Armeni, Kristijan; Willems, Roel M; Frank, Stefan L

    2017-12-01

    Cognitive neuroscientists of language comprehension study how neural computations relate to cognitive computations during comprehension. On the cognitive part of the equation, it is important that the computations and processing complexity are explicitly defined. Probabilistic language models can be used to give a computationally explicit account of language complexity during comprehension. Whereas such models have so far predominantly been evaluated against behavioral data, only recently have the models been used to explain neurobiological signals. Measures obtained from these models emphasize the probabilistic, information-processing view of language understanding and provide a set of tools that can be used for testing neural hypotheses about language comprehension. Here, we provide a cursory review of the theoretical foundations and example neuroimaging studies employing probabilistic language models. We highlight the advantages and potential pitfalls of this approach and indicate avenues for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Potential changes in the extreme climate conditions at the regional scale: from observed data to modelling approaches and towards probabilistic climate change information

    International Nuclear Information System (INIS)

    Gachon, P.; Radojevic, M.; Harding, A.; Saad, C.; Nguyen, V.T.V.

    2008-01-01

    Changes in the characteristics of extreme climate conditions are among the most critical challenges for ecosystems, human beings and infrastructure in the context of ongoing global climate change. However, the information on extremes needed for impact studies cannot be obtained directly from coarse-scale global climate models (GCMs), mainly because of their difficulty in incorporating the regional-scale feedbacks and processes partly responsible for the occurrence, intensity and duration of extreme events. Downscaling approaches, namely statistical and dynamical downscaling techniques (i.e. SD and RCM), have emerged as useful tools to develop high-resolution climate change information, in particular for extremes, as they are theoretically more capable of taking into account regional/local forcings and their feedbacks from large-scale influences, being driven by GCM synoptic variables. Nevertheless, in spite of the potential added value of downscaling methods (statistical and dynamical), a rigorous assessment of these methods is needed, as inherent difficulties in simulating extremes are still present. In this paper, different series of RCM and SD simulations using three different GCMs are presented and evaluated against observed values over the current period for a river basin in southern Quebec, with future ensemble runs centered on the 2050s (i.e. the 2041-2070 period, using the SRES A2 emission scenario). Results suggest that downscaling performance over the baseline period varies significantly between the two downscaling techniques and across seasons, with more regularly reliable simulated temperature values from the SD technique than from the RCM runs, while both approaches produce quite similar median temperature changes in the future, with more divergence for extremes. For precipitation, less accurate information is obtained compared to observed data, with more differences among models and higher uncertainties in the

  13. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of a nuclear piping system has to be maintained during operation. In order to maintain the integrity, reliable assessment procedures, including fracture mechanics analysis, etc., are required. Up to now, this has been performed using conventional deterministic approaches, even though there are many uncertainties that hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability indices and simulation techniques was developed and applied to evaluate failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and the combined loading of the two. The sensitivity analysis results as well as the prototypal integrity assessment results showed the promising applicability of the probabilistic assessment program, and the necessity of a practical evaluation reflecting combined loading conditions and operation under limited conditions
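    A simulation-based failure-probability estimate of the kind described above can be sketched with a simple limit state: failure when the hoop stress p·r/t in the thinned section exceeds the flow stress. The distributions and numbers below are illustrative assumptions for the internal-pressure case only, not values from the paper, and the real assessment would add bending and combined loads.

```python
import random

random.seed(2)

def failure_probability(n=200_000):
    """Crude Monte Carlo estimate of P(hoop stress > flow stress)."""
    fails = 0
    r = 150.0                             # mean radius, mm (held fixed)
    for _ in range(n):
        p = random.gauss(7.0, 0.5)        # internal pressure, MPa
        t = random.gauss(4.0, 0.6)        # remaining wall thickness, mm
        s = random.gauss(300.0, 20.0)     # flow stress, MPa
        # Limit state g = s - p*r/t; non-positive thickness also counts
        # as failure (a conservative guard for the sampled tail).
        if t <= 0 or p * r / t > s:
            fails += 1
    return fails / n

print(failure_probability())
```

    In practice a reliability-index (FORM/SORM) calculation would be run alongside such sampling, as the abstract's "reliability index and simulation techniques" phrasing suggests.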

  14. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers it allows them to answer "what if"...

  15. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  16. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprised the students' probabilistic problem-solving results and recorded interviews regarding their difficulties. The data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate solution strategies; third, difficulties with the computational process. The results suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  17. Probabilistic Forecasting for On-line Operation of Urban Drainage Systems

    DEFF Research Database (Denmark)

    Löwe, Roland

    This thesis deals with the generation of probabilistic forecasts in urban hydrology. In particular, we focus on the case of runoff forecasting for real-time control (RTC) on horizons of up to two hours. For the generation of probabilistic on-line runoff forecasts, we apply the stochastic grey...... and forecasts have on on-line runoff forecast quality. Finally, we implement the stochastic grey-box model approach in a real-world real-time control (RTC) setup and study how RTC can benefit from a dynamic quantification of runoff forecast uncertainty....

  18. A Decision Support System Coupling Fuzzy Logic and Probabilistic Graphical Approaches for the Agri-Food Industry: Prediction of Grape Berry Maturity.

    Science.gov (United States)

    Perrot, Nathalie; Baudrit, Cédric; Brousset, Jean Marie; Abbal, Philippe; Guillemin, Hervé; Perret, Bruno; Goulet, Etienne; Guerin, Laurence; Barbeau, Gérard; Picque, Daniel

    2015-01-01

    Agri-food is one of the most important sectors of industry and a major contributor to the global warming potential in Europe. Sustainability issues pose a huge challenge for this sector. In this context, a big issue is being able to predict the multiscale dynamics of these systems using computing science. A robust predictive mathematical tool is implemented for this sector and applied to the wine industry, and it can easily be generalized to other applications. Grape berry maturation relies on complex, coupled physicochemical and biochemical reactions which are climate dependent. Moreover, one experiment represents one year, so climate variability cannot be covered exclusively by experiments. Consequently, harvest timing mostly relies on expert predictions. A big challenge for the wine industry is nevertheless to be able to anticipate the reactions for sustainability purposes. We propose a decision support system called FGRAPEDBN able to (1) capitalize on the heterogeneous, fragmented knowledge available, including data and expertise, and (2) predict the sugar (resp. the acidity) concentrations with a relevant RMSE of 7 g/l (resp. 0.44 g/l and 0.11 g/kg). FGRAPEDBN is based on a coupling between a probabilistic graphical approach and a fuzzy expert system.

  19. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide...... a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  20. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  1. Probabilistic Capacity of a Grid connected Wind Farm

    DEFF Research Database (Denmark)

    Zhao, Menghua; Chen, Zhe; Blaabjerg, Frede

    2005-01-01

    This paper proposes a method to find the maximum acceptable wind power injection with regard to the thermal limits, steady-state stability limits and voltage limits of the grid system. The probabilistic wind power is introduced based on the probability distribution of wind speed. Based on Power Transfer...... Distribution Factor (PTDF) and voltage sensitivities, a predictor-corrector method is suggested to calculate the acceptable active power injection. This method is then combined with the probabilistic model of wind power to compute the allowable capacity of the wind farm. Finally, an example is illustrated...... to test this method. It is concluded that the proposed method is a feasible, fast, and accurate approach to find the size of a wind farm....
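    The "probabilistic wind power based on the probability distribution of wind speed" step can be sketched by sampling a Weibull-distributed wind speed and pushing it through a simplified turbine power curve. The curve parameters and Weibull shape/scale below are illustrative assumptions, not the paper's data; the grid-limit (PTDF) side is omitted.

```python
import random

random.seed(3)

# Simplified turbine power curve parameters (illustrative).
CUT_IN, RATED_V, CUT_OUT, RATED_P = 3.0, 12.0, 25.0, 2.0  # m/s, m/s, m/s, MW

def power(v):
    """Power output (MW) for wind speed v (m/s) under the simplified curve."""
    if v < CUT_IN or v >= CUT_OUT:
        return 0.0
    if v >= RATED_V:
        return RATED_P
    # cubic ramp between cut-in and rated speed
    return RATED_P * ((v**3 - CUT_IN**3) / (RATED_V**3 - CUT_IN**3))

def capacity_distribution(n=100_000, shape=2.0, scale=8.0):
    """Sample Weibull wind speeds and summarize the power distribution."""
    outputs = sorted(power(random.weibullvariate(scale, shape))
                     for _ in range(n))
    return outputs[n // 2], sum(outputs) / n   # median and mean, MW

print(capacity_distribution())
```

    In the paper's setting, a distribution like this would then be intersected with the grid's thermal, stability and voltage constraints to size the farm.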

  2. Principles or imagination? Two approaches to global justice.

    NARCIS (Netherlands)

    Coeckelbergh, Mark

    2007-01-01

    What does it mean to introduce the notion of imagination in the discussion about global justice? What is gained by studying the role of imagination in thinking about global justice? Does a focus on imagination imply that we must replace existing influential principle-centred approaches such as that

  3. Seismic Probabilistic Risk Assessment (SPRA), approach and results

    International Nuclear Information System (INIS)

    Campbell, R.D.

    1995-01-01

    During the past 15 years there have been over 30 Seismic Probabilistic Risk Assessments (SPRAs) and Seismic Probabilistic Safety Assessments (SPSAs) conducted of Western nuclear power plants, principally of US design. In this paper PRA and PSA are used interchangeably, as the overall process is essentially the same. Some similar assessments have been done for reactors in Taiwan, Korea, Japan, Switzerland and Slovenia; these plants were also principally US supplied or built under US license. Since the restructuring of the governments in former Soviet Bloc countries, there has been grave concern regarding the safety of the reactors in these countries. To date there has been considerable activity in conducting partial seismic upgrades, but the overall quantification of risk has not been pursued to the depth that it has in Western countries. This paper summarizes the methodology for seismic PRA/PSA and compares results of two partially completed and two completed PRAs of Soviet-designed reactors to results from earlier PRAs of US reactors. A WWER 440 and a WWER 1000 located in low-seismicity regions have completed PRAs, and the results show the seismic risk to be very low for both designs. For more active regions, partially completed PRAs of a WWER 440 and a WWER 1000 located at the same site show the WWER 440 to have much greater seismic risk than the WWER 1000 plant. The seismic risk from the 1000 MW plant compares with the high end of seismic risk for earlier seismic PRAs in the US. Just as for most US plants, the seismic risk appears to be less than the risk from internal events if risk is measured in terms of mean core damage frequency. However, due to the lack of containment for the earlier WWER 440s, the risk to the public may be significantly greater owing to the more probable scenario of an early release. The studies reported have not taken the accident sequences beyond the stage of core damage, hence the public health risk ratios are speculative. (author)

  4. Development of Nuclear Safety Culture evaluation method for an operation team based on the probabilistic approach

    International Nuclear Information System (INIS)

    Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun

    2018-01-01

    Highlights: •We proposed a Probabilistic Safety Culture Healthiness Evaluation Method. •A positive relationship between the ‘success’ states of NSC and performance was shown. •The state probability profile showed a unique ratio regardless of the scenarios. •Cutset analysis provided not only the root causes but also the latent causes of failures. •Pro-SCHEMe was found to be applicable to Korean NPPs. -- Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korean NPP utility company has conducted NSC assessments according to international practice. However, most of these methods rely on interviews, observations, and self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on the evaluator’s judgement, so the assessment results can be interpreted from different perspectives. To resolve the limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide a faster evaluation process. This paper presents the Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe) to generate quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items which correspond to a basic event in PSA are derived in the first part of the paper through a literature survey, mostly from nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply evaluation items to PSA based on the relationships among such items. Modeling Guidelines are also suggested to classify and calculate NSC characteristics of
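
The ET/FT quantification that Pro-SCHEMe builds on can be sketched with a minimal fault-tree cutset calculation; the basic events, their probabilities, and the cutsets below are invented for illustration and are not taken from the paper.

```python
import math

# Hypothetical basic-event probabilities (illustrative values only).
basic_events = {"B1": 0.05, "B2": 0.10, "B3": 0.02}

# Minimal cutsets: each inner set fails the top event if ALL of its basic
# events occur (AND); any single cutset occurring suffices (OR).
cutsets = [{"B1", "B2"}, {"B3"}]

def cutset_prob(cs):
    p = 1.0
    for be in cs:
        p *= basic_events[be]
    return p

# Top-event probability assuming independent cutsets (min-cut combination).
top = 1.0 - math.prod(1.0 - cutset_prob(cs) for cs in cutsets)
print(round(top, 6))
```

Cutset analysis then reads the dominant contributors (here the single-event cutset {B3}) directly off the `cutset_prob` values.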

  5. Flood hazards and masonry constructions: a probabilistic framework for damage, risk and resilience at urban scale

    Directory of Open Access Journals (Sweden)

    A. Mebarki

    2012-05-01

    Full Text Available This paper deals with the failure risk of masonry constructions under the effect of floods. It is developed within a probabilistic framework, with loads and resistances considered as random variables. Two complementary approaches have been investigated for this purpose:

    – a global approach based on the combined effects of several governing parameters, each with an individual weighted contribution (material quality and geometry, presence of and distance between columns, beams and openings, resistance of the soil and its slope, etc.),
    – and a reliability method using the failure mechanism of masonry walls withstanding out-of-plane pressure.

    The evolution of the probability of failure of masonry constructions according to the flood water level is analysed.

    The analysis of different failure probability scenarios for masonry walls is conducted to calibrate the influence of each "vulnerability governing parameter" in the global approach that is widely used in risk assessment at the urban or regional scale.

    The global methodology is implemented in a GIS that provides the spatial distribution of damage risk for different flood scenarios. A real case is considered for the simulations, i.e. Cheffes sur Sarthe (France), for which the observed river discharge, the hydraulic load according to the Digital Terrain Model, and the structural resistance are considered as random variables. The damage probability values provided by both approaches are compared. Discussions are also developed about reduction and mitigation of the flood disaster at various scales (set of structures, city, region) as well as resilience.
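
The load-versus-resistance reliability calculation described above can be sketched with a crude Monte Carlo estimate of wall failure probability as a function of flood water level; the hydrostatic load model and the resistance distribution below are assumptions for illustration, not values from the study.

```python
import random

random.seed(0)

def failure_probability(water_level, n=100_000):
    """Monte Carlo estimate of P(load > resistance) for a masonry wall.

    Hydrostatic load grows with water level (~0.5*rho*g*h^2 per unit
    length of wall); resistance is an assumed normal variable. Both the
    distributions and the direct comparison are a sketch only.
    """
    failures = 0
    for _ in range(n):
        resistance = random.gauss(10.0, 2.0)      # assumed capacity, kN/m
        load = 4.9 * water_level ** 2             # water load, kN/m
        if load > resistance:
            failures += 1
    return failures / n

# Evolution of failure probability with the flood water level (metres).
for h in (0.5, 1.0, 1.5, 2.0):
    print(h, round(failure_probability(h), 3))
```

The same loop, run per building with GIS-supplied water depths, yields the kind of spatial damage-risk map the abstract describes.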

  6. Probabilistic atlas-guided eigen-organ method for simultaneous bounding box estimation of multiple organs in volumetric CT images

    International Nuclear Information System (INIS)

    Yao, Cong; Wada, Takashige; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2006-01-01

    We propose an approach for the simultaneous bounding box estimation of multiple organs in volumetric CT images. Local eigen-organ spaces are constructed for different types of training organs, and a global eigen-space, which describes the spatial relationships between the organs, is also constructed. Each volume of interest in the abdominal CT image is projected into the local eigen-organ spaces, and several candidate locations are determined. The final selection of the organ locations is made by projecting the set of candidate locations into the global eigen-space. A probabilistic atlas of organs is used to eliminate locations with low probability and to guide the selection of candidate locations. Evaluation by the leave-one-out method using 10 volumetric abdominal CT images showed that the proposed method provided an average accuracy of 80.38% for 11 different organ types. (author)
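
The projection of a candidate volume into a local eigen-organ space can be illustrated with a toy one-dimensional eigen-space: candidates are scored by reconstruction error, and the location best explained by the space is kept. The mean, basis vector, and feature vectors below are invented for illustration.

```python
# Toy stand-in for the local eigen-organ projection step: an eigen-space is
# a mean vector plus a unit-norm principal direction (both invented here).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

mean = [2.0, 2.0, 2.0]
basis = [1.0, 0.0, 0.0]          # assumed unit-norm principal direction

def reconstruction_error(v):
    """Distance from v to its projection onto the 1-D eigen-space."""
    centered = [x - m for x, m in zip(v, mean)]
    coeff = dot(centered, basis)                     # projection coefficient
    recon = [m + coeff * b for m, b in zip(mean, basis)]
    return sum((x - r) ** 2 for x, r in zip(v, recon)) ** 0.5

# Candidate bounding-box locations, each summarized by a feature vector.
candidates = {"loc_a": [4.0, 2.1, 1.9], "loc_b": [2.0, 4.0, 0.0]}
best = min(candidates, key=lambda k: reconstruction_error(candidates[k]))
print(best)
```

In the paper's method, the surviving candidates are then re-ranked in the global eigen-space and screened by the probabilistic atlas.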

  7. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world. This is mainly because of the effectiveness of this approach in achieving safe and reliable operation of research reactors. There is also a need to assist developing countries in applying Probabilistic Safety Assessment to existing nuclear facilities which are simpler, and therefore less complicated to analyse, than a large nuclear power plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on the so-called systems analysis aspects than on the in-plant or ex-plant consequences.

  8. Radiation protection criteria for cases of probabilistic disruptive events

    International Nuclear Information System (INIS)

    Beninson, D.J.

    1985-01-01

    The limitation of individual risk for the case of probabilistic disruptive events is studied, where the radiation effects cease to be only stochastic; the proposed criterion is applied to the case of high-level waste repositories. The optimization of protection follows from differential cost-benefit analysis. More general procedures of decision theory that use probabilistically defined utility functions are considered for its calculation. These more general procedures can also be applied in cases where radiation exposures are only potential, to optimize the required level of safety features. It is shown that for disruptive events of low probability and large resulting consequences, the concept of 'expectation' of consequence cannot be used in decision making, but that the use of probabilistically based utility functions can conceptually assure a consistent approach in deciding the required level of safety. The use of utility functions of logarithmic form to assign weights to consequences involving different loss of life is explored (M.E.L.) [es

  9. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    Science.gov (United States)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and c.g. location and requirements on adapter stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adapter parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP-based approaches are also examined.
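
A minimal sketch of the sampling-based idea: draw payload and adapter parameters from assumed ranges, push them through a toy frequency model, and read sensitivities off the input-output correlations. The response model and parameter ranges below are hypothetical, not the paper's launch-vehicle analysis.

```python
import random

random.seed(1)

# Toy response surface: a lateral frequency driven by payload mass m and
# adapter stiffness k (spring-mass model; numbers are invented).
def frequency(m, k):
    return (k / m) ** 0.5 / (2 * 3.141592653589793)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ms, ks, fs = [], [], []
for _ in range(5000):
    m = random.uniform(900, 1100)        # kg, assumed payload mass range
    k = random.uniform(0.8e6, 1.2e6)     # N/m, assumed adapter stiffness range
    ms.append(m)
    ks.append(k)
    fs.append(frequency(m, k))

# Sampling-based global sensitivity: input-output correlation coefficients.
r_mass, r_stiff = pearson(ms, fs), pearson(ks, fs)
print(round(r_mass, 2), round(r_stiff, 2))
```

The signs and magnitudes of the correlations indicate which parameter limits (mass vs. stiffness) matter most for keeping the frequency in bounds.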

  10. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    Science.gov (United States)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the existing deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a
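
The first interpretation above, reading a probability of exceedance for a deterministic trigger straight off the predictive ensemble, reduces to a one-line calculation; the drawdown ensemble below is invented for illustration.

```python
# Ensemble of predicted drawdowns at a receptor, one value per model run
# (metres; illustrative numbers only).
drawdowns = [0.4, 0.9, 1.2, 1.8, 2.1, 2.3, 2.6, 3.0, 1.1, 0.7]

threshold = 2.0   # e.g. the two-metre trigger of the NSW policy
p_exceed = sum(d > threshold for d in drawdowns) / len(drawdowns)
print(p_exceed)   # empirical probability of exceeding the trigger
```

Under the second, fully probabilistic interpretation, a regulator would compare `p_exceed` against an acceptable exceedance probability instead of comparing a single predicted drawdown against the threshold.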

  11. Optimization-Based Approaches to Control of Probabilistic Boolean Networks

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2017-02-01

    Full Text Available Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, control theory of Boolean networks (BNs), which are well known as a model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods on optimal control of probabilistic Boolean networks (PBNs) are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained. The finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
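
A finite-time control problem for a PBN can be sketched by simulating a toy two-gene network under each candidate control and estimating the probability of reaching a target state. The network, probabilities, and horizon below are made up, and the optimization is by exhaustive comparison rather than the polynomial or integer programming reductions of the paper.

```python
import random

random.seed(2)

# Toy two-gene PBN: gene 1 updates via one of two Boolean functions chosen
# randomly each step (probability 0.7 / 0.3); gene 2 is driven directly by
# the binary control u, or by its own function when uncontrolled (u=None).
def step(x1, x2, u):
    f1 = (lambda a, b: a or b) if random.random() < 0.7 else (lambda a, b: a and b)
    nx1 = int(f1(x1, x2))
    nx2 = u if u is not None else int(not x1)
    return nx1, nx2

def prob_reach_target(u, target=(1, 1), horizon=3, trials=20_000):
    """Estimate P(reach target within horizon) under constant control u."""
    hits = 0
    for _ in range(trials):
        s = (0, 1)                      # assumed initial state
        for _ in range(horizon):
            s = step(*s, u)
            if s == target:
                hits += 1
                break
    return hits / trials

# Finite-time control: compare the two constant control actions.
print(round(prob_reach_target(1), 2), round(prob_reach_target(0), 2))
```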

  12. Probabilistic finite elements with dynamic limit bounds; a case study: 17th street flood wall, New Orleans

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2008-01-01

    The probabilistic approach provides a better understanding of failure mechanisms and occurrence probabilities as well as consequences of failure. Besides, main advantages of the probabilistic design in comparison with the deterministic design are: a more careful, more cost effective, and more

  13. Exact and approximate probabilistic symbolic execution for nondeterministic programs

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also … Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
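
The scheduler-synthesis objective, resolving nondeterminism to maximize the probability of reaching a target, can be illustrated on a tiny Markov decision process with value iteration; the states, actions, and transition probabilities below are invented.

```python
# trans[state][action] = list of (probability, next_state); invented MDP.
trans = {
    "s0": {"a": [(0.5, "goal"), (0.5, "dead")],
           "b": [(0.9, "s1"), (0.1, "dead")]},
    "s1": {"a": [(0.8, "goal"), (0.2, "dead")]},
}

def max_reach_prob(n_iters=100):
    """Value iteration for max probability of eventually reaching 'goal',
    plus the memoryless scheduler that attains it."""
    v = {"s0": 0.0, "s1": 0.0, "goal": 1.0, "dead": 0.0}
    for _ in range(n_iters):
        for s, acts in trans.items():
            v[s] = max(sum(p * v[t] for p, t in succ) for succ in acts.values())
    scheduler = {s: max(acts, key=lambda a: sum(p * v[t] for p, t in acts[a]))
                 for s, acts in trans.items()}
    return v, scheduler

v, sched = max_reach_prob()
print(round(v["s0"], 2), sched["s0"])
```

Note the synthesized scheduler prefers the indirect action "b" (0.9 × 0.8 = 0.72) over the direct gamble "a" (0.5), which is exactly the kind of resolution of nondeterminism the abstract describes.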

  14. An assessment of the acute dietary exposure to glyphosate using deterministic and probabilistic methods.

    Science.gov (United States)

    Stephenson, C L; Harris, C A; Clarke, R

    2018-02-01

    Use of glyphosate in crop production can lead to residues of the active substance and related metabolites in food. Glyphosate has never been considered acutely toxic; however, in 2015 the European Food Safety Authority (EFSA) proposed an acute reference dose (ARfD). This differs from the Joint FAO/WHO Meeting on Pesticide Residues (JMPR), which in 2016, in line with its existing position, concluded that an ARfD was not necessary for glyphosate. This paper makes a comprehensive assessment of short-term dietary exposure to glyphosate from potentially treated crops grown in the EU and imported third-country food sources. European Union and global deterministic models were used to make estimates of short-term dietary exposure (generally defined as up to 24 h). Estimates were refined using food-processing information, residues monitoring data, national dietary exposure models, and basic probabilistic approaches to estimating dietary exposure. Calculated exposure levels were compared to the ARfD, considered to be the amount of a substance that can be consumed in a single meal, or 24-h period, without appreciable health risk. Acute dietary intakes were … Probabilistic exposure estimates showed that the acute intake on no person-days exceeded 10% of the ARfD, even for the pessimistic scenario.

  15. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
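
Under the simplifying assumption that each conditional-loop pass is an independent retry with success probability p, the enhancement from running the processor in loops follows the geometric law 1 − (1 − p)^n; this is a sketch of that scaling, not the paper's specific correction scheme.

```python
# Success probability after n passes of a conditional loop, assuming each
# pass is an independent trial with per-pass success probability p.
def loop_success(p, n):
    return 1.0 - (1.0 - p) ** n

# With a modest per-pass probability, a few loop iterations already push
# the overall success probability close to one.
for n in (1, 2, 4, 8):
    print(n, round(loop_success(0.25, n), 3))
```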

  16. Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety

    Science.gov (United States)

    Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent

    2016-04-01

    The purpose of this study is to lay the basis for a probabilistic evaluation of flood hazard (PFHA). Probabilistic assessment of external floods has become a topic of current interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. These approaches aim to identify all possible failure scenarios, combining their probabilities, in order to cover all possible sources of risk. They are based on the distributions of initiators and/or the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of the phenomena relevant to the nuclear site, ii) identification and probabilization of parameters associated with the selected flooding phenomena, iii) propagation of the probabilized parameters from the source to the point of interest in the site, iv) obtaining hazard curves and aggregation of flooding phenomena contributions at the point of interest, taking into account the uncertainties. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Following the same logic, our study is ongoing to take into account other flooding phenomena and to carry out more case studies.
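
The final aggregation step can be sketched for the special case of independent phenomena, where the combined exceedance probability of a given water level is 1 − ∏(1 − p_i); the per-phenomenon probabilities below are illustrative only.

```python
# Combined annual exceedance probability of a given water level at the
# point of interest, assuming the contributing phenomena are independent.
def aggregate(p_list):
    combined = 1.0
    for p in p_list:
        combined *= (1.0 - p)
    return 1.0 - combined

# Illustrative exceedance probabilities of the same level from two
# phenomena (e.g. extreme rain and extreme sea level; values invented).
p_rain, p_sea = 1e-3, 5e-4
print(aggregate([p_rain, p_sea]))
```

Correlated phenomena (e.g. storm surge coinciding with rainfall) would need a joint model instead of this independence shortcut.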

  17. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India); Gupta, Shikha; Rai, Premanjali [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India)

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens and predicting the carcinogenic potency of chemicals in rodents were developed. The dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB) was used, which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with a mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSEs of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest the wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by optimal PNN model. Figure (b) shows generalization and predictive
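
A probabilistic neural network is essentially a Parzen-window classifier: each class is scored by the averaged Gaussian kernel between the query and that class's training points. The two-feature toy data and smoothing parameter below are invented, standing in for the twelve molecular descriptors used in the paper.

```python
import math

# Invented two-descriptor training data for the two carcinogenicity classes.
train = {
    "positive": [[0.9, 1.1], [1.0, 0.8], [1.2, 1.0]],
    "non-positive": [[-1.0, -0.9], [-1.1, -1.2], [-0.8, -1.0]],
}
SIGMA = 0.5   # Parzen smoothing parameter (assumed)

def kernel(x, c):
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
    return math.exp(-d2 / (2 * SIGMA ** 2))

def classify(x):
    """PNN decision: pick the class with the highest average kernel density."""
    scores = {label: sum(kernel(x, c) for c in pts) / len(pts)
              for label, pts in train.items()}
    return max(scores, key=scores.get)

print(classify([1.0, 1.0]), classify([-1.0, -1.0]))
```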

  18. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method, and it has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment - PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the systems and constructions of the plant. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the two methods are briefly reviewed and summarized. Based on the discussion of two application cases - one being the changes to specific design provisions of the general design criteria (GDC) and the other the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic method and probabilistic method are dialectical and unified, and that they are being merged into each other gradually and used in coordination. (authors)

  19. A Time-Varied Probabilistic ON/OFF Switching Algorithm for Cellular Networks

    KAUST Repository

    Rached, Nadhir B.; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2018-01-01

    In this letter, we develop a time-varied probabilistic on/off switching planning method for cellular networks to reduce their energy consumption. It consists of a risk-aware optimization approach that takes into consideration the randomness of the user profile associated with each base station (BS). The proposed approach jointly determines (i) the instants of time at which the current active BS configuration must be updated due to an increase or decrease of the network traffic load, and (ii) the minimum set of BSs to be activated to serve the networks’ subscribers. Probabilistic metrics modeling the traffic profile variation are developed to trigger this dynamic on/off switching operation. Selected simulations are then performed to validate the proposed algorithm for different system parameters.

  1. Probabilistic Hazard Estimation at a Densely Urbanised Area: the Naples Volcanoes

    Science.gov (United States)

    de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Claudia, T.

    2005-12-01

    The Naples volcanic area (Southern Italy), including Vesuvius, the Campi Flegrei caldera and Ischia island, is the highest-risk volcanic area in the world: more than 2 million people live within about 10 km of an active volcanic vent. Such an extreme risk calls for accurate methodologies aimed at quantifying it, in a probabilistic way, considering all the available volcanological information as well as modelling results. Simple hazard maps based on the observation of deposits from past eruptions have the major problem that the eruptive history generally samples a very limited number of possible outcomes, and is therefore of little use for estimating event probabilities in the area. This work describes a methodology making the best use (from a Bayesian point of view) of volcanological data and modelling results to compute probabilistic hazard maps for multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic-flow hazard, has been improved here and extended to compute fall-out hazard as well. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, provides, for the first time, a global picture of the areal distribution of the main hazards from multi-vent explosive eruptions. From a joint consideration of the hazard contributions from all three volcanic areas, new insight into the volcanic hazard distribution emerges, which will have strong implications for urban and emergency planning in the area.
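
The multi-vent combination at a given site can be sketched as follows, assuming (for the sketch only) independent sources; every probability below is invented for illustration.

```python
# Per-source annual eruption probabilities and conditional probabilities of
# impacting a given site (all values invented for illustration).
vents = {
    "Vesuvius":      {"p_eruption": 0.02,  "p_impact_at_site": 0.30},
    "Campi Flegrei": {"p_eruption": 0.01,  "p_impact_at_site": 0.50},
    "Ischia":        {"p_eruption": 0.005, "p_impact_at_site": 0.05},
}

# Treat the sources as independent and combine their contributions:
# P(site hit) = 1 - prod(1 - P(eruption) * P(impact | eruption)).
p_safe = 1.0
for v in vents.values():
    p_safe *= 1.0 - v["p_eruption"] * v["p_impact_at_site"]
p_hit = 1.0 - p_safe
print(round(p_hit, 5))
```

Repeating this calculation for every grid cell, with vent-opening and impact probabilities from the Bayesian analysis, yields a joint hazard map of the kind the abstract describes.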

  2. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  3. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result of a PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, and thus not an objective measure of risk; it is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation; standardisation of PFM methods is necessary; PFM seems to be the only way to obtain estimates of the pipe break probability, since service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity; and collection of service data should be directed towards the occurrence of growing cracks.

  4. Modelling probabilistic fatigue crack propagation rates for a mild structural steel

    Directory of Open Access Journals (Sweden)

    J.A.F.O. Correia

    2015-01-01

    Full Text Available A class of fatigue crack growth models based on elastic–plastic stress–strain histories at the crack tip region and local strain-life damage models has been proposed in the literature. The fatigue crack growth is regarded as a process of continuous crack initializations over successive elementary material blocks, which may be governed by smooth strain-life damage data. Some approaches account for the residual stresses developing at the crack tip in the actual crack driving force assessment, allowing mean stresses and loading sequence effects to be modelled. An extension of the fatigue crack propagation model originally proposed by Noroozi et al. (2005) to derive probabilistic fatigue crack propagation data is proposed, in particular concerning the derivation of probabilistic da/dN-ΔK-R fields. The elastic-plastic stresses at the vicinity of the crack tip, computed using simplified formulae, are compared with the stresses computed using elastic-plastic finite element analyses for the specimens considered in the experimental program proposed to derive the fatigue crack propagation data. Using probabilistic strain-life data available for the S355 structural mild steel, probabilistic crack propagation fields are generated, for several stress ratios, and compared with experimental fatigue crack propagation data. A satisfactory agreement between the predicted probabilistic fields and experimental data is observed.
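
A probabilistic da/dN-ΔK field can be sketched with a Paris-type law whose coefficient is lognormally scattered; C, m, and the scatter below are assumed values for illustration, not the S355 parameters identified in the paper.

```python
import math
import random

random.seed(3)

# Paris-type law da/dN = C * (DeltaK)^m with lognormal scatter on C.
M_EXP = 3.0                                   # Paris exponent (assumed)
LOG_C_MU, LOG_C_SD = math.log(1e-11), 0.3     # scatter of C (assumed)

def dadn_sample(delta_k):
    c = math.exp(random.gauss(LOG_C_MU, LOG_C_SD))
    return c * delta_k ** M_EXP

def dadn_percentiles(delta_k, q=(0.05, 0.5, 0.95), n=20_000):
    """Percentile crack growth rates at a given DeltaK (Monte Carlo)."""
    rates = sorted(dadn_sample(delta_k) for _ in range(n))
    return [rates[int(p * n)] for p in q]

# Percentile curves evaluated at DeltaK = 30 MPa*sqrt(m); sweeping DeltaK
# traces out the probabilistic da/dN-DeltaK field.
lo, med, hi = dadn_percentiles(30.0)
print(lo < med < hi)
```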

  5. A probabilistic model of ecosystem response to climate change

    International Nuclear Information System (INIS)

    Shevliakova, E.; Dowlatabadi, H.

    1994-01-01

    Anthropogenic activities are leading to rapid changes in land cover and emissions of greenhouse gases into the atmosphere. These changes can bring about climate change typified by average global temperatures rising by 1–5 °C over the next century. Climate change of this magnitude is likely to alter the distribution of terrestrial ecosystems on a large scale. Options available for dealing with such change are abatement of emissions, adaptation, and geoengineering. The integrated assessment of climate change demands that frameworks be developed where all the elements of the climate problem are present (from economic activity to climate change and its impacts on market and non-market goods and services). Integrated climate assessment requires multiple impact metrics and multi-attribute utility functions to simulate the response of different key actors/decision-makers to the actual physical impacts (rather than a dollar value) of the climate-damage vs. policy-cost debate. This necessitates direct modeling of ecosystem impacts of climate change. The authors have developed a probabilistic model of ecosystem response to global change. This model differs from previous efforts in that it is statistically estimated using actual ecosystem and climate data, yielding a joint multivariate probability of prevalence for each ecosystem, given climatic conditions. The authors expect this approach to permit simulation of inertia and competition, which have, so far, been absent in transfer models of continental-scale ecosystem response to global change. Thus, although the probability of one ecotype will dominate the others at a given point, others would have the possibility of establishing an early foothold.

  6. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    Full Text Available In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on the Monte-Carlo probabilistic design method. A novel probabilistic model of a turboshaft engine, consisting of a Monte-Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking into consideration the uncertainties in component performance, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with those of the deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.
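
The Monte-Carlo stacking of component uncertainties can be sketched with a toy power relation; the nominal model, efficiency distributions, and the percentile examined below are assumptions for illustration, not the paper's turboshaft model.

```python
import random
import statistics

random.seed(4)

# Toy overall-performance model: shaft power as the product of component
# efficiencies and a nominal power (all numbers invented).
def shaft_power(eta_c, eta_t):
    return 1000.0 * eta_c * eta_t   # kW

# Monte-Carlo generator: sample component efficiencies with assumed scatter.
samples = [shaft_power(random.gauss(0.85, 0.01), random.gauss(0.90, 0.01))
           for _ in range(50_000)]

mean_p = statistics.mean(samples)
p1 = sorted(samples)[int(0.01 * len(samples))]   # 1st percentile of power

# A probabilistic design sizes the engine so that this low percentile, not
# just the deterministic nominal value, meets the performance target.
print(round(mean_p, 1), p1 < mean_p)
```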

  7. Economic order quantity (EOQ) by game theory approach in probabilistic supply chain system under service level constraint for items with imperfect quality

    Science.gov (United States)

    Setiawan, R.

    2018-03-01

    In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from game theory: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the noncooperative strategy. This is a new game-theoretic approach to inventory systems in which a service level constraint governs a firm's moves.
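
    As a minimal numerical anchor for the imperfect-quality setting, the sketch below uses the classic square-root EOQ formula adjusted for a screened-out defective fraction (in the spirit of Salameh and Jaber's single-firm model, not the paper's two-level game-theoretic solution). All parameter values are invented.

```python
import math

def eoq_imperfect(D, K, h, p_defect):
    # Classic EOQ adjusted so that only the good fraction (1 - p_defect)
    # of each lot serves demand; a simplification, not the paper's model.
    good = 1.0 - p_defect
    return math.sqrt(2.0 * D * K / (h * good))

# D = annual demand, K = ordering cost, h = holding cost per unit per year.
Q = eoq_imperfect(D=10000, K=150, h=2.0, p_defect=0.04)
print(round(Q, 1))  # → 1250.0
```

    Relative to the perfect-quality EOQ, the defective fraction inflates the order quantity, since part of each lot is unusable.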

  8. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    Science.gov (United States)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate, or completely redefine their role in the analysis. This property, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  9. Model checking optimal finite-horizon control for probabilistic gene regulatory networks.

    Science.gov (United States)

    Wei, Ou; Guo, Zonghao; Niu, Yun; Liao, Wenyuan

    2017-12-14

    Probabilistic Boolean networks (PBNs) have been proposed for analyzing external control in gene regulatory networks with incorporation of uncertainty. A context-sensitive PBN with perturbation (CS-PBNp), extending a PBN with context-sensitivity to reflect the inherent biological stability and random perturbations to express the impact of external stimuli, is considered to be more suitable for modeling small biological systems subject to intervention by external conditions. In this paper, we apply probabilistic model checking, a formal verification technique, to optimal control for a CS-PBNp that minimizes the expected cost over a finite control horizon. We first describe a procedure for modeling a CS-PBNp using the language provided by the widely used probabilistic model checker PRISM. We then analyze the reward-based temporal properties and the computation in probabilistic model checking; based on this analysis, we provide a method to formulate the optimal control problem as minimum reachability reward properties. Furthermore, we incorporate control and state cost information into the PRISM code of a CS-PBNp such that automated model checking of a minimum reachability reward property on the code gives the solution to the optimal control problem. We conduct experiments on two examples, an apoptosis network and a WNT5A network. Preliminary experimental results show the feasibility and effectiveness of our approach. The approach based on probabilistic model checking for optimal control avoids explicit computation of the large state transition relations associated with PBNs. It enables a natural depiction of the dynamics of gene regulatory networks, and provides a canonical form to formulate optimal control problems using temporal properties that can be automatically solved by leveraging the analysis power of the underlying model checking engines. This work will be helpful for further utilization of the advances in formal verification techniques in systems biology.
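
    The finite-horizon optimal control problem that the paper encodes as a minimum reachability reward property in PRISM can be illustrated with plain backward dynamic programming on a toy controlled Markov chain. The two-state network, transition probabilities and control/terminal costs below are invented, not the apoptosis or WNT5A networks of the paper.

```python
# Toy finite-horizon optimal control of a two-state controlled Markov chain,
# the kind of problem PRISM solves as a minimum reward property.
states = (0, 1)          # 0 = undesirable, 1 = desirable gene-activity profile
actions = (0, 1)         # 0 = no intervention, 1 = intervene (unit cost)
P = {                    # P[s, a] = tuples of (next_state, probability)
    (0, 0): ((0, 0.9), (1, 0.1)),
    (0, 1): ((0, 0.3), (1, 0.7)),
    (1, 0): ((0, 0.2), (1, 0.8)),
    (1, 1): ((0, 0.1), (1, 0.9)),
}
control_cost = {0: 0.0, 1: 1.0}
terminal_cost = {0: 5.0, 1: 0.0}   # penalty for ending in the bad state

V = dict(terminal_cost)            # backward dynamic programming
for _ in range(5):                 # five-step control horizon
    V = {s: min(control_cost[a] + sum(p * V[t] for t, p in P[s, a])
                for a in actions)
         for s in states}

print(V)   # minimum expected cost-to-go from each starting state
```

    Model checking automates exactly this computation on the full, much larger state space, without the modeler writing the recursion by hand.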

  10. Hybrid biasing approaches for global variance reduction

    International Nuclear Information System (INIS)

    Wu, Zeyun; Abdel-Khalik, Hany S.

    2013-01-01

    A new variant of a Monte Carlo-deterministic hybrid variance reduction approach based on Gaussian process theory is presented for accelerating the convergence of Monte Carlo simulation, and is compared with the Forward-Weighted Consistent Adjoint Driven Importance Sampling (FW-CADIS) approach implemented in the SCALE package from Oak Ridge National Laboratory. The new approach, denoted the Gaussian process approach, treats the responses of interest as normally distributed random processes. The Gaussian process approach improves the selection of the weight windows of simulated particles by identifying a subspace that captures the dominant sources of statistical response variations. Like the FW-CADIS approach, the Gaussian process approach utilizes particle importance maps obtained from deterministic adjoint models to derive weight window biasing. In contrast to the FW-CADIS approach, the Gaussian process approach identifies the response correlations (via a covariance matrix) and employs them to reduce the computational overhead required for global variance reduction (GVR) purposes. The effective rank of the covariance matrix identifies the minimum number of uncorrelated pseudo responses, which are employed to bias simulated particles. Numerical experiments, serving as a proof of principle, are presented to compare the Gaussian process and FW-CADIS approaches in terms of the global reduction in standard deviation of the estimated responses. - Highlights: ► Hybrid Monte Carlo-deterministic method based on Gaussian process model is introduced. ► Method employs deterministic model to calculate response correlations. ► Method employs correlations to bias Monte Carlo transport. ► Method compared to FW-CADIS methodology in SCALE code. ► An order of magnitude speed-up is achieved for a PWR core model.
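
    The covariance-rank idea can be illustrated numerically: simulate correlated responses, form their covariance matrix, and count the dominant eigenvalues. The two-factor response model below is a made-up example (requiring NumPy), not the adjoint-derived importance maps of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulate 1000 realizations of 6 responses that in truth depend on only
# 2 underlying random factors, so the response covariance matrix has an
# effective rank of about 2. The mixing pattern is a made-up illustration.
factors = rng.standard_normal((1000, 2))
mixing = np.array([[1.0, 1.0, 1.0, 0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]])
responses = factors @ mixing + 0.01 * rng.standard_normal((1000, 6))

cov = np.cov(responses, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]            # descending eigenvalues
explained = np.cumsum(eigvals) / eigvals.sum()
eff_rank = int(np.searchsorted(explained, 0.99) + 1)
print(eff_rank)  # minimum number of uncorrelated pseudo-responses to bias on
```

    Biasing toward two pseudo-responses instead of six is where the claimed reduction in GVR overhead comes from.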

  11. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space-, and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  12. A probabilistic approach to cost and duration uncertainties in environmental decisions

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1996-01-01

    Sandia National Laboratories has developed a method for analyzing life-cycle costs using probabilistic cost forecasting and utility theory to determine the most cost-effective alternatives for safe interim storage of radioactive materials. The method explicitly incorporates uncertainties in cost and storage duration by (1) treating uncertain component costs as random variables represented by probability distributions, (2) treating uncertain durations as chance nodes in a decision tree, and (3) using stochastic simulation tools to generate life-cycle cost forecasts for each storage alternative. The method applies utility functions to the forecasted costs to incorporate the decision maker's risk preferences, making it possible to compare alternatives on the basis of both cost and cost utility. Finally, the method is used to help identify key contributors to the uncertainty in forecasted costs, in order to focus efforts aimed at reducing cost uncertainties. Where significant cost and duration uncertainties exist, and where programmatic decisions must be made despite these uncertainties, probabilistic forecasting techniques can yield important insights into decision alternatives, especially when used as part of a larger decision analysis framework and when properly balanced with deterministic analyses. Although the method is built around an interim storage example, it is potentially applicable to many other environmental decision problems.
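
    The forecasting scheme, (1) random component costs, (2) chance-node durations, and (3) stochastic simulation plus a utility function, can be sketched as follows. The two storage alternatives, cost figures, distributions and risk tolerance are invented for illustration, not Sandia's data.

```python
import math
import random
import statistics

random.seed(1)

def life_cycle_cost(capital_rng, annual_rng, p_extend):
    # One realization: uncertain capital and annual costs drawn from
    # triangular distributions, plus an uncertain storage duration of
    # 10 years extended a geometric number of times (the chance node).
    # All figures ($M, probabilities) are illustrative.
    capital = random.triangular(*capital_rng)
    annual = random.triangular(*annual_rng)
    years = 10
    while random.random() < p_extend:
        years += 1
    return capital + annual * years

def summarize(capital_rng, annual_rng, p_extend, n=20000, risk_tol=200.0):
    costs = [life_cycle_cost(capital_rng, annual_rng, p_extend) for _ in range(n)]
    # Exponential disutility penalizes high-cost outcomes more than
    # linearly, encoding a risk-averse decision maker.
    utility = statistics.fmean(-math.expm1(c / risk_tol) for c in costs)
    return statistics.fmean(costs), utility

mean_a, util_a = summarize(capital_rng=(40, 80), annual_rng=(2, 4), p_extend=0.5)
mean_b, util_b = summarize(capital_rng=(10, 30), annual_rng=(5, 9), p_extend=0.5)
print(f"alternative A: mean cost {mean_a:.1f} $M, expected utility {util_a:.3f}")
print(f"alternative B: mean cost {mean_b:.1f} $M, expected utility {util_b:.3f}")
```

    Ranking by expected utility rather than mean cost is what lets the comparison reflect risk preferences, not just averages.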

  13. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components...... in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus......, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  14. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  15. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    Science.gov (United States)

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  16. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  17. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    International Nuclear Information System (INIS)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha; Parisi, Carlo; Prescott, Steven R.; Gupta, Abhinav

    2016-01-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  18. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bolisetti, Chandu [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States); Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gupta, Abhinav [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  19. Recent advances in probabilistic species pool delineations

    Directory of Open Access Journals (Sweden)

    Dirk Nikolaus Karger

    2016-07-01

    Full Text Available A species pool is the set of species that could potentially colonize and establish within a community. It has been a commonly used concept in biogeography since the early days of MacArthur and Wilson’s work on Island Biogeography. Despite the simple and appealing definition, an operational application of species pools is bundled with a multitude of problems, which have often resulted in arbitrary decisions and workarounds when defining species pools. Two recently published papers address the operational problems of species pool delineations, and show ways of delineating them in a probabilistic fashion. In both papers, species pools were delineated using a process-based, mechanistic approach, which opens the door for a multitude of new applications in biogeography. Such applications include detecting the hidden signature of biotic interactions, disentangling the geographical structure of community assembly processes, and incorporating a temporal extent into species pools. Although similar in their conclusions, both ‘probabilistic approaches’ differ in their implementation and definitions. Here I give a brief overview of the differences and similarities of both approaches, and identify the challenges and advantages in their application.

  20. A study on the fatigue life prediction of tire belt-layers using probabilistic method

    International Nuclear Information System (INIS)

    Lee, Dong Woo; Park, Jong Sang; Lee, Tae Won; Kim, Seong Rae; Sung, Ki Deug; Huh, Sun Chul

    2013-01-01

    Tire belt separation failure is caused by internal cracks generated in the No. 1 and No. 2 belt layers and by their growth, and belt failure seriously affects tire endurance. Therefore, to improve tire endurance, it is necessary to analyze tire crack growth behavior and predict fatigue life. Generally, the prediction of tire endurance is performed experimentally using a tire test machine, but such experiments take considerable cost and time. In this paper, to predict tire fatigue life, we applied a deterministic fracture mechanics approach based on finite element analysis. Also, a probabilistic analysis method based on statistics using Monte Carlo simulation is presented. Both methods include a global-local finite element analysis to provide the detail necessary to model an internal crack explicitly and calculate the J-integral for tire life prediction.
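
    A probabilistic fatigue-life calculation of this general kind can be sketched by integrating the Paris crack-growth law under Monte Carlo sampling of the material constant and initial crack size. The constants below are generic placeholders, not the paper's belt-layer data or J-integral results.

```python
import math
import random

random.seed(3)

def cycles_to_failure(C, a0, a_crit=5e-3, dS=60.0, Y=1.12, m=3.0):
    # Closed-form integration of the Paris law da/dN = C*(dK)^m with
    # dK = Y*dS*sqrt(pi*a), valid for m != 2. Units are illustrative.
    k = C * (Y * dS * math.sqrt(math.pi)) ** m
    e = 1.0 - m / 2.0
    return (a_crit ** e - a0 ** e) / (k * e)

lives = sorted(
    cycles_to_failure(
        C=random.lognormvariate(math.log(1e-11), 0.3),  # material scatter
        a0=random.uniform(5e-5, 2e-4),                  # initial crack size, m
    )
    for _ in range(10000)
)
b10 = lives[len(lives) // 10]   # life at 10% failure probability
print(f"median life {lives[len(lives) // 2]:.3g} cycles, B10 {b10:.3g} cycles")
```

    The deterministic approach reports a single life; the probabilistic one yields the whole life distribution, from which quantiles such as B10 follow.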

  1. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality, and the associated impact on PSA probabilistic estimates.
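
    The aqueduct argument is an instance of the standard zero-failure bound: n years of history with no events limits the per-year probability to roughly 3/n at 95% confidence (the "rule of three"). A sketch, with the 2000-year span chosen only for illustration:

```python
# Zero-failure bound: if n years of history show no events, the per-year
# event probability is below 1 - (1 - conf)**(1/n) at confidence conf,
# which is approximately 3/n for conf = 0.95.
def zero_failure_bound(years, conf=0.95):
    return 1.0 - (1.0 - conf) ** (1.0 / years)

p = zero_failure_bound(2000)
print(f"bounded below {p:.2e} per year")
```

    Two millennia of undamaged structures thus support a bound on the order of 10^-3 per year, consistent with the figure quoted in the abstract.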

  2. Spatial probabilistic approach on landslide susceptibility assessment from high resolution sensors derived parameters

    International Nuclear Information System (INIS)

    Aman, S N A; Latif, Z Abd; Pradhan, B

    2014-01-01

    Landslide occurrence depends on various interrelated factors that ultimately set a massive mass of soil and rock debris moving downhill under gravity. LiDAR offers a progressive approach to landslide mitigation by permitting the construction of more accurate DEMs than other active spaceborne and airborne remote sensing techniques. The objective of this research is to assess landslide susceptibility in the Ulu Klang area by investigating the correlation between past landslide events and geo-environmental factors. A high-resolution LiDAR DEM was constructed to produce topographic attributes such as slope, curvature and aspect. These data were used to derive secondary landslide parameters such as the topographic wetness index (TWI), surface area ratio (SAR) and stream power index (SPI), as well as the NDVI generated from IKONOS imagery. Subsequently, a probability-based frequency ratio model was applied to establish the spatial relationship between the landslide locations and each landslide-related factor. Factor ratings were summed to obtain the Landslide Susceptibility Index (LSI), from which the landslide susceptibility map was constructed.
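
    The frequency ratio model referred to above is simple to state: for each class of a conditioning factor, divide the class's share of landslide cells by its share of all cells; summing the ratings a cell receives across all factors gives its LSI. A sketch for a single slope factor, with invented class counts:

```python
# Frequency ratio for one conditioning factor: the share of landslide cells
# falling in a factor class divided by that class's share of all cells.
# A ratio above 1 marks a class over-represented among past landslides.
def frequency_ratios(landslide_counts, total_counts):
    n_slide = sum(landslide_counts.values())
    n_total = sum(total_counts.values())
    return {cls: (landslide_counts[cls] / n_slide) / (total_counts[cls] / n_total)
            for cls in total_counts}

slope_fr = frequency_ratios(
    landslide_counts={"0-15": 5, "15-30": 25, ">30": 70},    # cells with slides
    total_counts={"0-15": 4000, "15-30": 3500, ">30": 2500}  # all cells
)
print({cls: round(fr, 3) for cls, fr in slope_fr.items()})
```

    In the full analysis the same calculation is repeated for every factor (curvature, aspect, TWI, SAR, SPI, NDVI), and each cell's LSI is the sum of the ratios of the classes it falls in.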

  3. Caffeine and paraxanthine in aquatic systems: Global exposure distributions and probabilistic risk assessment.

    Science.gov (United States)

    Rodríguez-Gil, J L; Cáceres, N; Dafouz, R; Valcárcel, Y

    2018-01-15

    This study presents one of the most complete applications of probabilistic methodologies to the risk assessment of emerging contaminants. Perhaps the most data-rich of these compounds, caffeine, as well as its main metabolite (paraxanthine), were selected for this study. Information for a total of 29,132 individual caffeine and 7442 paraxanthine samples was compiled, including samples where the compounds were not detected. The inclusion of non-detect samples (as censored data) in the estimation of environmental exposure distributions (EEDs) allowed for a realistic characterization of the global presence of these compounds in aquatic systems. EEDs were compared to species sensitivity distributions (SSDs), when possible, in order to calculate joint probability curves (JPCs) describing the risk to aquatic organisms. In this way, it was determined that unacceptable environmental risk (defined as 5% of the species being potentially exposed to concentrations able to cause effects in >5% of the cases) could be expected from chronic exposure to caffeine from effluent (28.4% of the cases), surface water (6.7% of the cases) and estuary water (5.4% of the cases). Probabilities of exceedance of acute predicted no-effect concentrations (PNECs) for paraxanthine were higher than 5% for all assessed matrices except drinking water and ground water; however, no experimental effect data were available for paraxanthine, resulting in a precautionary deterministic hazard assessment for this compound. Given the chemical similarities between the two compounds, the real effect thresholds, and thus the risk, for paraxanthine would be expected to be close to those observed for caffeine. Negligible human health risk from exposure to caffeine via drinking water or groundwater is expected from the compiled data.
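
    The EED-versus-SSD comparison behind a joint probability curve reduces, for one point on the curve, to an exceedance calculation between two distributions. The sketch below uses invented lognormal parameters (on a log10 scale), not the caffeine data set of the study.

```python
from statistics import NormalDist

# Exposure (EED) and species sensitivity (SSD) both taken as lognormal;
# risk at one point on the JPC is how often exposure exceeds the
# concentration hazardous to 5% of species (HC5). Parameters are invented.
eed = NormalDist(mu=0.0, sigma=1.2)    # log10 exposure concentration
ssd = NormalDist(mu=2.0, sigma=0.8)    # log10 species effect concentration

hc5 = ssd.inv_cdf(0.05)                # log10 HC5 from the SSD
p_exceed = 1.0 - eed.cdf(hc5)          # fraction of exposures above HC5
print(round(p_exceed, 3))
```

    Sweeping the protected fraction from 5% across its full range, rather than fixing it at HC5, traces out the complete joint probability curve.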

  4. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    International Nuclear Information System (INIS)

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One of the crucial things to review prior to any investment decision on a nuclear power plant (NPP) project is the calculation of project economics, including the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPP projects are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which the LUEC is sensitive is necessary so that cost overruns can be avoided. Therefore, this study aimed to perform a sensitivity analysis on the variables that affect the LUEC with a probabilistic approach. The analysis was done using a Monte Carlo technique that simulates the relationship between the uncertainty variables and their visible impact on the LUEC. The sensitivity analysis shows significant changes in the LUEC values of the AP1000 and OPR due to the sensitivity of the investment cost and capacity factor, whereas LUEC changes due to the sensitivity of the U3O8 price are not significant. (author)

  5. Probabilistic cumulative risk assessment of anti-androgenic pesticides in food.

    NARCIS (Netherlands)

    Müller, A.K.; Bosgra, S.; Boon, P.E.; van der Voet, H.; Nielsen, E.; Ladefoged, O.

    2009-01-01

    In this paper, we present a cumulative risk assessment of three anti-androgenic pesticides (vinclozolin, procymidone and prochloraz) using the relative potency factor (RPF) approach and an integrated probabilistic risk assessment (IPRA) model. RPFs for each substance were estimated for three

  6. Probabilistic cumulative risk assessment of anti-androgenic pesticides in food

    NARCIS (Netherlands)

    Muller, A.K.; Bosgra, S.; Boon, P.E.; Voet, van der H.; Nielsen, E.; Ladefoged, O.

    2009-01-01

    In this paper, we present a cumulative risk assessment of three anti-androgenic pesticides (vinclozolin, procymidone and prochloraz) using the relative potency factor (RPF) approach and an integrated probabilistic risk assessment (IPRA) model. RPFs for each substance were estimated for three

  7. [Academic review of global health approaches: an analytical framework].

    Science.gov (United States)

    Franco-Giraldo, Alvaro

    2015-09-01

    In order to identify perspectives on global health, this essay analyzes different trends from academia that have enriched global health and international health. A database was constructed with information from the world's leading global health centers. The search covered authors on global diplomacy and global health and was performed in PubMed, LILACS, and Google Scholar with the key words "global health" and "international health". Research and training centers in different countries have taken various academic approaches to global health; differing interests and ideological orientations have emerged in relation to the global health concept. Based on the mosaic of global health centers and their positions, the review concludes that the new concept reflects the construction of a paradigm of renewal in international health and global health, whose pre-paradigmatic stage has still not reached a final version.

  8. Principles or Imagination? Two Approaches to Global Justice

    NARCIS (Netherlands)

    Coeckelbergh, Mark

    2006-01-01

    In this paper I distinguish and discuss two approaches to global justice. One approach is Rawlsian and Kantian in inspiration. Discussions within this tradition typically focus on the question whether Rawls’s theory of justice (1971), designed for the national level, can or should be applied to the

  9. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose": a risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but one that also explores the "full range of defensible and reasonable parameter distributions"... System performance evaluations should not be unduly influenced by... "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where such values are used they need to be supported by scientific information showing that the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste.
Therefore, though there was sufficient confidence of safety

  10. Potential and limits to cluster-state quantum computing using probabilistic gates

    International Nuclear Information System (INIS)

    Gross, D.; Kieling, K.; Eisert, J.

    2006-01-01

    We establish bounds to the necessary resource consumption when building up cluster states for one-way computing using probabilistic gates. Emphasis is put on state preparation with linear optical gates, as the probabilistic character is unavoidable here. We identify rigorous general bounds to the necessary consumption of initially available maximally entangled pairs when building up one-dimensional cluster states with individually acting linear optical quantum gates, entangled pairs, and vacuum modes. As the known linear optics gates have a limited maximum success probability, as we show, this amounts to finding the optimal classical strategy of fusing pieces of linear cluster states. A formal notion of classical configurations and strategies is introduced for probabilistic nonfaulty gates. We study the asymptotic performance of strategies that can be simply described, and prove ultimate bounds to the performance of the globally optimal strategy. The arguments employ methods of random walks and convex optimization. This optimal strategy is also the one that requires the shortest storage time, and necessitates the fewest invocations of probabilistic gates. For two-dimensional cluster states, we find, for any elementary success probability, an essentially deterministic preparation of a cluster state with quadratic, hence optimal, asymptotic scaling in the use of entangled pairs. We also identify a percolation effect in state preparation, in that from a threshold probability on, almost all preparations will be either successful or fail. We outline the implications on linear optical architectures and fault-tolerant computations
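
    As a toy illustration of why the choice of classical strategy matters when fusion gates are probabilistic, the sketch below estimates the expected number of gate invocations under the naive strategy in which any failed fusion destroys the chain and forces a restart; building a length-n chain then requires n-1 consecutive successes, which has a closed-form expectation. The success probability and chain length are illustrative, and this is not the optimal strategy analyzed in the paper.

```python
import random

def invocations_naive(n_qubits, p, rng):
    """Gate invocations to build a length-n chain when any failure
    destroys the chain and forces a restart (naive strategy)."""
    needed = n_qubits - 1            # bonds still to create
    calls, streak = 0, 0
    while streak < needed:
        calls += 1
        streak = streak + 1 if rng.random() < p else 0
    return calls

def expected_naive(n_qubits, p):
    """Closed form: expected trials until n-1 consecutive successes."""
    k = n_qubits - 1
    return (1 - p ** k) / (p ** k * (1 - p))

rng = random.Random(0)
samples = [invocations_naive(4, 0.5, rng) for _ in range(20000)]
mc_mean = sum(samples) / len(samples)   # close to expected_naive(4, 0.5) = 14
```

    The exponential growth of this expectation with chain length is what motivates the smarter fusion strategies whose bounds the paper derives.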

  11. Unit commitment with probabilistic reserve: An IPSO approach

    International Nuclear Information System (INIS)

    Lee, Tsung-Ying; Chen, Chun-Lung

    2007-01-01

    This paper presents a new algorithm for the solution of the nonlinear optimal scheduling problem, named iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve solution quality and computational efficiency. IPSO is applied to solve the unit commitment with probabilistic reserve problem of a power system. The outage cost as well as the fuel cost of thermal units is considered in the unit commitment program to evaluate the level of spinning reserve. The optimal scheduling of on-line generation units is reached while minimizing the sum of fuel cost and outage cost. A 48-unit power system is used as a numerical example to test the new algorithm. The testing results show that the optimal scheduling of on-line generation units can be reached while satisfying the requirement of the objective function
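
    The iteration-best idea can be sketched generically: alongside the usual personal-best and global-best attractors, each velocity update also pulls toward the best particle of the current iteration. The sketch below minimizes a toy quadratic with made-up coefficients; it is a hedged illustration of the mechanism, not the authors' 48-unit commitment implementation.

```python
import random

def ipso_minimize(f, dim, iters=300, swarm=20, w=0.7, c1=1.2, c2=1.2, c3=0.6, seed=1):
    """PSO with an extra attraction toward the best particle of the
    current iteration ("iteration best"), in the spirit of IPSO."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(swarm)]
    vs = [[0.0] * dim for _ in range(swarm)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        ibest = min(xs, key=f)[:]          # iteration best
        for i in range(swarm):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d])
                            + c3 * rng.random() * (ibest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i][:]
    return gbest

# toy stand-in for a generation-cost surface, minimum at (3, -2)
cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 2.0) ** 2
best = ipso_minimize(cost, 2)
```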

  12. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of a probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the "(ab)normality" of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.

  13. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-01-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282
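
    The fusion step, in which two conditional PDFs are combined into one posterior whose mean is the minimum mean-square-error estimate, has a simple closed form if one assumes both conditionals are Gaussian (an assumption made here purely for illustration; the paper's PDFs are estimated from data). The HU numbers below are hypothetical.

```python
def fuse_gaussians(mu_i, var_i, mu_g, var_g):
    """Posterior from the product of two Gaussian conditional PDFs:
    precision-weighted mean (the MMSE estimate) and combined variance."""
    prec = 1.0 / var_i + 1.0 / var_g
    mean = (mu_i / var_i + mu_g / var_g) / prec
    return mean, 1.0 / prec

# hypothetical voxel: intensity channel suggests 400 HU (sd 100),
# atlas-geometry channel suggests 250 HU (sd 50)
mean, var = fuse_gaussians(400.0, 100.0 ** 2, 250.0, 50.0 ** 2)  # mean ≈ 280
```

    The tighter (geometry) channel dominates the fused estimate, which is the qualitative behavior the unifying framework relies on.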

  14. Suppression of panel flutter of near-space aircraft based on non-probabilistic reliability theory

    Directory of Open Access Journals (Sweden)

    Ye-Wei Zhang

    2016-03-01

    Full Text Available The active vibration control of composite panels with uncertain parameters in hypersonic flow is studied using non-probabilistic reliability theory. Using piezoelectric patches as active control actuators, the dynamic equations of the panel are established by the finite element method and Hamilton's principle, and the control model of the panel with uncertain parameters is obtained. Based on H∞ robust control theory and the non-probabilistic reliability index, the non-probabilistic reliability performance function is derived. Moreover, the relationships among the robust controller, the H∞ performance index, and reliability are established. Numerical results show that the control method, under the influence of reliability, the H∞ performance index, and approaching velocity, is effective for the vibration suppression of the panel over the whole interval of uncertain parameters.

  15. An integrated artificial neural networks approach for predicting global radiation

    International Nuclear Information System (INIS)

    Azadeh, A.; Maghsoudi, A.; Sohrabkhani, S.

    2009-01-01

    This article presents an integrated artificial neural network (ANN) approach for predicting solar global radiation from climatological variables. The integrated ANN trains and tests data with a multi-layer perceptron (MLP) approach, which has the lowest mean absolute percentage error (MAPE). The proposed approach is particularly useful for locations where no measurement equipment is available. Also, it considers all related climatological and meteorological parameters as input variables. To show the applicability and superiority of the integrated ANN approach, monthly data were collected for 6 years (1995-2000) in six nominal cities in Iran. A separate model for each city is considered and the quantity of solar global radiation in each city is calculated. Furthermore, an integrated ANN model has been introduced for prediction of solar global radiation. The acquired results of the integrated model show a high accuracy of about 94%. The results of the integrated model have been compared with the traditional Angström model to show its considerable accuracy. Therefore, the proposed approach can be used as an efficient tool for prediction of solar radiation in remote and rural locations with no direct measurement equipment.
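
    The Angström-type comparison model mentioned above is a one-line linear regression of the clearness index on relative sunshine duration. The sketch below fits it by least squares on synthetic monthly data (the coefficients and noise are invented, not the Iranian measurements) and evaluates it with MAPE, the error metric used in the article.

```python
import random

def fit_angstrom(s_ratio, h_ratio):
    """Least-squares fit of the Angstrom-type model H/H0 = a + b * (S/S0)."""
    n = len(s_ratio)
    mx = sum(s_ratio) / n
    my = sum(h_ratio) / n
    b = sum((x - mx) * (y - my) for x, y in zip(s_ratio, h_ratio)) / \
        sum((x - mx) ** 2 for x in s_ratio)
    return my - b * mx, b

def mape(actual, predicted):
    """Mean absolute percentage error, the article's accuracy metric."""
    return 100.0 / len(actual) * sum(abs((a - p) / a) for a, p in zip(actual, predicted))

rng = random.Random(3)
s = [rng.uniform(0.2, 0.9) for _ in range(48)]          # monthly sunshine fractions
h = [0.25 + 0.5 * x + rng.gauss(0, 0.01) for x in s]    # synthetic clearness index
a, b = fit_angstrom(s, h)
err = mape(h, [a + b * x for x in s])
```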

  16. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
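
    The conditional independence tests at the heart of the structure-learning step can be illustrated with a first-order partial correlation, a standard CI test for jointly Gaussian variables. The chain dependence X → Y → Z and the noise levels below are invented for illustration, not taken from the paper.

```python
import random, math

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def partial_corr(x, z, given):
    """First-order partial correlation of x and z controlling for `given`."""
    rxz, rxy, rzy = corr(x, z), corr(x, given), corr(z, given)
    return (rxz - rxy * rzy) / math.sqrt((1 - rxy ** 2) * (1 - rzy ** 2))

rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(4000)]
y = [a + rng.gauss(0, 0.3) for a in x]    # chain X -> Y -> Z
z = [b + rng.gauss(0, 0.3) for b in y]
# x and z are strongly correlated, yet conditionally independent given y
```

    A series of such tests over the reduced random variables is what lets the joint PDF be factorized into conditional distributions.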

  17. Towards a multilevel cognitive probabilistic representation of space

    Science.gov (United States)

    Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland

    2005-03-01

    This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological optic, in terms of objects and relationships between them. The hierarchical representation that we propose permits an efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.

  18. Probabilistic real-time contingency ranking method

    International Nuclear Information System (INIS)

    Mijuskovic, N.A.; Stojnic, D.

    2000-01-01

    This paper describes a real-time contingency ranking method based on a probabilistic index, the expected energy not supplied (EENS). In this way it is possible to take into account the stochastic nature of electric power system equipment outages. This approach enables a more comprehensive ranking of contingencies, and it is possible to form reliability cost values that can serve as the basis for hourly spot price calculations. The electric power system of Serbia is used as an example for the proposed method. (author)
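
    The expected energy not supplied index can be sketched by enumerating unit-outage states and weighting each state's unserved load by its probability. The two-unit system below uses hypothetical capacities and forced outage rates, not the Serbian system data.

```python
from itertools import product

def eens(units, load):
    """Expected energy not supplied (per hour) summed over all unit-outage
    states; `units` is a list of (capacity_MW, forced_outage_rate)."""
    total = 0.0
    for state in product([True, False], repeat=len(units)):  # True = available
        p, cap = 1.0, 0.0
        for (c, q), up in zip(units, state):
            p *= (1.0 - q) if up else q
            cap += c if up else 0.0
        total += p * max(load - cap, 0.0)
    return total

# two hypothetical 100 MW units with 10% forced outage rate, 150 MW load
value = eens([(100.0, 0.1), (100.0, 0.1)], 150.0)   # = 10.5 MWh per hour
```

    Ranking contingencies by the EENS they induce is then a matter of recomputing this sum with the contingency's outages forced.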

  19. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    Science.gov (United States)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvement for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  20. Probabilistic model of random uncertainties in structural dynamics for mis-tuned bladed disks; Modele probabiliste des incertitudes en dynamique des structures pour le desaccordage des roues aubagees

    Energy Technology Data Exchange (ETDEWEB)

    Capiez-Lernout, E.; Soize, Ch. [Universite de Marne la Vallee, Lab. de Mecanique, 77 (France)

    2003-10-01

    The mis-tuning of blades is frequently the cause of spatial localization of the dynamic forced response in the turbomachinery industry. The random character of mis-tuning requires the construction of probabilistic models of random uncertainties. A usual parametric probabilistic description considers the mis-tuning through the Young's modulus of each blade. This model mis-tunes the blade eigenfrequencies while assuming the blade modal shapes unchanged. Recently, a new approach known as a non-parametric model of random uncertainties has been introduced for modelling random uncertainties in elasto-dynamics. This paper proposes the construction of a non-parametric model which is coherent with all the uncertainties that characterize mis-tuning. As mis-tuning is a phenomenon which is independent from one blade to another, the structure is considered as an assemblage of substructures. The mean reduced matrix model required by the non-parametric approach is thus constructed by dynamic sub-structuring. A comparison with a usual parametric model adapted to mis-tuning is also carried out to study the influence of the non-parametric approach. A numerical example is presented. (authors)

  1. Understanding onsets of rainfall in Southern Africa using temporal probabilistic modelling

    CSIR Research Space (South Africa)

    Cheruiyot, D

    2010-12-01

    Full Text Available This research investigates an alternative approach to automatically evolve the hidden temporal distribution of onset of rainfall directly from multivariate time series (MTS) data in the absence of domain experts. Temporal probabilistic modelling...

  2. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by initial flaw distributions rather than by the fatigue crack growth rate, so initial flaw distributions can be derived from in-service flaw distributions
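
    The study's central observation, that under cyclic load the in-service flaw distribution is essentially the initial distribution pushed through a monotone growth law, can be sketched with a Paris-law Monte Carlo. All constants below (C, m, stress range, geometry factor, the lognormal initial distribution) are illustrative, not the paper's values.

```python
import random, math

def grow_crack(a0, cycles, C=1e-12, m=3.0, dsigma=100.0, Y=1.12):
    """Cycle-by-cycle Paris law: da/dN = C * dK**m, dK = Y*dsigma*sqrt(pi*a)."""
    a = a0
    for _ in range(cycles):
        dk = Y * dsigma * math.sqrt(math.pi * a)
        a += C * dk ** m
    return a

rng = random.Random(5)
# sample an initial flaw-size distribution (lognormal, metres) and grow it
initial = sorted(rng.lognormvariate(math.log(1e-3), 0.3) for _ in range(50))
inservice = [grow_crack(a0, 5000) for a0 in initial]
```

    Because the per-cycle growth map is strictly increasing in the current crack size, the ordering of flaws is preserved: the in-service distribution is determined by the initial one, as the abstract concludes.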

  3. Probabilistic precursor analysis - an application of PSA

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Gopika, V.; Sanyasi Rao, V.V.S.; Vaze, K.K.

    2011-01-01

    Incidents are inevitably part of the operational life of any complex industrial facility, and it is hard to predict how various contributing factors combine to cause the outcome. However, it should be possible to detect the existence of latent conditions that, together with the triggering failure(s), result in abnormal events. These incidents are called precursors. A precursor study, by definition, focuses on how a particular event might have developed adversely. This paper focuses on events which can be analyzed to assess their potential to develop into a core damage situation, looks into extending Probabilistic Safety Assessment techniques to precursor studies, and explains the benefits through a typical case study. A preliminary probabilistic precursor analysis has been carried out for a typical NPP. The major advantage of this approach is the strong potential for augmenting event analysis, which is currently carried out on a purely deterministic basis. (author)
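
    The core quantity in a probabilistic precursor study is the conditional core damage probability (CCDP): failure probabilities of systems already observed to have failed are set to one, and the remaining model is requantified. A minimal sketch, with a hypothetical two-system mitigation chain rather than a real PSA model:

```python
def ccdp(failure_probs, observed_failed):
    """Conditional core damage probability: systems already observed failed
    get probability 1; the rest keep their nominal failure probabilities."""
    p = 1.0
    for system, prob in failure_probs.items():
        p *= 1.0 if system in observed_failed else prob
    return p

# hypothetical chain: core damage requires both mitigation systems to fail
tree = {"aux_feedwater": 1e-3, "feed_and_bleed": 1e-2}
baseline = ccdp(tree, set())                  # nominal core damage probability
precursor = ccdp(tree, {"aux_feedwater"})     # CCDP given the observed failure
```

    The ratio of the two numbers is one common measure of how significant the precursor was.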

  4. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, so-called static PSA (S-PSA) model. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.
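
    In a real D-PSA the branchings of the dynamic event tree are driven by a plant simulator, so branch probabilities and timings depend on the evolving physical state. The bookkeeping alone can be sketched by enumerating branch outcomes at successive times and carrying each sequence's probability; the SGTR-flavored branch points and probabilities below are invented for illustration.

```python
from itertools import product

def det_sequences(branch_points):
    """Enumerate event-tree sequences: each branch point (name, failure_prob)
    splits into success/failure; each sequence carries its probability."""
    seqs = []
    for outcomes in product([False, True], repeat=len(branch_points)):
        p = 1.0
        for (name, q), failed in zip(branch_points, outcomes):
            p *= q if failed else 1.0 - q
        seqs.append((outcomes, p))
    return seqs

# hypothetical SGTR-flavored branch points at successive simulation times
points = [("operator_isolates_SG", 0.1), ("HPSI_runs", 0.01), ("RHR_available", 0.05)]
seqs = det_sequences(points)
total = sum(p for _, p in seqs)   # sequence probabilities sum to 1
```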

  5. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2017-01-01

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, so-called static PSA (S-PSA) model. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective

  6. Probabilistic graphical models to deal with age estimation of living persons.

    Science.gov (United States)

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
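
    The Bayesian core of the method, a posterior over chronological age given an observed maturity stage from which P(age ≥ threshold) follows directly, can be sketched on a discrete age grid. The prior and stage likelihoods below are hypothetical, not the clavicular-epiphysis data used in the paper.

```python
def posterior_age(prior, likelihood, stage):
    """P(age | stage) ∝ P(stage | age) * P(age) on a discrete age grid."""
    unnorm = {a: prior[a] * likelihood[a][stage] for a in prior}
    z = sum(unnorm.values())
    return {a: v / z for a, v in unnorm.items()}

# hypothetical prior and stage likelihoods (NOT the clavicular data)
prior = {16: 0.25, 18: 0.25, 20: 0.25, 22: 0.25}
likelihood = {16: {"fused": 0.05}, 18: {"fused": 0.30},
              20: {"fused": 0.70}, 22: {"fused": 0.90}}
post = posterior_age(prior, likelihood, "fused")
p_adult = post[18] + post[20] + post[22]   # P(age >= 18 | stage observed)
```

    Reporting the full posterior, rather than a point estimate, is precisely the transparency about uncertainty that the paper argues for.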

  7. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational and rotational perturbations to both pre- and posttherapy ADC maps, then repeating the calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS using receiver operating characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in
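
    The perturbation idea can be sketched in one dimension: re-run the voxel-wise fDM classification under many random registration shifts and report, per voxel, the fraction of shifts under which it is classified as a significant ADC decrease. The profile, threshold and shift model below are toys, not the study's 3-D registration.

```python
import random

def probabilistic_fdm(pre, post, threshold, n_shifts, rng):
    """Per voxel: fraction of random registration shifts under which the
    ADC decrease (pre - post) exceeds `threshold` (1-D toy of the idea)."""
    n = len(pre)
    counts = [0] * n
    for _ in range(n_shifts):
        s = rng.choice([-1, 0, 1])        # one global mis-registration per trial
        for i in range(n):
            j = min(max(i + s, 0), n - 1) # clamp shifted index at the edges
            if pre[i] - post[j] > threshold:
                counts[i] += 1
    return [c / n_shifts for c in counts]

rng = random.Random(2)
pre  = [1.0, 1.0, 1.0, 1.0, 1.0]
post = [1.0, 0.4, 0.4, 0.4, 1.0]   # central ADC drop after therapy
prob = probabilistic_fdm(pre, post, 0.3, 200, rng)
```

    Voxels deep inside the lesion stay classified under every shift, while boundary voxels acquire intermediate probabilities, which is the behavior that makes the map robust to registration error.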

  8. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast-predictive values-are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
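
    The decomposition property mentioned above is concrete for the Brier score: binning forecast-outcome pairs by distinct forecast values, the score splits exactly into reliability - resolution + uncertainty (Murphy's decomposition). The forecast and outcome toy data below are illustrative.

```python
def brier_decomposition(forecasts, outcomes):
    """Murphy decomposition: Brier = reliability - resolution + uncertainty,
    binning forecast-outcome pairs by distinct forecast value."""
    n = len(forecasts)
    base = sum(outcomes) / n
    bins = {}
    for f, o in zip(forecasts, outcomes):
        bins.setdefault(f, []).append(o)
    rel = sum(len(os) * (f - sum(os) / len(os)) ** 2 for f, os in bins.items()) / n
    res = sum(len(os) * (sum(os) / len(os) - base) ** 2 for os in bins.values()) / n
    unc = base * (1 - base)
    brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n
    return brier, rel - res + unc

forecast = [0.8, 0.8, 0.8, 0.2, 0.2, 0.2, 0.2, 0.5, 0.5, 0.5]
outcome  = [1,   1,   0,   0,   0,   1,   0,   1,   0,   1]
brier, recomposed = brier_decomposition(forecast, outcome)
```

    It is exactly this split into interpretable components (calibration versus discrimination) that the article identifies as the advantage of scoring rules for disease forecasts.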

  9. A PROBABILISTIC DEMAND APPLICATION IN THE AMERICAN CRACKER MARKET

    Directory of Open Access Journals (Sweden)

    Rutherford Cd. Johnson

    2016-07-01

    Full Text Available Knowledge of the distribution of consumer buying strategies by producers may permit improved marketing strategies and improved ability to respond to volatile market conditions. In order to investigate potential ways of gaining such knowledge, this study extends the work of Kahneman, Russell, and Thaler through the application of a probabilistic demand framework using Choice Wave theory, based on the Schrödinger Wave Equation in quantum mechanics. Probabilistic variability of response to health information and its potential influence on buying strategies is also investigated, extending the work of Clement, Johnson, Hu, Pagoulatos, and Debertin. In the present study, the domestic cracker market within fourteen U.S. metropolitan areas is segmented, using the Choice Wave Probabilistic Demand approach, into two statistically independent “Consumer Types” of seven metropolitan areas each. The two Consumer Types are shown to have statistically different elasticities than each other and from the combined market as a whole. This approach may provide not only improved marketing strategies through improved awareness of consumer preferences and buying strategies, especially in the volatile agricultural sector, but also may be useful in aiding producers of store brand/private label products in finding desirable markets in which to compete against national brands. The results also suggest that supply/producer-side strategies should take into account the ways in which information, whether under the direct control of the producer or not, may influence and change consumer buying strategies.

  10. Probabilistic Latent Semantic Analyses (PLSA) in Bibliometric Analysis for Technology Forecasting

    Directory of Open Access Journals (Sweden)

    Wang Zan

    2007-03-01

    Full Text Available Due to the availability of internet-based abstract services and patent databases, bibliometric analysis has become one of the key technology forecasting approaches. Recently, latent semantic analysis (LSA) has been applied to improve the accuracy of document clustering. In this paper, a new LSA method, probabilistic latent semantic analysis (PLSA), which uses probabilistic methods and algebra to search the latent space in the corpus, is further applied to document clustering. The results show that PLSA is more accurate than LSA and that the improved iteration method proposed by the authors can simplify the computing process and improve computing efficiency.
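
    A minimal PLSA can be written directly from its mixture form P(d, w) = Σ_z P(z) P(d|z) P(w|z), fitted by EM; the property worth testing is that the log-likelihood never decreases across iterations. The tiny document-term counts below are invented, and this is a generic EM, not the authors' improved iteration method.

```python
import math, random

def plsa(counts, k, iters=30, seed=0):
    """EM for PLSA: P(d, w) = sum_z P(z) * P(d|z) * P(w|z)."""
    rng = random.Random(seed)
    nd, nw = len(counts), len(counts[0])
    pz = [1.0 / k] * k
    pdz = [[rng.random() for _ in range(nd)] for _ in range(k)]
    pwz = [[rng.random() for _ in range(nw)] for _ in range(k)]
    for row in pdz + pwz:                      # normalise the random init
        s = sum(row)
        row[:] = [v / s for v in row]
    log_liks = []
    for _ in range(iters):
        ll = 0.0
        npz = [0.0] * k
        npdz = [[0.0] * nd for _ in range(k)]
        npwz = [[0.0] * nw for _ in range(k)]
        for d in range(nd):                    # E-step: responsibilities
            for w in range(nw):
                if counts[d][w] == 0:
                    continue
                joint = [pz[z] * pdz[z][d] * pwz[z][w] for z in range(k)]
                tot = sum(joint)
                ll += counts[d][w] * math.log(tot)
                for z in range(k):
                    r = counts[d][w] * joint[z] / tot   # expected count
                    npz[z] += r
                    npdz[z][d] += r
                    npwz[z][w] += r
        total = sum(npz)                       # M-step: renormalise counts
        pz = [v / total for v in npz]
        pdz = [[v / sum(row) for v in row] for row in npdz]
        pwz = [[v / sum(row) for v in row] for row in npwz]
        log_liks.append(ll)
    return log_liks

docs = [[4, 3, 0, 0], [3, 4, 1, 0], [0, 1, 4, 3], [0, 0, 3, 4]]
lls = plsa(docs, 2)
```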

  11. Probabilistic studies for safety at optimum cost

    International Nuclear Information System (INIS)

    Pitner, P.

    1999-01-01

    By definition, the risk of failure of very reliable components is difficult to evaluate. How can the best strategies for in service inspection and maintenance be defined to limit this risk to an acceptable level at optimum cost? It is not sufficient to design structures with margins, it is also essential to understand how they age. The probabilistic approach has made it possible to develop well proven concepts. (author)

  12. Evaluation Methodology between Globalization and Localization Features Approaches for Skin Cancer Lesions Classification

    Science.gov (United States)

    Ahmed, H. M.; Al-azawi, R. J.; Abdulhameed, A. A.

    2018-05-01

    Huge efforts have been put into the development of diagnostic methods for skin cancer. In this paper, two different approaches are addressed for detecting skin cancer in dermoscopy images. The first approach uses a global method that classifies skin lesions using global features, whereas the second approach uses a local method that classifies skin lesions using local features. The aim of this paper is to select the best approach for skin lesion classification. The dataset used in this paper consists of 200 dermoscopy images from Pedro Hispano Hospital (PH2). The achieved results are: sensitivity about 96%, specificity about 100%, precision about 100%, and accuracy about 97% for the globalization approach, while sensitivity, specificity, precision, and accuracy are all about 100% for the localization approach. These results show that the localization approach achieved acceptable accuracy, better than the globalization approach, for skin cancer lesion classification.
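
    The four reported metrics follow directly from confusion-matrix counts. The sketch below recomputes them for hypothetical counts on a 200-image dataset; the per-class split and error counts are assumptions, since the abstract reports only the rates.

```python
def metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, precision and accuracy from confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision":   tp / (tp + fp),
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
    }

# hypothetical counts for a 200-image dataset (40 malignant, 160 benign)
m = metrics(tp=38, tn=156, fp=4, fn=2)
```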

  13. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    Science.gov (United States)

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  14. Probabilistic Fatigue Design of Composite Material for Wind Turbine Blades

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2011-01-01

    In the present paper a probabilistic design approach to fatigue design of wind turbine blades is presented. The physical uncertainty on the fatigue strength for composite material is estimated using public available fatigue tests. Further, the model uncertainty on Miner rule for damage accumulation...

  15. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    Science.gov (United States)

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  16. Probabilistic environmental risk characterization of pharmaceuticals in sewage treatment plant discharges.

    Science.gov (United States)

    Christensen, Anne Munch; Markussen, Bo; Baun, Anders; Halling-Sørensen, Bent

    2009-10-01

    The occurrence of pharmaceuticals in different water bodies and the findings of effects on aquatic organisms in ecotoxicity tests have raised concerns about environmental risks of pharmaceuticals in receiving waters. Because the number of ecotoxicological studies has increased significantly during the last decade, probabilistic approaches for risk characterization of these compounds may now be feasible. This approach was evaluated by applying it to 22 human-use pharmaceuticals, covering both high-volume and highly ecotoxic pharmaceuticals, using ecotoxicological effect data from laboratory studies and comparing these to monitoring data on the effluents from sewage treatment plants in Europe and pharmaceutical sales quantities. We found that for 19 of the 22 selected pharmaceuticals the existing data were sufficient for probabilistic risk characterizations. The subsequently modeled ratios between monitored concentrations and low-effect concentrations were mostly above a factor of 100. Compared to the current paradigm for EU environmental risk assessment, where a safety factor of 10 or 100 might have been used, the modeled compounds therefore appear to pose a low environmental risk. However, similarly calculated ratios for five pharmaceuticals (propranolol, ibuprofen, furosemide, ofloxacin, and ciprofloxacin) were below 100, and ibuprofen and ciprofloxacin are considered to be of high concern due to the lack of ecotoxicity studies. This paper shows that by applying probabilistic approaches, existing data can be used to execute a comprehensive study on the probability of impacts, thereby contributing to a more comprehensive environmental risk assessment of pharmaceuticals.
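The probabilistic risk characterization described in this record can be illustrated with a simple Monte Carlo sketch: compare a distribution of monitored effluent concentrations with a distribution of low-effect concentrations, and estimate how often their ratio exceeds the factor of 100 discussed above. All distributions and numbers here are hypothetical, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100_000
# hypothetical lognormal models (ng/L) for one pharmaceutical
mec = rng.lognormal(np.log(50.0), 0.8, n)       # monitored effluent concentration
lec = rng.lognormal(np.log(20_000.0), 0.6, n)   # low-effect concentration

ratio = lec / mec
# fraction of draws in which the safety margin exceeds a factor of 100
margin_ok = float(np.mean(ratio > 100.0))
```

A margin_ok close to 1 would correspond to the "mostly above a factor of 100" finding; values well below 1 flag compounds of potential concern.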

  17. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    Science.gov (United States)

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. The uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented, according to its nature, as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information as directly as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  18. Study of Power Fluctuation from Dispersed Generations and Loads and its Impact on a Distribution Network through a Probabilistic Approach

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2007-01-01

    In order to assess the performance of distribution system under normal operating conditions with large integration of renewable energy based dispersed generation (DG) units, probabilistic modeling of the distribution system is necessary in order to take into consideration the stochastic behavior...... of load demands and DG units such as wind generation and combined heat and power plant generation. This paper classifies probabilistic models of load demands and DG units into summer and winter period, weekday and weekend as well as in 24 hours a day. The voltage results from the probabilistic load flow...

  19. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states: probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case in which commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states.

  20. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet...... Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions...... and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  1. Uses of probabilistic estimates of seismic hazard and nuclear power plants in the US

    International Nuclear Information System (INIS)

    Reiter, L.

    1983-01-01

    The use of probabilistic estimates is playing an increased role in the review of seismic hazard at nuclear power plants. The NRC Geosciences Branch emphasis has been on using these estimates in a relative rather than absolute manner and on gaining insight into other approaches. Examples of this use include estimates to determine design levels, to determine equivalent hazard at different sites, to help define more realistic seismotectonic provinces, and to assess the levels of acceptable risk implied by deterministic methods. Increased use of probabilistic estimates is expected. Probabilistic estimates of seismic hazard have a potential for misuse, however, and their successful integration into decision making requires that they not be divorced from physical insight and scientific intuition.

  2. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  3. Probabilistic Space Weather Forecasting: a Bayesian Perspective

    Science.gov (United States)

    Camporeale, E.; Chandorkar, M.; Borovsky, J.; Carè, A.

    2017-12-01

    Most Space Weather forecasts, at both the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to the finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process (GP) regression and classification. Here, we present the application of Gaussian Processes to the problems of forecasting the Dst geomagnetic index, classifying the solar wind type, and estimating diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, at very modest computational cost.
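A minimal, self-contained sketch of Gaussian Process regression, the core technique this abstract relies on, is shown below. It follows the standard textbook posterior formulas; the kernel, noise level, and toy data are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and standard deviation of GP regression."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# toy "time series": the forecast comes with a predictive distribution,
# not just a point estimate
x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x)
mean, std = gp_predict(x, y, np.array([np.pi / 2]))
```

The returned std is what turns a point forecast into a confidence statement, which is the property the abstract argues is missing from most Space Weather models.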

  4. An integral approach to the use of probabilistic risk assessment methods

    International Nuclear Information System (INIS)

    Schwarzblat, M.; Arellano, J.

    1987-01-01

    In this chapter, some of the work developed at the Instituto de Investigaciones Electricas in the area of probabilistic risk analysis is presented. Work in this area has basically focused in two directions: development and implementation of methods, and applications to real systems. The first part of this paper describes methods development and implementation, presenting an integrated package of computer programs for fault tree analysis. In the second part, some of the most important applications developed for real systems are presented. (author)

  5. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  6. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not just by a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection, and to evaluate whether it is really worthwhile.
Please note: The Netherlands

  7. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.

    Science.gov (United States)

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approached the prediction of an ensemble of probabilistic trees from the perspectives of a mixture distribution and of a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic data and the Cancer Cell Line Encyclopedia dataset, and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error.
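The mixture-distribution view of an ensemble of probabilistic trees mentioned above can be sketched as follows: given per-tree predictive means and variances plus tree weights, the ensemble mean and variance follow from the law of total variance, and a normal approximation yields a confidence interval. The per-tree numbers below are hypothetical, and this is a sketch of the general idea rather than the authors' weight-optimization method.

```python
import numpy as np

def mixture_interval(mu, var, w, z=1.96):
    """Mean, variance, and normal-approximation CI of a weighted
    mixture of per-tree predictive distributions N(mu_i, var_i)."""
    mu, var = np.asarray(mu, float), np.asarray(var, float)
    w = np.asarray(w, float) / np.sum(w)
    mean = np.sum(w * mu)
    # law of total variance for a mixture distribution
    variance = np.sum(w * (var + mu ** 2)) - mean ** 2
    half = z * np.sqrt(variance)
    return mean, variance, (mean - half, mean + half)

# hypothetical per-tree outputs for one test sample
mu = np.array([2.0, 2.4, 1.8])
var = np.array([0.10, 0.20, 0.15])
mean, variance, ci = mixture_interval(mu, var, [1, 1, 1])
```

Reweighting the trees (the `w` argument) changes both the mean and the interval width, which is the lever the paper optimizes.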

  8. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). The framework considers the following areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  9. Computer aided probabilistic assessment of containment integrity

    International Nuclear Information System (INIS)

    Tsai, J.C.; Touchton, R.A.

    1984-01-01

    In the probabilistic risk assessment (PRA) of a nuclear power plant, three probability-based techniques are widely used for event sequence frequency quantification (including nodal probability estimation): event tree analysis, fault tree analysis, and the Bayesian approach for database development. In the barrier analysis for assessing radionuclide release to the environment in a PRA study, these techniques are employed to a greater extent in estimating conditions which could lead to failure of the fuel cladding and the reactor coolant system (RCS) pressure boundary, but to a lesser degree in the containment pressure boundary failure analysis. The main reason is that containment issues are currently still in a state of flux. In this paper, the authors briefly describe the computer programs currently used by the nuclear industry for event tree analyses, fault tree analyses, and Bayesian updating. The authors discuss how these computer-aided probabilistic techniques might be adopted for failure analysis of the containment pressure boundary.

  10. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences, such as overconfidence in assessing uncertainties and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
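The MAIMS principle lends itself to a short Monte Carlo illustration: if every cost element spends at least its allocated budget, underruns cannot offset overruns, and the total cost distribution shifts upward. The three-parameter Weibull cost elements and budgets below are hypothetical, and the paper's correlation structure is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_cost(shape, scale, location, n):
    """Three-parameter Weibull draws for one cost element."""
    return location + scale * rng.weibull(shape, n)

# hypothetical cost elements: (shape, scale, location, allocated budget)
elements = [(2.0, 4.0, 10.0, 14.0),
            (1.5, 6.0, 20.0, 25.0)]

n = 100_000
total_ideal = np.zeros(n)   # underruns offset overruns
total_maims = np.zeros(n)   # "Money Allocated Is Money Spent"
for shape, scale, loc, budget in elements:
    cost = sample_cost(shape, scale, loc, n)
    total_ideal += cost
    total_maims += np.maximum(cost, budget)  # underruns are spent anyway
```

Comparing the two totals shows why an "ideal" roll-up systematically underestimates project cost.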

  11. Global floor planning approach for VLSI design

    International Nuclear Information System (INIS)

    LaPotin, D.P.

    1986-01-01

    Within a hierarchical design environment, initial decisions regarding the partitioning and choice of module attributes greatly impact the quality of the resulting IC in terms of area and electrical performance. This dissertation presents a global floor-planning approach which allows designers to quickly explore layout issues during the initial stages of the IC design process. In contrast to previous efforts, which address the floor-planning problem from a strict module placement point of view, this approach considers floor-planning from an area planning point of view. The approach is based upon a combined min-cut and slicing paradigm, which ensures routability. To provide flexibility, modules may be specified as having a number of possible dimensions and orientations, and I/O pads as well as layout constraints are considered. A slicing-tree representation is employed, upon which a sequence of traversal operations are applied in order to obtain an area efficient layout. An in-place partitioning technique, which provides an improvement over previous min-cut and slicing-based efforts, is discussed. Global routing and module I/O pin assignment are provided for floor-plan evaluation purposes. A computer program, called Mason, has been developed which efficiently implements the approach and provides an interactive environment for designers to perform floor-planning. Performance of this program is illustrated via several industrial examples

  12. Cutting costs through detailed probabilistic fire risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luiz; Huser, Asmund; Vianna, Savio [Det Norske Veritas PRINCIPIA, Rio de Janeiro, RJ (Brazil)

    2004-07-01

    A new procedure for calculating fire risks to offshore installations has been developed. The purposes of the procedure are to calculate the escalation and impairment frequencies to be applied in quantitative risk analyses, to optimize the Passive Fire Protection (PFP) arrangement, and to optimize other fire mitigation means. The novelties of the procedure are that it uses state-of-the-art Computational Fluid Dynamics (CFD) models to simulate fires and radiation, as well as a probabilistic approach to decide the dimensioning fire loads. A CFD model of an actual platform was used to investigate the dynamic properties of a large set of jet fires, resulting in detailed knowledge of the important parameters that decide the severity of offshore fires. These results are applied to design the procedure. A potential increase in safety is further obtained for those conditions where simplified tools may have failed to predict abnormal heat loads due to geometrical effects. Using a field example, it is indicated that the probabilistic approach can give significant reductions in PFP coverage with corresponding cost savings, while still keeping the risk at an acceptable level. (author)

  13. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    Science.gov (United States)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. 
Validation of any event or

  14. Applying probabilistic temporal and multisite data quality control methods to a public health mortality registry in Spain: a systematic approach to quality control of repositories.

    Science.gov (United States)

    Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M

    2016-11-01

    To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes with time. The methods are suited to big data and multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Punctual temporal anomalies were noticed due to a punctual increment in the missing data, along with outlying and clustered health departments due to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, or other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be a part of systematic DQ procedures. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
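One way to quantify the multisite variability in data distributions described above, though not necessarily the authors' exact metric, is an information-theoretic distance such as the Jensen-Shannon divergence between per-site histograms. The health-department histograms below are invented for illustration.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (bits) between two histograms."""
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        return float(np.sum(a * np.log2((a + eps) / (b + eps))))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# hypothetical cause-of-death code histograms for three health departments
dept_a = [120, 80, 40, 10]
dept_b = [118, 83, 39, 11]   # similar coding practice
dept_c = [40, 30, 100, 90]   # outlying department
```

A department whose divergence from the pooled distribution is large, or a year whose divergence from the previous year jumps, would surface as the kind of outlier or temporal anomaly the study reports.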

  15. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near-crack-tip singular strain embedded in the element is used. Probabilistic distributions of the stress intensity factors, such as their expectation, covariance, and correlation, are calculated for random load, random material properties, and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
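The probability of fracture that the PFEM estimates analytically can also be approximated by crude Monte Carlo sampling, as in this hedged sketch for a center-cracked panel under remote stress. The distributions and numbers are illustrative assumptions; the PFEM itself (and its embedded singular element) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 100_000
# hypothetical random inputs
sigma = rng.normal(200.0, 20.0, n)              # remote stress, MPa
a = rng.lognormal(np.log(0.02), 0.3, n)         # half crack length, m
K_Ic = rng.normal(60.0, 5.0, n)                 # fracture toughness, MPa*sqrt(m)

# stress intensity factor for a center crack in an infinite plate
K = sigma * np.sqrt(np.pi * a)

# fracture occurs when the stress intensity exceeds the toughness
p_fracture = float(np.mean(K > K_Ic))
```

The PFEM reaches the same quantity far more efficiently by propagating means and covariances through the finite element model instead of brute-force sampling.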

  16. Probabilistic design of aluminum sheet drawing for reduced risk of wrinkling and fracture

    International Nuclear Information System (INIS)

    Zhang Wenfeng; Shivpuri, Rajiv

    2009-01-01

    Often, sheet drawing processes are designed to provide the geometry of the final part, and then the process parameters such as blank dimensions, blank holder forces (BHFs), press strokes, and interface friction are designed and controlled to provide the greatest drawability (largest depth of draw without violating the wrinkling and thinning constraints). The exclusion of inherent process variations in this design can often lead to process designs that are unreliable and uncontrollable. In this paper, a general multi-criteria design approach is presented to quantify the uncertainties and to incorporate them into the response surface method (RSM) based model so as to conduct probabilistic optimization. A surrogate RSM model of the process mechanics is generated using FEM-based high-fidelity models and design of experiments (DOEs), and a simple linear weighted approach is used to formulate the objective function or the quality index (QI). To demonstrate this approach, deep drawing of an aluminum Hishida part is analyzed. With the predetermined blank shape, tooling design, and fixed drawing depth, a probabilistic design (PD) is successfully carried out to find the optimal combination of BHF and friction coefficient under variation of material properties. The results show that with the probabilistic approach, the QI improved by 42% over the traditional deterministic design (DD). They also show that by further reducing the variation of the friction coefficient to 2%, the QI improves further to 98.97%.

  17. Uncertainty analysis in the applications of nuclear probabilistic risk assessment

    International Nuclear Information System (INIS)

    Le Duy, T.D.

    2011-01-01

    The aim of this thesis is to propose an approach to model the parameter and model uncertainties affecting the results of risk indicators used in applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to representing uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step aims to model input parameter uncertainties by belief and plausibility functions according to the data of the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to lay out the uncertainties associated with the output risk indicators. Model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with the information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that have significant uncertainty contributions to the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
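The belief and plausibility functions of the first step can be sketched directly from Dempster-Shafer theory: given a mass assignment over focal sets, belief sums the masses of subsets of an event, and plausibility sums the masses of sets that intersect it. The mass assignment over failure-rate bins below is hypothetical.

```python
def belief_plausibility(masses, event):
    """Belief and plausibility of `event` given a mass assignment
    over focal sets (Dempster-Shafer theory)."""
    bel = sum(m for fs, m in masses.items() if set(fs) <= event)
    pl = sum(m for fs, m in masses.items() if set(fs) & event)
    return bel, pl

# hypothetical mass assignment over failure-rate bins 'low'/'mid'/'high';
# partial ignorance is expressed by mass on non-singleton sets
masses = {('low',): 0.5,
          ('low', 'mid'): 0.3,
          ('low', 'mid', 'high'): 0.2}

bel, pl = belief_plausibility(masses, {'low'})
# [bel, pl] brackets the unknown probability of the 'low' bin
```

The interval [bel, pl] is exactly the kind of bound that the thesis propagates through the risk model instead of a single probability.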

  18. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof ...

  19. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Directory of Open Access Journals (Sweden)

    Tvrdá Katarína

    2017-01-01

    Full Text Available Interaction between a foundation plate and the subgrade can be described by different mathematical-physical models. An elastic foundation can be modelled by different types of models, e.g. a one-parametric model, a two-parametric model, or a comprehensive model; here the Boussinesq model (elastic half-space) had been used. The article deals with deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-to-node contact elements. At the end the obtained results are presented.

  20. A Global Public Goods Approach to the Health of Migrants.

    Science.gov (United States)

    Widdows, Heather; Marway, Herjeet

    2015-07-01

    This paper explores a global public goods approach to the health of migrants. It suggests that this approach establishes that there are a number of health goods which must be provided to migrants not because these are theirs by right (although this may independently be the case), but because these goods are primary goods which fit the threefold criteria of global public goods. There are two key advantages to this approach: first, it is non-confrontational and non-oppositional, and second, it provides self-interested arguments to provide at least some health goods to migrants and thus appeals to those little moved by rights-based arguments.

  1. Probabilistic encoding of stimulus strength in astrocyte global calcium signals.

    Science.gov (United States)

    Croft, Wayne; Reusch, Katharina; Tilunaite, Agne; Russell, Noah A; Thul, Rüdiger; Bellamy, Tomas C

    2016-04-01

    Astrocyte calcium signals can range in size from subcellular microdomains to waves that spread through the whole cell (and into connected cells). The differential roles of such local or global calcium signaling are under intense investigation, but the mechanisms by which local signals evolve into global signals in astrocytes are not well understood, nor are the computational rules by which physiological stimuli are transduced into a global signal. To investigate these questions, we transiently applied receptor agonists linked to calcium signaling to primary cultures of cerebellar astrocytes. Astrocytes repetitively tested with the same stimulus responded with global signals intermittently, indicating that each stimulus had a defined probability for triggering a response. The response probability varied between agonists, increased with agonist concentration, and could be positively and negatively modulated by crosstalk with other signaling pathways. To better understand the processes determining the evolution of a global signal, we recorded subcellular calcium "puffs" throughout the whole cell during stimulation. The key requirement for puffs to trigger a global calcium wave following receptor activation appeared to be the synchronous release of calcium from three or more sites, rather than an increasing calcium load accumulating in the cytosol due to increased puff size, amplitude, or frequency. These results suggest that the concentration of transient stimuli will be encoded into a probability of generating a global calcium response, determined by the likelihood of synchronous release from multiple subcellular sites. © 2015 Wiley Periodicals, Inc.

  2. A probabilistic EAC management of Ni-base Alloy in PWR

    International Nuclear Information System (INIS)

    Lee, Tae Hyun; Hwang, Il Soon

    2009-01-01

    Material aging is a principal cause of the aging of engineering systems, which can lead to reduction in reliability and continued safety and to increases in the cost of operation and maintenance. As nuclear power plants get older, aging becomes an issue, because aging degradation can affect the structural integrity of systems and components. To ensure the safe operation of nuclear power plants, it is essential to assess the effects of age-related degradation of plant structures, systems, and components. In this study, we propose a framework for probabilistic assessment of primary pressure-boundary components, with particular attention to Environmentally Assisted Cracking (EAC) of piping and nozzles in Nuclear Power Plants (NPPs). The framework for EAC management targets degradation prediction using mechanistic models with probabilistic treatment, together with probabilistic assessment of defect detection and sizing. For the EAC-induced failure process, the effect of uncertainties in key parameters of the EAC growth model, final fracture, and inspection has also been examined, based on a sensitivity study and updating using a Bayesian inference approach. (author)

  3. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provide ...

  4. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Science.gov (United States)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
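    The first two components above, local factors multiplied into a joint distribution and conditioning on observations by the basic rules of probability, can be sketched for a toy two-variable model; the variables (rain, wet ground) and all numbers are hypothetical:

```python
# Hedged sketch: a two-node probabilistic graph. Local factors P(R) and
# P(W|R) multiply into the joint over (R, W); conditioning on W=True and
# renormalizing gives the posterior over R. All probabilities are made up.
p_r = {True: 0.2, False: 0.8}                      # prior factor P(R)
p_w_given_r = {(True, True): 0.9, (False, True): 0.1,
               (True, False): 0.2, (False, False): 0.8}  # keys: (w, r)

# Joint distribution = product of the local factors.
joint = {(r, w): p_r[r] * p_w_given_r[(w, r)]
         for r in (True, False) for w in (True, False)}

# Posterior P(R | W=True): keep consistent entries, renormalize.
evidence = {k: v for k, v in joint.items() if k[1] is True}
z = sum(evidence.values())
posterior = {r: v / z for (r, w), v in evidence.items()}
print(posterior[True])  # 0.18 / 0.34, about 0.529
```

In a large hydrological model the same two operations are what the graph-based algorithms distribute and approximate efficiently.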

  5. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  6. Multiaxial probabilistic elastic-plastic constitutive simulations of soils

    Science.gov (United States)

    Sadrinezhad, Arezoo

    The Fokker-Planck-Kolmogorov (FPK) equation approach has recently been developed to simulate elastic-plastic constitutive behaviors of materials with uncertain material properties. The FPK equation approach transforms the stochastic constitutive rate equation, which is a stochastic, nonlinear, ordinary differential equation (ODE) in stress-pseudo time space, into a second-order accurate, deterministic, linear FPK partial differential equation (PDE) in the probability density of stress-pseudo time space. This approach does not suffer from the drawbacks of traditional approaches, such as the Monte Carlo approach and the perturbation approach, for solving nonlinear ODEs with random coefficients. In this study, the existing one-dimensional FPK framework for probabilistic constitutive modeling of soils is extended to multiple dimensions. However, the multivariate FPK PDEs cannot be solved using traditional mathematical techniques such as finite difference techniques due to their high computational cost. Therefore, computationally efficient algorithms based on the Fourier spectral approach are developed for solving a class of FPK PDEs that arises in probabilistic elasto-plasticity. This class includes linear FPK PDEs in (stress) space and (pseudo) time - having space-independent but time-dependent, and both space- and time-dependent coefficients - with impulse initial conditions and reflecting boundary conditions. The solution algorithms rely on first mapping the stress space of the governing PDE between 0 and 2pi using the change of coordinates rule, followed by approximating the solution of the PDE in the 2pi-periodic domain by a finite Fourier series in the stress space with unknown time-dependent solution coefficients. Finally, the time-dependent solution coefficients are obtained from the initial condition. The accuracy and efficiency of the developed algorithms are tested. The developed algorithms are used to simulate uniaxial and multiaxial, monotonic and cyclic ...
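    The spectral strategy described above, evolving each Fourier mode independently on a 2pi-periodic domain, can be sketched for a constant-coefficient FPK equation dP/dt = -A dP/ds + B d2P/ds2; the drift A, diffusion B, and the narrow Gaussian standing in for the impulse initial condition are illustrative choices, not values from the study:

```python
import numpy as np

# Hedged sketch: Fourier spectral solution of a constant-coefficient
# Fokker-Planck equation on the mapped 2*pi-periodic stress domain.
n = 256
s = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
dx = s[1] - s[0]
k = np.fft.fftfreq(n, d=dx) * 2 * np.pi   # angular wavenumbers (integers here)
A, B, t = 0.5, 0.05, 1.0                  # assumed drift, diffusion, pseudo-time

# Narrow Gaussian standing in for the impulse (delta) initial condition.
P0 = np.exp(-((s - np.pi) ** 2) / (2 * 0.05 ** 2))
P0 /= P0.sum() * dx                       # normalize to unit probability

# Each mode evolves independently: Phat_k(t) = Phat_k(0)*exp((-i*k*A - B*k^2)*t)
Pt = np.fft.ifft(np.fft.fft(P0) * np.exp((-1j * k * A - B * k ** 2) * t)).real
print(Pt.sum() * dx)  # total probability is conserved (~1.0)
```

The k = 0 mode is untouched by the evolution factor, so probability is conserved exactly; the density drifts by A*t and spreads by the diffusion term.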

  7. The Probabilistic Convolution Tree: Efficient Exact Bayesian Inference for Faster LC-MS/MS Protein Inference

    Science.gov (United States)

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called “causal independence”). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustrative example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we substantially reduce both the runtime and the space required, as functions of the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions. PMID:24626234
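    The core idea, computing the exact distribution of a sum of independent count variables by convolving their distributions pairwise in a balanced tree, can be sketched as follows; the indicator probabilities are made up:

```python
import numpy as np

# Hedged sketch of the convolution-tree idea: each leaf is the distribution
# of one 0/1 indicator; convolving distributions gives the distribution of
# their sum, and pairing leaves keeps intermediate vectors short.
def convolution_tree(dists):
    layer = [np.asarray(d, dtype=float) for d in dists]
    while len(layer) > 1:
        nxt = []
        for i in range(0, len(layer) - 1, 2):
            nxt.append(np.convolve(layer[i], layer[i + 1]))
        if len(layer) % 2:          # odd element passes up unchanged
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

# Four independent indicators, each present with probability 0.5:
total = convolution_tree([[0.5, 0.5]] * 4)
print(total)  # binomial(4, 0.5) pmf: [0.0625 0.25 0.375 0.25 0.0625]
```

With n leaves the tree has only log2(n) levels, which is where the runtime and space savings over a naive chain of adders come from.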

  8. Several comparison result of two types of equilibrium (Pareto Schemes and Stackelberg Scheme) of game theory approach in probabilistic vendor – buyer supply chain system with imperfect quality

    Science.gov (United States)

    Setiawan, R.

    2018-05-01

    In this paper, the Economic Order Quantity (EOQ) of a vendor-buyer supply-chain model under probabilistic conditions with imperfect quality items is analysed. The analysis uses two equilibrium concepts from the game theory approach, Stackelberg equilibrium and Pareto optimality, under non-cooperative and cooperative games, respectively. A further result is a comparison of the optimal results between the integrated scheme and the game theory approach, based on analytical and numerical results using appropriate simulation data.

  9. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objectives of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to the journey of exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter Mission project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects to formulate probabilistic risk assessment (PRA). As required, a limited scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success in an efficient manner, at low cost and on a tight schedule, the traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform PRA for the large system of LRO with more than 14,000 piece parts and over 120 purchased or contractor ...

  10. Solving Unconstrained Global Optimization Problems via Hybrid Swarm Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Jui-Yu Wu

    2013-01-01

    Full Text Available Stochastic global optimization (SGO algorithms such as the particle swarm optimization (PSO approach have become popular for solving unconstrained global optimization (UGO problems. The PSO approach, which belongs to the swarm intelligence domain, does not require gradient information, enabling it to overcome this limitation of traditional nonlinear programming methods. Unfortunately, PSO algorithm implementation and performance depend on several parameters, such as cognitive parameter, social parameter, and constriction coefficient. These parameters are tuned by using trial and error. To reduce the parametrization of a PSO method, this work presents two efficient hybrid SGO approaches, namely, a real-coded genetic algorithm-based PSO (RGA-PSO method and an artificial immune algorithm-based PSO (AIA-PSO method. The specific parameters of the internal PSO algorithm are optimized using the external RGA and AIA approaches, and then the internal PSO algorithm is applied to solve UGO problems. The performances of the proposed RGA-PSO and AIA-PSO algorithms are then evaluated using a set of benchmark UGO problems. Numerical results indicate that, besides their ability to converge to a global minimum for each test UGO problem, the proposed RGA-PSO and AIA-PSO algorithms outperform many hybrid SGO algorithms. Thus, the RGA-PSO and AIA-PSO approaches can be considered alternative SGO approaches for solving standard-dimensional UGO problems.
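    A plain constriction-type PSO of the kind whose parameters the RGA/AIA wrappers would otherwise tune can be sketched as follows; the parameter values, the search bounds, and the sphere test function are standard illustrations, not the paper's settings:

```python
import numpy as np

# Hedged sketch: particle swarm optimization with the constriction
# coefficient, minimizing the sphere function as a stand-in UGO problem.
def pso(f, dim, n_particles=30, iters=200, c1=2.05, c2=2.05, seed=0):
    rng = np.random.default_rng(seed)
    chi = 0.7298  # constriction coefficient for c1 + c2 = 4.1
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = chi * (v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x))
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

best_x, best_val = pso(lambda z: float(np.sum(z ** 2)), dim=5)
print(best_val)  # close to the global minimum 0
```

The cognitive parameter c1, social parameter c2, and constriction coefficient chi here are the hand-tuned quantities that the hybrid RGA-PSO and AIA-PSO schemes optimize externally instead.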

  11. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    Energy Technology Data Exchange (ETDEWEB)

    Soltanzadeh, I. [Tehran Univ. (Iran, Islamic Republic of). Inst. of Geophysics; Azadi, M.; Vakili, G.A. [Atmospheric Science and Meteorological Research Center (ASMERC), Teheran (Iran, Islamic Republic of)

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) and for HRM the initial and boundary conditions come from analysis of Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using BMA technique for 120 days using a 40 days training sample of forecasts and relative verification data. The calibrated probabilistic forecasts were assessed using rank histogram and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast it was found that the deterministic-style BMA forecasts performed usually better than the best member's deterministic forecast. (orig.)

  12. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    Directory of Open Access Journals (Sweden)

    I. Soltanzadeh

    2011-07-01

    Full Text Available Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) and for HRM the initial and boundary conditions come from analysis of Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days using a 40-day training sample of forecasts and relative verification data. The calibrated probabilistic forecasts were assessed using rank histogram and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
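    The weight-estimation step of Gaussian BMA can be sketched with a small EM loop on synthetic data; the member biases, the fixed predictive spread, and the data themselves are assumptions for illustration, not the Iran ensemble:

```python
import numpy as np

# Hedged sketch: EM estimation of BMA member weights for a Gaussian
# ensemble. A fixed common spread sigma is assumed for simplicity.
def bma_weights(forecasts, obs, sigma=1.0, iters=100):
    n_members = forecasts.shape[1]
    w = np.full(n_members, 1.0 / n_members)
    for _ in range(iters):
        # E-step: responsibility of each member for each observation
        lik = np.exp(-0.5 * ((obs[:, None] - forecasts) / sigma) ** 2)
        z = w * lik
        z /= z.sum(axis=1, keepdims=True)
        # M-step: weights are the mean responsibilities
        w = z.mean(axis=0)
    return w

rng = np.random.default_rng(1)
truth = rng.normal(20, 5, 500)                        # "observed" temperatures
members = np.stack([truth + rng.normal(0, 1, 500),    # unbiased member
                    truth + rng.normal(5, 3, 500)],   # warm-biased member
                   axis=1)
w = bma_weights(members, truth)
print(w)  # weights sum to 1; the unbiased member dominates
```

In full BMA the per-member variances are estimated in the M-step as well; holding sigma fixed keeps the sketch to the weight-calibration idea.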

  13. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document, and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. ...

  14. Adjoint-based global variance reduction approach for reactor analysis problems

    International Nuclear Information System (INIS)

    Zhang, Qiong; Abdel-Khalik, Hany S.

    2011-01-01

    A new variant of a hybrid Monte Carlo-Deterministic approach for simulating particle transport problems is presented and compared to the SCALE FW-CADIS approach. The new approach, denoted by the Subspace approach, optimizes the selection of the weight windows for reactor analysis problems where detailed properties of all fuel assemblies are required everywhere in the reactor core. Like the FW-CADIS approach, the Subspace approach utilizes importance maps obtained from deterministic adjoint models to derive automatic weight-window biasing. In contrast to FW-CADIS, the Subspace approach identifies the correlations between weight window maps to minimize the computational time required for global variance reduction, i.e., when the solution is required everywhere in the phase space. The correlations are employed to reduce the number of maps required to achieve the same level of variance reduction that would be obtained with single-response maps. Numerical experiments, serving as proof of principle, are presented to compare the Subspace and FW-CADIS approaches in terms of the global reduction in standard deviation. (author)

  15. Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments

    Science.gov (United States)

    Berk, Mario; Špačková, Olga; Straub, Daniel

    2017-12-01

    The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
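    The FORM step at the heart of the method can be illustrated with the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration on a linear limit state in standard normal space; the limit-state coefficients are hypothetical:

```python
import numpy as np

# Hedged sketch: HL-RF iteration of FORM for an illustrative linear limit
# state g(u) = b - a1*u1 - a2*u2 in standard normal space (failure: g < 0).
def form_beta(grad, g, u0, iters=50):
    u = np.asarray(u0, dtype=float)
    for _ in range(iters):
        a = grad(u)
        # HL-RF update: project onto the linearized limit-state surface
        u = (np.dot(a, u) - g(u)) * a / np.dot(a, a)
    return np.linalg.norm(u)  # reliability index beta = distance to origin

a_coef = np.array([1.0, 2.0])
b = 3.0
g = lambda u: b - a_coef @ u
grad = lambda u: -a_coef
beta = form_beta(grad, g, u0=[0.0, 0.0])
print(beta)  # for a linear limit state, beta = b / ||a|| = 3 / sqrt(5)
```

The converged point is the design point (most probable failure combination), which is exactly what the design charts of representative storm events summarize for each return period.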

  16. Suggestions for an improved HRA method for use in Probabilistic Safety Assessment

    International Nuclear Information System (INIS)

    Parry, Gareth W.

    1995-01-01

    This paper discusses why an improved Human Reliability Analysis (HRA) approach for use in Probabilistic Safety Assessments (PSAs) is needed, and proposes a set of requirements on the improved HRA method. The constraints imposed by the need to embed the approach into the PSA methodology are discussed. One approach to laying the foundation for an improved method, using models from the cognitive psychology and behavioral science disciplines, is outlined

  17. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  18. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I. [Hanyang University, Seoul (Korea, Republic of); Lee, J. S.; Lee, D. S.; Kwon, J. S. [Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, J. J. [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2003-06-01

    The probabilistic anatomical maps are used to localize functional neuro-images and morphological variability. A quantitative indicator is very important for inquiring into the anatomical position of an activated region, because functional image data is low-resolution in nature and carries no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize the data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental brains. In this study, we develop a probabilistic anatomical map for the normal Korean brain. Seventy-five normal brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GE SIGNA scanner. Then, a standard brain is selected from the group by having a clinician search for a brain with average properties in the Talairach coordinate system. With the standard brain, an anatomist delineates 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard are warped onto and overlapped with each brain by maximizing intensity similarity, and every brain is automatically labeled with the registered ROIs. Each same-labeled region is linearly normalized to the standard brain, and the occurrence of each region is counted. Finally, 89 probabilistic ROI volumes are generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of the normal Korean brain. In the future, we'll develop group-specific probabilistic anatomical maps for OCD and schizophrenia.

  19. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    International Nuclear Information System (INIS)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I.; Lee, J. S.; Lee, D. S.; Kwon, J. S.; Kim, J. J.

    2003-01-01

    The probabilistic anatomical maps are used to localize functional neuro-images and morphological variability. A quantitative indicator is very important for inquiring into the anatomical position of an activated region, because functional image data is low-resolution in nature and carries no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize the data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental brains. In this study, we develop a probabilistic anatomical map for the normal Korean brain. Seventy-five normal brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GE SIGNA scanner. Then, a standard brain is selected from the group by having a clinician search for a brain with average properties in the Talairach coordinate system. With the standard brain, an anatomist delineates 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard are warped onto and overlapped with each brain by maximizing intensity similarity, and every brain is automatically labeled with the registered ROIs. Each same-labeled region is linearly normalized to the standard brain, and the occurrence of each region is counted. Finally, 89 probabilistic ROI volumes are generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of the normal Korean brain. In the future, we'll develop group-specific probabilistic anatomical maps for OCD and schizophrenia.

  20. Hazardous waste transportation risk assessment: Benefits of a combined deterministic and probabilistic Monte Carlo approach in expressing risk uncertainty

    International Nuclear Information System (INIS)

    Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.

    1995-01-01

    This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on and quantify uncertainties in deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992, which contained poison by inhalation chemicals that represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probabilistic distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions
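    The combined deterministic/probabilistic idea can be miniaturized as follows; every rate, distribution, and coefficient below is illustrative, not a DOE shipment value:

```python
import numpy as np

# Hedged sketch: a deterministic point estimate of transportation risk next
# to a Monte Carlo run that samples release amount and dispersion, yielding
# a distribution of consequences instead of a single number.
rng = np.random.default_rng(42)
n = 100_000
accident_rate = 1e-6          # assumed accidents per shipment-mile
miles = 800.0                 # assumed route length
people_per_ton = 50.0         # assumed exposure-to-consequence factor

# Probabilistic inputs: release amount (tons) and a meteorology factor.
release = rng.lognormal(mean=0.0, sigma=1.0, size=n)
dispersion = rng.uniform(0.5, 2.0, size=n)

consequences = release * dispersion * people_per_ton   # people affected, given an accident
risk = accident_rate * miles * consequences            # expected per shipment

# Deterministic point estimate: median release (1 ton), mean dispersion (1.25).
det = accident_rate * miles * 1.0 * 1.25 * people_per_ton
print(det, np.mean(risk), np.percentile(risk, 95))
```

The Monte Carlo mean exceeds the point estimate here because the skewed release distribution has a mean above its median, which is the kind of perspective on deterministic predictions the abstract describes.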

  1. Probabilistic assessments of climate change impacts on durum wheat in the Mediterranean region

    Directory of Open Access Journals (Sweden)

    R. Ferrise

    2011-05-01

    Full Text Available Recently, the availability of multi-model ensemble prediction methods has permitted a shift from a scenario-based approach to a risk-based approach in assessing the effects of climate change. This provides more useful information to decision-makers who need probability estimates to assess the seriousness of the projected impacts.

    In this study, a probabilistic framework for evaluating the risk of durum wheat yield shortfall over the Mediterranean Basin has been exploited. An artificial neural network, trained to emulate the outputs of a process-based crop growth model, has been adopted to create yield response surfaces which are then overlaid with probabilistic projections of future temperature and precipitation changes in order to estimate probabilistic projections of future yields. The risk is calculated as the relative frequency of projected yields below a selected threshold.

    In contrast to previous studies, which suggest that the beneficial effects of elevated atmospheric CO2 concentration over the next few decades would outweigh the detrimental effects of the early stages of climatic warming and drying, the results of this study are of greater concern.

  2. Incorporating organizational factors into probabilistic safety assessment of nuclear power plants through canonical probabilistic models

    Energy Technology Data Exchange (ETDEWEB)

    Galan, S.F. [Dpto. de Inteligencia Artificial, E.T.S.I. Informatica (UNED), Juan del Rosal, 16, 28040 Madrid (Spain)]. E-mail: seve@dia.uned.es; Mosleh, A. [2100A Marie Mount Hall, Materials and Nuclear Engineering Department, University of Maryland, College Park, MD 20742 (United States)]. E-mail: mosleh@umd.edu; Izquierdo, J.M. [Area de Modelado y Simulacion, Consejo de Seguridad Nuclear, Justo Dorado, 11, 28040 Madrid (Spain)]. E-mail: jmir@csn.es

    2007-08-15

    The ω-factor approach is a method that explicitly incorporates organizational factors into probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Event Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as a model of multicausal interaction between each BN node and its parents.
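    The noisy-OR interaction mentioned in this abstract can be illustrated with a minimal sketch. The parent probabilities below are invented for illustration, not taken from the paper:

```python
# Noisy-OR gate: P(child | parents) = 1 - (1 - leak) * prod over active
# parents i of (1 - p_i), where p_i is the probability that parent i alone
# activates the child. All parameter values here are illustrative.

def noisy_or(p_individual, active, leak=0.0):
    """Probability that the child variable is true, given which parents are active."""
    q = 1.0 - leak
    for p_i, a in zip(p_individual, active):
        if a:
            q *= 1.0 - p_i
    return 1.0 - q

# Three hypothetical organizational factors influencing one BN node:
p = [0.3, 0.5, 0.2]
print(noisy_or(p, [True, True, False]))    # mathematically 1 - 0.7*0.5 = 0.65
print(noisy_or(p, [False, False, False]))  # 0.0 when there is no leak term
```

    The appeal for parameter acquisition is that a noisy-OR node with k parents needs only k (plus an optional leak) probabilities instead of a full table of 2^k entries.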

  3. Incorporating organizational factors into probabilistic safety assessment of nuclear power plants through canonical probabilistic models

    International Nuclear Information System (INIS)

    Galan, S.F.; Mosleh, A.; Izquierdo, J.M.

    2007-01-01

    The ω-factor approach is a method that explicitly incorporates organizational factors into probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Event Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as a model of multicausal interaction between each BN node and its parents.

  4. Probabilistic Methods for the Quantification of Uncertainty and Error in Computational Fluid Dynamic Simulations

    National Research Council Canada - National Science Library

    Faragher, John

    2004-01-01

    ... conservatism to allow for them. This report examines the feasibility of using a probabilistic approach for modelling the component temperatures in an engine using CFD (Computational Fluid Dynamics).

  5. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  6. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
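    The probability computation described here, a product over the axiom occurrences used in a derivation, can be sketched in a few lines. Axiom names and probabilities below are invented for illustration:

```python
from math import prod

# Toy sketch: in a probabilistic sticker system, each axiom carries a
# probability, and the probability of a derived string is the product of the
# probabilities of every axiom occurrence used in its derivation.
# Axiom names and values are illustrative only.

axiom_prob = {"A1": 0.6, "A2": 0.4}

def derivation_probability(axiom_occurrences):
    """Multiply the probabilities of all axiom occurrences in a derivation."""
    return prod(axiom_prob[a] for a in axiom_occurrences)

# a derivation that uses axiom A1 twice and A2 once:
print(derivation_probability(["A1", "A1", "A2"]))  # mathematically 0.6*0.6*0.4 = 0.144
```

    Strings whose derivation probability falls below a chosen cut-off can then be excluded from the generated language, which is the kind of probabilistic requirement the abstract refers to.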

  7. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    Directory of Open Access Journals (Sweden)

    S. Raia

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. 
The outputs
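    The cell-level idea, sampling uncertain soil properties and recomputing the factor of safety, can be sketched as follows. This is a minimal infinite-slope illustration with invented parameter values and distributions, not the TRIGRS-P implementation:

```python
import math
import random

# Minimal sketch (not TRIGRS-P itself): sample soil cohesion and friction
# angle from assumed normal distributions and estimate the probability of
# failure (factor of safety FS < 1) for one grid cell using a simplified
# infinite-slope model without a pore-pressure term. All values illustrative.

random.seed(42)

slope = math.radians(35.0)   # slope angle [rad]
depth = 2.0                  # failure-surface depth z [m]
gamma = 19.0                 # soil unit weight [kN/m^3]

def factor_of_safety(cohesion, phi_deg):
    """FS = tan(phi)/tan(beta) + c / (gamma * z * sin(beta) * cos(beta))."""
    phi = math.radians(phi_deg)
    return (math.tan(phi) / math.tan(slope)
            + cohesion / (gamma * depth * math.sin(slope) * math.cos(slope)))

n = 10_000
failures = 0
for _ in range(n):
    c = random.gauss(5.0, 1.5)     # cohesion [kPa], assumed distribution
    phi = random.gauss(30.0, 3.0)  # friction angle [deg], assumed distribution
    if factor_of_safety(c, phi) < 1.0:
        failures += 1

print(f"estimated probability of failure: {failures / n:.3f}")
```

    Repeating this per grid cell, and adding the transient pore-pressure term from the infiltration solution, gives the map of failure probabilities that TRIGRS-P produces.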

  8. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    Science.gov (United States)

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  9. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  10. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
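    The response surface method mentioned here can be sketched generically: fit a cheap polynomial surrogate to a small grid of expensive deterministic runs, then do Monte Carlo on the surrogate. The "expensive model" below is an analytic stand-in for an FEA stress evaluation, and all numbers are invented:

```python
import numpy as np

# Response-surface sketch (not PRODAF itself): quadratic surrogate fitted to
# a 3x3 grid of deterministic runs, then cheap Monte Carlo on the surrogate.

rng = np.random.default_rng(0)

def expensive_model(thickness, load):
    # stand-in for a deterministic FEA run returning a stress measure
    return (50 + 20 * load - 30 * thickness + 5 * thickness * load
            + 4 * thickness**2 + 2 * load**2)

# small design grid (3 thickness levels x 3 load levels)
t_grid, p_grid = np.meshgrid([0.8, 1.0, 1.2], [1.0, 1.5, 2.0])
t, p = t_grid.ravel(), p_grid.ravel()
y = expensive_model(t, p)

def basis(thickness, load):
    # full quadratic basis in (thickness, load)
    return np.column_stack([np.ones_like(thickness), thickness, load,
                            thickness * load, thickness**2, load**2])

coef, *_ = np.linalg.lstsq(basis(t, p), y, rcond=None)

def surrogate(thickness, load):
    return basis(thickness, load) @ coef

# cheap Monte Carlo on the surrogate instead of thousands of FEA runs
ts = rng.normal(1.0, 0.05, 50_000)   # thickness variation (assumed Gaussian)
ps = rng.normal(1.5, 0.10, 50_000)   # load variation (assumed Gaussian)
stress = surrogate(ts, ps)
print("P(stress > 70) ~", np.mean(stress > 70))
```

    In a real workflow each grid evaluation is a full FEA run, so the 9 deterministic runs replace the 50,000 that direct Monte Carlo would require.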

  11. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  12. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  13. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p cranial nerves. 
Probabilistic tracking with a gradual

  14. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...

  15. Children as Global Citizens: A Socratic Approach to Teaching Character

    Science.gov (United States)

    Helterbran, Valeri R.; Strahler, Brianna R.

    2013-01-01

    Educators around the world are being challenged to promote positive global citizenship skills in the face of daily news concerning widespread discord, dissonance, injustice, and corruption. This article describes a Socratic approach to developing global citizenship. Recognizing the central role of teachers in educating future generations of a…

  16. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  17. Bayesian Hierarchical Structure for Quantifying Population Variability to Inform Probabilistic Health Risk Assessments.

    Science.gov (United States)

    Shao, Kan; Allen, Bruce C; Wheeler, Matthew W

    2017-10-01

    Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability can hardly support probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that using the proposed hierarchical structure adequately characterizes variability across different populations. © 2016 Society for Risk Analysis.

  18. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  19. Improved detection of chemical substances from colorimetric sensor data using probabilistic machine learning

    DEFF Research Database (Denmark)

    Mølgaard, Lasse Lohilahti; Buus, Ole Thomsen; Larsen, Jan

    2017-01-01

    We present a data-driven machine learning approach to detect drug- and explosives-precursors using colorimetric sensor technology for air-sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present a fully-integrated portable prototype for air sampling... of the highly multi-variate data produced from the colorimetric chip a number of machine learning techniques are employed to provide reliable classification of target analytes from confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction... in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate of analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions...

  20. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...
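    The two inference patterns named in the abstract can be made concrete in a few lines; all conditional probabilities below are invented for illustration:

```python
# Toy illustration of direct inference and Jeffrey's rule; numbers invented.

# Direct inference: from "70% of As are Bs" and "a is an A",
# infer P(B(a)) = 0.7.
p_b_of_a = 0.7

# Jeffrey's rule: revise P(A) when belief in an evidence partition
# {B, not-B} shifts to new probabilities:
#   P'(A) = P(A|B) * P'(B) + P(A|not-B) * P'(not-B)
p_a_given_b = 0.9
p_a_given_not_b = 0.2
new_p_b = 0.6  # updated belief in B (the prior value is irrelevant here)

new_p_a = p_a_given_b * new_p_b + p_a_given_not_b * (1 - new_p_b)
print(new_p_a)  # mathematically 0.9*0.6 + 0.2*0.4 = 0.62
```

    Cross-entropy minimization generalizes this further: among all distributions consistent with the new constraints, it selects the one closest (in Kullback-Leibler divergence) to the prior, and reduces to Jeffrey's rule in this partition case.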

  1. A probabilistic scenario approach for developing improved Reduced Emissions from Deforestation and Degradation (REDD+) baselines

    Directory of Open Access Journals (Sweden)

    Malika Virah-Sawmy

    2015-07-01

    By generating robust probabilistic baseline scenarios, exponential smoothing models can facilitate the effectiveness of REDD+ payments, support a more efficient allocation of scarce conservation resources, and improve our understanding of effective forest conservation investments, also beyond REDD+.
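    The exponential smoothing models referred to above can be sketched in their simplest form; the annual deforestation series and smoothing constant below are invented for illustration:

```python
# Simple exponential smoothing: the smoothed level
#   l_t = alpha * x_t + (1 - alpha) * l_{t-1}
# is also the one-step-ahead forecast. Series and alpha are illustrative.

def ses_forecast(series, alpha):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

annual_loss = [1.20, 1.35, 1.10, 1.50, 1.42]  # hypothetical % forest loss/year
print(round(ses_forecast(annual_loss, alpha=0.5), 3))  # → 1.382
```

    A probabilistic baseline adds prediction intervals around this point forecast, for example from the empirical distribution of past one-step forecast errors, which is what makes the resulting REDD+ baseline scenario robust rather than a single deterministic number.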

  2. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  3. River Flow Prediction Using the Nearest Neighbor Probabilistic Ensemble Method

    Directory of Open Access Journals (Sweden)

    H. Sanikhani

    2016-02-01

    Introduction: In recent years, researchers have been interested in probabilistic forecasting of hydrologic variables such as river flow. A probabilistic approach aims at quantifying the prediction reliability through a probability distribution function or a prediction interval for the unknown future value. The evaluation of the uncertainty associated with the forecast is seen as fundamental information, not only to correctly assess the prediction, but also to compare forecasts from different methods and to evaluate actions and decisions conditionally on the expected values. Several probabilistic approaches have been proposed in the literature, including (1) methods that use resampling techniques to assess parameter and model uncertainty, such as the Metropolis algorithm or the Generalized Likelihood Uncertainty Estimation (GLUE) methodology for an application to runoff prediction, (2) methods based on processing the forecast errors of past data to produce the probability distributions of future values, and (3) methods that evaluate how the uncertainty propagates from the rainfall forecast to the river discharge prediction, such as the Bayesian forecasting system. Materials and Methods: In this study, two different probabilistic methods are used for river flow prediction, and the uncertainty related to the forecast is quantified. One approach is based on linear predictors; in the other, nearest neighbors are used. The nonlinear probabilistic ensemble can be used for nonlinear time series analysis using locally linear predictors, while the NNPE utilizes a method adapted for one-step-ahead nearest neighbor forecasting. In this regard, daily river discharge (twelve years) of the Dizaj and Mashin stations on the Baranduz-Chay basin in West Azerbaijan province and the Zard-River basin in Khouzestan province were used, respectively. The first six years of data were applied for fitting the model. The next three years were used for calibration and the remaining three years for testing the models.
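    The nearest-neighbour ensemble idea can be sketched generically: find the k past states most similar to the current one and use the distribution of their observed successors as the predictive ensemble. This is a generic analogue forecast on synthetic data, not the authors' exact NNPE implementation:

```python
import numpy as np

# k-nearest-neighbour probabilistic ensemble sketch (synthetic flow series).

rng = np.random.default_rng(1)
flow = 50 + 10 * np.sin(np.arange(300) / 10.0) + rng.normal(0, 1.0, 300)

def knn_ensemble_forecast(series, k=10, lag=2):
    """Successors of the k historical lag-vectors nearest to the current state."""
    # historical states series[i:i+lag], each followed by successor series[i+lag]
    states = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    successors = series[lag:]
    current = series[-lag:]  # most recent state (not itself in `states`)
    dist = np.linalg.norm(states - current, axis=1)
    nearest = np.argsort(dist)[:k]
    return successors[nearest]

ens = knn_ensemble_forecast(flow)
lo, hi = np.percentile(ens, [5, 95])
print(f"median forecast {np.median(ens):.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
```

    The spread of the ensemble directly provides the prediction interval that the abstract describes as the key output of a probabilistic forecast.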

  4. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    Science.gov (United States)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics of a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking into account the probabilistic seismic hazard for Tehran city as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return period error-weight is also assessed. The methodology could enhance the run time of full probabilistic earthquake studies like seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
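    The hazard-curve error that the mixed-integer program minimizes can be sketched on a synthetic catalogue. This only illustrates the matching objective, not the optimization itself, and all catalogue parameters are invented:

```python
import numpy as np

# Sketch of the scenario-reduction objective: compare the exceedance curve of
# a full synthetic catalogue against that of a re-weighted reduced subset.

rng = np.random.default_rng(0)

# synthetic catalogue: each scenario has an annual rate and a peak ground
# acceleration (PGA) at the site of interest (values illustrative)
n_full = 2000
rates = np.full(n_full, 1.0 / n_full)
pga = rng.lognormal(mean=-2.0, sigma=0.8, size=n_full)

levels = np.array([0.05, 0.1, 0.2, 0.4])  # ground-motion levels [g]

def hazard_curve(rates, pga):
    """Annual exceedance frequency at each ground-motion level."""
    return np.array([rates[pga > lv].sum() for lv in levels])

full_curve = hazard_curve(rates, pga)

# candidate reduced set: a random subsample with rates re-scaled so the total
# annual rate is preserved (an optimizer would pick the subset and weights)
idx = rng.choice(n_full, size=100, replace=False)
red_rates = np.full(100, rates.sum() / 100)
reduced_curve = hazard_curve(red_rates, pga[idx])

error = np.abs(full_curve - reduced_curve).sum()
print("full curve:   ", full_curve)
print("reduced curve:", reduced_curve)
print("L1 curve error:", error)
```

    The mixed-integer program in the paper searches over subset membership and weights to drive exactly this kind of curve error to a minimum across sites and return periods.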

  5. PROBABILISTIC FINITE ELEMENT ANALYSIS OF A HEAVY DUTY RADIATOR UNDER INTERNAL PRESSURE LOADING

    Directory of Open Access Journals (Sweden)

    ROBIN ROY P.

    2017-09-01

    Engine cooling is vital in keeping the engine at its most efficient temperature for different vehicle speeds and operating road conditions. The radiator is one of the key components in the heavy-duty engine cooling system. A heavy-duty radiator is subjected to various kinds of loading such as pressure, thermal, vibration, internal erosion, external corrosion and creep. Pressure cycle durability is one of the most important characteristics in the design of a heavy-duty radiator. Current design methodologies involve design of the heavy-duty radiator using the nominal finite element approach, which does not take into account the variations occurring in the geometry, material and boundary conditions, leading to over-conservative and uneconomical designs of the radiator system. A new approach is presented in this paper to integrate the traditional linear finite element method and a probabilistic approach to design a heavy-duty radiator by including the uncertainty in the computational model. As a first step, a nominal run is performed with the input design variables and the desired responses are extracted. A probabilistic finite element analysis is then performed to identify robust designs and validate them for reliability. The probabilistic finite element analysis includes the uncertainty of the material thickness and the dimensional and geometrical variation. A Gaussian distribution is employed to define the random variation and uncertainty. The Monte Carlo method is used to generate the random design points. Output response distributions of the random design points are post-processed using different statistical and probability techniques to find the robust design. The above approach of systematic virtual modelling and analysis of the data helps to find efficient and reliable robust designs.

  6. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    Science.gov (United States)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Sequentially, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
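    The core mechanism, a Hermite chaos expansion fitted by collocation, can be sketched in one dimension. The "model" below is an analytic stand-in for a hydrological simulator, and the truncation order is arbitrary:

```python
import numpy as np

# One-dimensional Hermite chaos expansion fitted by collocation; the model
# and all settings are illustrative, not from the paper.

rng = np.random.default_rng(0)

def model(xi):
    # stand-in for an expensive simulation driven by one uncertain parameter
    # expressed through a standard normal variable xi
    return np.exp(0.3 * xi) + 0.5 * xi

def hermite_basis(xi):
    # probabilists' Hermite polynomials He_0..He_3
    return np.column_stack([np.ones_like(xi), xi, xi**2 - 1, xi**3 - 3 * xi])

# collocation: evaluate the model at sampled points and fit PCE coefficients
xi_train = rng.normal(size=200)
coef, *_ = np.linalg.lstsq(hermite_basis(xi_train), model(xi_train), rcond=None)

# the He_0 coefficient approximates the output mean:
# E[exp(0.3*xi)] = exp(0.3**2 / 2) = exp(0.045) ≈ 1.046
print("PCE mean estimate:", coef[0])

# the fitted expansion is a cheap surrogate for uncertainty propagation
xi_new = rng.normal(size=100_000)
surrogate = hermite_basis(xi_new) @ coef
print("P(output > 2):", np.mean(surrogate > 2))
```

    Once the coefficients are known, statistics of the output follow almost for free, which is the computational advantage over MCMC that the abstract highlights.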

  7. Applications of nuclear safety probabilistic risk assessment to nuclear security for optimized risk mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Donnelly, S.K.; Harvey, S.B. [Amec Foster Wheeler, Toronto, Ontario (Canada)

    2016-06-15

    Critical infrastructure assets such as nuclear power generating stations are potential targets for malevolent acts. Probabilistic methodologies can be applied to evaluate the real-time security risk based upon intelligence and threat levels. By employing this approach, the application of security forces and other protective measures can be optimized. Existing probabilistic safety analysis (PSA) methodologies and tools employed in the nuclear industry can be adapted to security applications for this purpose. Existing PSA models can also be adapted and enhanced to consider total plant risk, due to nuclear safety risks as well as security risks. By creating a Probabilistic Security Model (PSM), safety and security practitioners can maximize the safety and security of the plant while minimizing the significant costs associated with security upgrades and security forces. (author)

  8. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities using a simplified hypothetical case study. We then combine the three lines of evidence and evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence including spatial and temporal as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
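
    A heavily simplified sketch of the evidence-combination step: if each line of evidence is summarized as a likelihood ratio and treated as independent (a naive-Bayes assumption the paper's full network does not require), the probability of adverse impact updates as below. The prior and the ratios are hypothetical.

```python
def update_impact_probability(prior, likelihood_ratios):
    """Combine independent lines of evidence via likelihood ratios
    (naive-Bayes sketch, not the paper's full Bayesian network)."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr  # each LR = P(evidence | impact) / P(evidence | no impact)
    return odds / (1.0 + odds)

# Hypothetical LRs for sediment chemistry, bioassay, and benthic diversity
p_impact = update_impact_probability(0.5, [4.0, 2.0, 0.8])
```

    A third line of evidence with LR < 1 (here 0.8) correctly pulls the posterior down, illustrating how additional information rapidly updates the probability of impacts.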

  9. Centralized Multi-Sensor Square Root Cubature Joint Probabilistic Data Association

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2017-11-01

    Full Text Available This paper focuses on the tracking problem of multiple targets with multiple sensors in a nonlinear cluttered environment. To avoid Jacobian matrix computation and scaling parameter adjustment, improve numerical stability, and acquire more accurate estimated results for centralized nonlinear tracking, a novel centralized multi-sensor square root cubature joint probabilistic data association algorithm (CMSCJPDA) is proposed. Firstly, the multi-sensor tracking problem is decomposed into several single-sensor multi-target tracking problems, which are sequentially processed during the estimation. Then, in each sensor, the assignment of its measurements to target tracks is accomplished on the basis of joint probabilistic data association (JPDA), and a weighted probability fusion method with a square root version of a cubature Kalman filter (SRCKF) is utilized to estimate the targets’ state. With the measurements in all sensors processed, CMSCJPDA is derived and the global estimated state is achieved. Experimental results show that CMSCJPDA is superior to the state-of-the-art algorithms in the aspects of tracking accuracy, numerical stability, and computational cost, which provides a new idea to solve multi-sensor tracking problems.
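
    The SRCKF at the core of the algorithm propagates a deterministic point set instead of computing Jacobians. A minimal sketch of the standard third-degree spherical-radial cubature point generation (textbook CKF material, not the full CMSCJPDA pipeline):

```python
import numpy as np

def cubature_points(mean, cov):
    """Third-degree spherical-radial cubature points used by a CKF:
    2n equally weighted points mean +/- sqrt(n) * S e_i, with S S^T = cov."""
    n = mean.size
    S = np.linalg.cholesky(cov)
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # shape (n, 2n)
    return mean[:, None] + S @ xi  # columns are the points, weights all 1/(2n)

# Illustrative 2-D state with an invented mean and covariance
pts = cubature_points(np.array([1.0, 2.0]),
                      np.array([[2.0, 0.3], [0.3, 1.0]]))
```

    By construction the point set reproduces the prescribed mean and covariance exactly, which is what makes the filter derivative-free.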

  10. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessment of accident sequences for nuclear power plants. In the framework of this program many methods were developed. As the interest in these studies was increasing and as adapted methods were developed, Electricite de France has undertaken a probabilistic safety assessment of a nuclear power plant [fr

  11. Probabilistic framework for product design optimization and risk management

    Science.gov (United States)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practices, but it is currently still the industry standard to use deterministic safety margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
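
    The recommended Monte Carlo load-resistance step can be sketched as follows; the normal distributions and their parameters are purely illustrative, not taken from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical normal load and resistance (units arbitrary, e.g. MPa)
load = rng.normal(300.0, 30.0, n)
resistance = rng.normal(400.0, 25.0, n)

# Failure occurs whenever the load exceeds the resistance
pf_mc = np.mean(load > resistance)

# Closed-form check for the normal-normal case:
# beta = (mu_R - mu_L) / sqrt(s_R^2 + s_L^2), Pf = Phi(-beta)
beta = (400.0 - 300.0) / sqrt(30.0**2 + 25.0**2)
pf_exact = 0.5 * (1.0 - erf(beta / sqrt(2.0)))
```

    For non-normal inputs or implicit limit states the analytic check disappears, but the sampling loop stays the same, which is why Monte Carlo on load-resistance models is attractive for practical engineering work.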

  12. Real-time probabilistic covariance tracking with efficient model update.

    Science.gov (United States)

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address the appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in a real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as the temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
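
    A minimal sketch of the two ingredients named above: the region covariance descriptor, and an affine-invariant Riemannian metric between two such descriptors. The feature choice is hypothetical; the tracker's ICTL machinery is not reproduced here.

```python
import numpy as np

def covariance_descriptor(features):
    """Region covariance descriptor: rows of `features` are per-pixel feature
    vectors (e.g. x, y, intensity, gradient magnitudes); output is d x d."""
    F = features - features.mean(axis=0)
    return F.T @ F / (features.shape[0] - 1)

def riemannian_distance(C1, C2):
    """Affine-invariant metric on SPD matrices:
    sqrt(sum_i ln^2 lambda_i), lambda_i the eigenvalues of C1^{-1} C2."""
    lam = np.linalg.eigvals(np.linalg.solve(C1, C2)).real
    return float(np.sqrt(np.sum(np.log(lam) ** 2)))

# Hypothetical region with 200 pixels and 3 features per pixel
rng = np.random.default_rng(0)
F = rng.random((200, 3))
C = covariance_descriptor(F)
```

    The metric is zero only for identical descriptors and is invariant to common affine transformations of the features, which is what makes it suitable for matching candidate regions against a target model.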

  13. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces.

  14. Action Recognition Using Motion Primitives and Probabilistic Edit Distance

    DEFF Research Database (Denmark)

    Fihl, Preben; Holte, Michael Boelstoft; Moeslund, Thomas B.

    2006-01-01

    In this paper we describe a recognition approach based on the notion of primitives. As opposed to recognizing actions based on temporal trajectories or temporal volumes, primitive-based recognition is based on representing a temporal sequence containing an action by only a few characteristic time...... into a string containing a sequence of symbols, each representing a primitive. After pruning the string a probabilistic Edit Distance classifier is applied to identify which action best describes the pruned string. The approach is evaluated on five one-arm gestures and the recognition rate is 91...
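
    A classifier of this kind reduces, at its core, to an edit distance between symbol strings plus nearest-template matching. The sketch below uses the plain (non-probabilistic) Levenshtein distance and invented primitive strings, not the paper's probabilistic variant.

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between two
    symbol sequences (strings of primitive labels)."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[m][n]

def classify(observed, templates):
    """Assign the observed primitive string to the nearest action template."""
    return min(templates, key=lambda t: edit_distance(observed, templates[t]))
```

    With hypothetical templates such as `{"wave": "ABAB", "point": "CCD"}`, a noisy observation like `"ABB"` still lands on the closest action; the probabilistic version additionally weights edits by symbol confusion probabilities.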

  15. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Liu, Wing Kam

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress treated as a random field.
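
    The mean-and-variance propagation described above can be illustrated with a first-order (rather than the paper's second-order) perturbation sketch; the cantilever example and all its numbers are hypothetical.

```python
import numpy as np

def fosm(g, means, variances):
    """First-order second-moment propagation: response mean ~ g(mu) and
    variance ~ sum_i (dg/dx_i)^2 * var_i, sensitivities by finite differences."""
    means = np.asarray(means, dtype=float)
    g0 = g(means)
    grad = np.empty_like(means)
    for i in range(means.size):
        x = means.copy()
        h = 1e-6 * max(1.0, abs(x[i]))  # relative step for scale robustness
        x[i] += h
        grad[i] = (g(x) - g0) / h
    return g0, float(np.sum(grad**2 * np.asarray(variances, dtype=float)))

# Cantilever tip deflection delta = P L^3 / (3 E I) with uncertain P and E
def deflection(x):
    P, E = x
    L, I = 2.0, 8e-6  # hypothetical geometry (m, m^4)
    return P * L**3 / (3.0 * E * I)

mean_d, var_d = fosm(deflection, [1000.0, 2.1e11], [100.0**2, 1e10**2])
```

    The designer supplies means and variances of the inputs, exactly as in the abstract, and reads off the mean response and its variance; PFEM performs the equivalent propagation on the full finite element system.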

  16. A hybrid deterministic-probabilistic approach to model the mechanical response of helically arranged hierarchical strands

    Science.gov (United States)

    Fraldi, M.; Perrella, G.; Ciervo, M.; Bosia, F.; Pugno, N. M.

    2017-09-01

    Very recently, a Weibull-based probabilistic strategy has been successfully applied to bundles of wires to determine their overall stress-strain behaviour, also capturing previously unpredicted nonlinear and post-elastic features of hierarchical strands. This approach is based on the so-called "Equal Load Sharing (ELS)" hypothesis by virtue of which, when a wire breaks, the load acting on the strand is homogeneously redistributed among the surviving wires. Despite the overall effectiveness of the method, some discrepancies between theoretical predictions and in silico Finite Element-based simulations or experimental findings might arise when more complex structures are analysed, e.g. helically arranged bundles. To overcome these limitations, an enhanced hybrid approach is proposed in which the probability of rupture is combined with a deterministic mechanical model of a strand constituted by helically-arranged and hierarchically-organized wires. The analytical model is validated comparing its predictions with both Finite Element simulations and experimental tests. The results show that generalized stress-strain responses - incorporating tension/torsion coupling - are naturally found and, once one or more elements break, the competition between geometry and mechanics of the strand microstructure, i.e. the different cross sections and helical angles of the wires in the different hierarchical levels of the strand, determines the no longer homogeneous stress redistribution among the surviving wires whose fate is hence governed by a "Hierarchical Load Sharing" criterion.
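
    The ELS baseline that the paper extends can be simulated directly: draw Weibull strengths, let all surviving fibers share the strain equally, and record the bundle stress-strain curve. A sketch for a flat bundle only (no helical or hierarchical effects, unit fiber modulus):

```python
import numpy as np

def els_stress_strain(n_fibers=20_000, weibull_shape=2.0, seed=0):
    """Equal Load Sharing fiber bundle: all surviving linear-elastic fibers
    carry the same strain; a fiber fails once the strain exceeds its
    Weibull-distributed strength."""
    rng = np.random.default_rng(seed)
    strengths = np.sort(rng.weibull(weibull_shape, n_fibers))
    strain = np.linspace(0.0, strengths[-1], 500)
    n_broken = np.searchsorted(strengths, strain)       # fibers with strength < strain
    stress = strain * (n_fibers - n_broken) / n_fibers  # load on survivors
    return strain, stress

strain, stress = els_stress_strain()
# For shape k the curve approaches x * exp(-x^k), peaking at x = (1/k)^(1/k)
peak = stress.max()
```

    The simulated peak converges to the analytic ELS value (about 0.429 for shape 2 and unit scale); the paper's hybrid model replaces the uniform redistribution with a hierarchy- and geometry-aware "Hierarchical Load Sharing" rule.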

  17. Lessons learned on probabilistic methodology for precursor analyses

    Energy Technology Data Exchange (ETDEWEB)

    Babst, Siegfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Berlin (Germany); Wielenberg, Andreas; Gaenssmantel, Gerhard [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  18. Lessons learned on probabilistic methodology for precursor analyses

    International Nuclear Information System (INIS)

    Babst, Siegfried; Wielenberg, Andreas; Gaenssmantel, Gerhard

    2016-01-01

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  19. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order
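
    For contrast with the paper's Bayesian-network treatment of dependent actions, the classical bottom-up evaluation of an attack tree under the much stronger independence assumption looks like this; the scenario and probabilities are invented.

```python
def p_and(*ps):
    """All child attack steps must succeed (assumes independent actions)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """At least one child attack step succeeds (independence assumed)."""
    fail_all = 1.0
    for p in ps:
        fail_all *= 1.0 - p
    return 1.0 - fail_all

# Hypothetical tree: obtain credentials AND (phishing OR brute force)
p_root = p_and(0.6, p_or(0.3, 0.1))
```

    Exactly this independence assumption breaks down when actions share resources or countermeasures, which is why the paper couples attack-defense trees with Bayesian networks.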

  20. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available The analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. However, this separation is an issue in the currently used probabilistic safety assessment: the lack of treatment of dynamic interactions between the physical processes on the one hand and random events on the other limits the assessment. In general, there are a lot of mathematical modelling theories, which can be used separately or integrated in order to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for the dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of various methods applications for severe accident scenario simulation and uncertainty analysis. For this and for wider analysis of accident sequences the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyze and possibly decrease the uncertainty of this estimate.

  1. A Probabilistic Assessment of the Next Geomagnetic Reversal

    OpenAIRE

    Buffett, B; Davis, W

    2018-01-01

    ©2018. American Geophysical Union. All Rights Reserved. Deterministic forecasts for the next geomagnetic reversal are not feasible due to large uncertainties in the present-day state of the Earth's core. A more practical approach relies on probabilistic assessments using paleomagnetic observations to characterize the amplitude of fluctuations in the geomagnetic dipole. We use paleomagnetic observations for the past 2 Myr to construct a stochastic model for the axial dipole field and apply wel...

  2. Need to use probabilistic risk approach in performance assessment of waste disposal facilities

    International Nuclear Information System (INIS)

    Bonano, E.J.; Gallegos, D.P.

    1991-01-01

    Regulations governing the disposal of radioactive, hazardous, and/or mixed wastes will likely require, either directly or indirectly, that the performance of disposal facilities be assessed quantitatively. Such analyses, commonly called ''performance assessments,'' rely on the use of predictive models to arrive at a quantitative estimate of the potential impact of disposal on the environment and the safety and health of the public. It has been recognized that a suite of uncertainties affect the results of a performance assessment. These uncertainties are conventionally categorized as (1) uncertainty in the future state of the disposal system (facility and surrounding medium), (2) uncertainty in models (including conceptual models, mathematical models, and computer codes), and (3) uncertainty in data and parameters. Decisions regarding the suitability of a waste disposal facility must be made in light of these uncertainties. Hence, an approach is needed that would allow the explicit consideration of these uncertainties so that their impact on the estimated consequences of disposal can be evaluated. While most regulations for waste disposal do not prescribe the consideration of uncertainties, it is proposed that, even in such cases, a meaningful decision regarding the suitability of a waste disposal facility cannot be made without considering the impact of the attendant uncertainties. A probabilistic risk assessment (PRA) approach provides the formalism for considering the uncertainties and the technical basis that the decision makers can use in discharging their duties. A PRA methodology developed and demonstrated for the disposal of high-level radioactive waste provides a general framework for assessing the disposal of all types of wastes (radioactive, hazardous, and mixed). 15 refs., 1 fig., 1 tab

  3. Probabilistic approaches applied to damage and embrittlement of structural materials in nuclear power plants

    International Nuclear Information System (INIS)

    Vincent, L.

    2012-01-01

    The present study deals with the long-term mechanical behaviour and damage of structural materials in nuclear power plants. An experimental way is first followed to study the thermal fatigue of austenitic stainless steels with a focus on the effects of mean stress and bi-axiality. Furthermore, the measurement of displacement fields by Digital Image Correlation techniques has been successfully used to detect early crack initiation during high cycle fatigue tests. A probabilistic model based on the shielding zones surrounding existing cracks is proposed to describe the development of crack networks. A more numeric way is then followed to study the embrittlement consequences of the irradiation hardening of the bainitic steel constitutive of nuclear pressure vessels. A crystalline plasticity law, developed in agreement with lower scale results (Dislocation Dynamics), is introduced in a Finite Element code in order to run simulations on aggregates and obtain the distributions of the maximum principal stress inside a Representative Volume Element. These distributions are then used to improve the classical Local Approach to Fracture which estimates the probability for a microstructural defect to be loaded up to a critical level. (author) [fr
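
    The Local Approach to Fracture mentioned above is typically based on a weakest-link Weibull law. A one-line sketch in Beremin-type form; the modulus and scale values below are hypothetical, not the study's calibrated parameters.

```python
import math

def weibull_failure_probability(sigma_w, m=22.0, sigma_u=2600.0):
    """Weakest-link (Beremin-type) local approach to fracture: cleavage
    probability from the Weibull stress sigma_w (MPa); the modulus m and
    scale sigma_u are hypothetical material parameters."""
    return 1.0 - math.exp(-((sigma_w / sigma_u) ** m))

p_low = weibull_failure_probability(2000.0)
p_ref = weibull_failure_probability(2600.0)  # equals 1 - 1/e by construction
```

    The high modulus makes the probability extremely sensitive to the maximum principal stress, which is why the study feeds distributions of that stress from crystal-plasticity aggregate simulations into the criterion rather than a single deterministic value.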

  4. Probabilistic Seismic Risk Assessment in Manizales, Colombia:Quantifying Losses for Insurance Purposes

    Institute of Scientific and Technical Information of China (English)

    Mario A. Salgado-Gálvez; Gabriel A. Bernal; Daniela Zuloaga; Mabel C. Marulanda; Omar-Darío Cardona; Sebastián Henao

    2017-01-01

    A fully probabilistic seismic risk assessment was developed in Manizales, Colombia, considering assets of different types. The first type includes elements that are part of the water and sewage network, and the second type includes public and private buildings. This assessment required the development of a probabilistic seismic hazard analysis that accounts for the dynamic soil response, assembling high resolution exposure databases, and the development of damage models for different types of elements. The economic appraisal of the exposed assets was developed together with specialists of the water utilities company of Manizales and the city administration. The risk assessment was performed using several Comprehensive Approach to Probabilistic Risk Assessment modules as well as the R-System, obtaining results in terms of traditional metrics such as loss exceedance curve, average annual loss, and probable maximum loss. For the case of pipelines, repair rates were also estimated. The results for the water and sewage network were used in activities related to the expansion and maintenance strategies, as well as for the exploration of financial retention and transfer alternatives using insurance schemes based on technical, probabilistic, and prospective damage and loss estimations. In the case of the buildings, the results were used in the update of the technical premium values of the existing collective insurance scheme.

  5. Very Short-term Nonparametric Probabilistic Forecasting of Renewable Energy Generation - with Application to Solar Energy

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Pinson, Pierre; Gooi, Hoay Beng

    2016-01-01

    Due to the inherent uncertainty involved in renewable energy forecasting, uncertainty quantification is a key input to maintain acceptable levels of reliability and profitability in power system operation. A proposal is formulated and evaluated here for the case of solar power generation, when only...... approach to generate very short-term predictive densities, i.e., for lead times between a few minutes to one hour ahead, with fast frequency updates. We rely on an Extreme Learning Machine (ELM) as a fast regression model, trained in varied ways to obtain both point and quantile forecasts of solar power...... generation. Four probabilistic methods are implemented as benchmarks. Rival approaches are evaluated based on a number of test cases for two solar power generation sites in different climatic regions, allowing us to show that our approach results in generation of skilful and reliable probabilistic forecasts...

  6. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
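
    The statistical baseline that the compositional approach improves on is plain hit-or-miss estimation of the satisfying fraction of a bounded input domain. A sketch with an invented path condition whose true measure is known, so the estimate can be checked:

```python
import numpy as np

# Hit-or-miss estimate of the fraction of a bounded input domain that
# satisfies a path condition (hypothetical constraint x^2 + y^2 < 1)
rng = np.random.default_rng(1)
n = 500_000
x = rng.uniform(-1.0, 1.0, n)
y = rng.uniform(-1.0, 1.0, n)
fraction = np.mean(x * x + y * y < 1.0)  # true value over [-1,1]^2 is pi/4
```

    When the satisfying region is a tiny sliver of the domain, uniform sampling like this converges slowly; focusing samples on intervals that can contain solutions, as the paper does via interval constraint propagation, is what restores acceptable accuracy.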

  7. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Full Text Available Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with evermore complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when the number of processes increases. This paper reports on our work-in-progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.

  8. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for the assessment of the safety of nuclear power plants. It is a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, which generally causes the failure rate to be a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of the ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed models on the component level are inserted into the models of the probabilistic safety assessment so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of necessary data for the consideration of ageing leads to highly uncertain models and, consequently, results. (author)
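
    An age-dependent model of the kind described replaces the constant failure rate with a function of age. A minimal sketch for a linearly ageing, non-repaired component; all rates are hypothetical, and real component groups (operating, stand-by) need their own model forms.

```python
import math

def cumulative_failure_probability(lam0, a, t):
    """Failure probability by time t for a linearly ageing failure rate
    lam(t) = lam0 + a*t, no repair: F(t) = 1 - exp(-(lam0*t + a*t^2/2))."""
    return 1.0 - math.exp(-(lam0 * t + 0.5 * a * t * t))

# Constant-rate vs ageing model after 20 years (hypothetical rates per year)
f_const = cumulative_failure_probability(1e-3, 0.0, 20.0)
f_aging = cumulative_failure_probability(1e-3, 1e-4, 20.0)
```

    Even a small ageing coefficient roughly doubles the 20-year failure probability in this example, which is why the constant-rate assumption of standard PSA can understate the risk of aged plants.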

  9. Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

    International Nuclear Information System (INIS)

    Ali, S A; Kim, D-H; Cafaro, C; Giffin, A

    2012-01-01

    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.

  10. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessment for specific reactor types or components and specific risks (eg aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  11. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  12. A hybrid approach for global sensitivity analysis

    International Nuclear Information System (INIS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2017-01-01

    Distribution based sensitivity analysis (DSA) computes sensitivity of the input random variables with respect to the change in distribution of output response. Although DSA is widely appreciated as the best tool for sensitivity analysis, the computational issue associated with this method prohibits its use for complex structures involving costly finite element analysis. For addressing this issue, this paper presents a method that couples polynomial correlated function expansion (PCFE) with DSA. PCFE is a fully equivalent operational model which integrates the concepts of analysis of variance decomposition, extended bases and homotopy algorithm. By integrating PCFE into DSA, it is possible to considerably alleviate the computational burden. Three examples are presented to demonstrate the performance of the proposed approach for sensitivity analysis. For all the problems, proposed approach yields excellent results with significantly reduced computational effort. The results obtained, to some extent, indicate that proposed approach can be utilized for sensitivity analysis of large scale structures. - Highlights: • A hybrid approach for global sensitivity analysis is proposed. • Proposed approach integrates PCFE within distribution based sensitivity analysis. • Proposed approach is highly efficient.
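
    For comparison with the PCFE-accelerated method, the brute-force Monte Carlo route to variance-based first-order sensitivity indices (a pick-freeze Sobol estimator, rather than the distribution-based measure the paper couples with PCFE) can be sketched as follows; the toy model is chosen so the exact indices are known.

```python
import numpy as np

def first_order_sobol(model, dim, n=200_000, seed=0):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices for
    independent U(0,1) inputs (brute-force sketch, not PCFE-accelerated)."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    yA = model(A)
    mean2, var = yA.mean() ** 2, yA.var()
    S = []
    for i in range(dim):
        C = B.copy()
        C[:, i] = A[:, i]  # freeze input i, resample all the others
        S.append(float((np.mean(yA * model(C)) - mean2) / var))
    return S

# Toy model y = 3*x1 + x2: analytic indices are S1 = 0.9, S2 = 0.1
S = first_order_sobol(lambda X: 3.0 * X[:, 0] + X[:, 1], dim=2)
```

    Each index costs an extra batch of model evaluations, which is exactly the burden a surrogate such as PCFE removes when each evaluation is a costly finite element run.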

  13. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed

  14. Safety of long-distance pipelines. Probabilistic and deterministic aspects; Sicherheit von Rohrfernleitungen. Probabilistik und Deterministik im Vergleich

    Energy Technology Data Exchange (ETDEWEB)

    Hollaender, Robert [Leipzig Univ. (Germany). Inst. fuer Infrastruktur und Ressourcenmanagement

    2013-03-15

    The Committee for Long-Distance Pipelines (Berlin, Federal Republic of Germany) reported on the relation between deterministic and probabilistic approaches in order to contribute to a better understanding of the safety management of long-distance pipelines. The respective strengths and weaknesses, as well as the deterministic and probabilistic fundamentals of safety management, are described. The comparison covers fundamental aspects, but is essentially shaped by the special character of the technical installation 'long-distance pipeline' as an infrastructure project spread across the landscape. This special feature results in particular operating conditions and related responsibilities. However, our legal system does not grant long-distance pipelines the same legal position as other infrastructure facilities such as roads and railways. Thus, the question of whether and how the impacts of land use in the vicinity of long-distance pipelines have to be considered is repeatedly the starting point for the discussion of probabilistic and deterministic approaches.

  15. Risk assessment using probabilistic standards

    International Nuclear Information System (INIS)

    Avila, R.

    2004-01-01

    A core element of risk is uncertainty represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if a sufficiently long time frame is assumed. Equally, rain risk does not exist if there was 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, any risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared, this approach is commonly called 'deterministic'. Its main advantage lies in the simplicity and in that it requires minimum information. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs for realizing the reference values are high. In those cases, the lack of knowledge on the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we will outline an approach for dealing with uncertainties that in our opinion is more consistent. We will call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the latter are obtained from comparison of two probabilistic distributions, one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes.
Our first aim
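
    The comparison of two distributions described in this record can be sketched directly. In this hedged stdlib illustration (all distribution parameters are invented), risk is the probability that an uncertain exposure exceeds an uncertain standard:

```python
import random

random.seed(1)

def prob_exceedance(n=100_000):
    # Monte Carlo estimate of P(exposure > standard) when both the
    # outcome and the reference value are uncertain (lognormal here).
    hits = 0
    for _ in range(n):
        exposure = random.lognormvariate(0.0, 0.5)   # uncertain outcome
        standard = random.lognormvariate(1.0, 0.25)  # uncertain standard
        hits += exposure > standard
    return hits / n

risk = prob_exceedance()
print(f"P(exposure > standard) ~ {risk:.3f}")
```

    A deterministic comparison of a single conservative exposure against a single reference value collapses this whole calculation into one inequality, which is exactly the information loss the record criticizes.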

  16. Global Kalman filter approaches to estimate absolute angles of lower limb segments.

    Science.gov (United States)

    Nogueira, Samuel L; Lambrecht, Stefan; Inoue, Roberto S; Bortole, Magdo; Montagnoli, Arlindo N; Moreno, Juan C; Rocon, Eduardo; Terra, Marco H; Siqueira, Adriano A G; Pons, Jose L

    2017-05-16

    In this paper we propose the use of global Kalman filters (KFs) to estimate absolute angles of lower limb segments. Standard approaches adopt KFs to improve the performance of inertial sensors based on individual link configurations. In consequence, for a multi-body system like a lower limb exoskeleton, the inertial measurements of one link (e.g., the shank) are not taken into account in other link angle estimations (e.g., foot). Global KF approaches, on the other hand, correlate the collective contribution of all signals from lower limb segments observed in the state-space model through the filtering process. We present a novel global KF (matricial global KF) relying only on inertial sensor data, and validate both this KF and a previously presented global KF (Markov Jump Linear Systems, MJLS-based KF), which fuses data from inertial sensors and encoders from an exoskeleton. We furthermore compare both methods to the commonly used local KF. The results indicate that the global KFs performed significantly better than the local KF, with an average root mean square error (RMSE) of respectively 0.942° for the MJLS-based KF, 1.167° for the matricial global KF, and 1.202° for the local KFs. Including the data from the exoskeleton encoders also resulted in a significant increase in performance. The results indicate that the current practice of using KFs based on local models is suboptimal. Both the presented KF based on inertial sensor data, as well as our previously presented global approach fusing inertial sensor data with data from exoskeleton encoders, were superior to local KFs. We therefore recommend using global KFs for gait analysis and exoskeleton control.
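
    The "local KF" baseline the paper argues against can be sketched minimally. This is a one-state filter for a single segment angle, fusing a noisy rate (gyro-like) prediction with a noisy absolute-angle (accelerometer-like) measurement; all noise values are invented, and a global filter would instead stack all segment angles into one state vector so every measurement informs every link:

```python
import math
import random

random.seed(2)

def local_kf(true_angles, dt=0.01, q=1e-4, r=4e-2):
    x, p = 0.0, 1.0            # state estimate and its variance
    est = []
    prev = true_angles[0]
    for ang in true_angles:
        rate = (ang - prev) / dt + random.gauss(0.0, 0.5)  # noisy rate signal
        prev = ang
        x, p = x + rate * dt, p + q                        # predict
        z = ang + random.gauss(0.0, math.sqrt(r))          # noisy angle measurement
        k = p / (p + r)                                    # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p                # update
        est.append(x)
    return est

truth = [math.sin(0.01 * i) for i in range(1000)]
est = local_kf(truth)
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(truth, est)) / len(truth))
print(f"RMSE ~ {rmse:.4f} rad")
```

    The filtered RMSE falls well below the raw measurement noise (0.2 rad here), and the paper's point is that coupling the links in one global state reduces it further.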

  17. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  18. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  19. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
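
    The simplest example of a numerical routine that reports its own uncertainty, in the spirit of this record, is Monte Carlo integration: the estimate comes with a standard error. A toy stdlib sketch (the target integral, sin(x) on [0, pi] with exact value 2, is chosen for illustration):

```python
import math
import random

random.seed(3)

def mc_integrate(f, a, b, n=20_000):
    # Estimate the integral of f on [a, b] and the standard error
    # of that estimate, so the caller gets value AND uncertainty.
    xs = [random.uniform(a, b) for _ in range(n)]
    ys = [f(x) for x in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    est = (b - a) * mean
    stderr = (b - a) * math.sqrt(var / n)
    return est, stderr

est, stderr = mc_integrate(math.sin, 0.0, math.pi)
print(f"integral ~ {est:.3f} +/- {stderr:.3f}")
```

    Probabilistic numerics generalizes this pattern: quadrature, linear solvers and ODE solvers are recast so that their output is a distribution over the true answer rather than a bare number.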

  20. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.
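
    One illustrative Bayesian reading of "probabilistic confidence" (the record's exact formulation may differ; the test numbers are invented) is: after k failures in n prototype tests, uncertainty about the true failure probability is a Beta(k+1, n-k+1) posterior, and confidence is the posterior probability that the true value does not exceed a target:

```python
import random

random.seed(4)

def confidence(k, n, target, draws=100_000):
    # Posterior probability that the true failure probability is
    # at or below the target, sampled from the Beta posterior.
    hits = sum(random.betavariate(k + 1, n - k + 1) <= target
               for _ in range(draws))
    return hits / draws

c = confidence(k=0, n=50, target=0.05)
print(f"confidence ~ {c:.3f}")
```

    Even with zero observed failures, fifty tests leave a non-trivial chance that the true failure probability exceeds 5%, which is the kind of sampling-variability effect the record flags for prototype-tested designs.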

  1. Classification of Company Performance using Weighted Probabilistic Neural Network

    Science.gov (United States)

    Yasin, Hasbi; Waridi Basyiruddin Arifin, Adi; Warsito, Budi

    2018-05-01

    The performance of a company can be judged by examining its financial status, whether in a good or bad state. Classification of company performance can be achieved by either parametric or non-parametric approaches. The neural network is one of the non-parametric methods. One of the Artificial Neural Network (ANN) models is the Probabilistic Neural Network (PNN). A PNN consists of four layers, i.e. input layer, pattern layer, addition layer, and output layer. The distance function used is the Euclidean distance, and each class shares the same values as its weights. This study uses a PNN modified in the weighting process between the pattern layer and the addition layer by incorporating the Mahalanobis distance. This model is called the Weighted Probabilistic Neural Network (WPNN). The results show that modeling the company's performance with the WPNN model achieves a very high accuracy of 100%.
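
    The layer structure described in the record can be sketched on toy data. This hedged illustration (the data, smoothing parameter and a diagonal per-class covariance are invented; it is not the authors' WPNN) shows the pattern layer as Gaussian kernels, the addition layer as a per-class average, and the WPNN twist of using a Mahalanobis rather than Euclidean distance:

```python
import math

# Invented two-feature training data for two performance classes.
TRAIN = {
    "good": [(2.0, 8.0), (2.5, 7.5), (1.8, 8.2)],
    "bad":  [(7.0, 2.0), (6.5, 2.8), (7.4, 1.9)],
}

def class_cov_diag(points):
    # Diagonal covariance per class (floored to keep the distance defined).
    n = len(points)
    means = [sum(p[i] for p in points) / n for i in (0, 1)]
    return [max(sum((p[i] - means[i]) ** 2 for p in points) / n, 1e-6)
            for i in (0, 1)]

def classify(x, sigma=1.0):
    scores = {}
    for label, pts in TRAIN.items():
        var = class_cov_diag(pts)
        s = 0.0
        for p in pts:  # pattern layer: one Gaussian kernel per training point
            d2 = sum((x[i] - p[i]) ** 2 / var[i] for i in (0, 1))  # Mahalanobis
            s += math.exp(-d2 / (2 * sigma ** 2))
        scores[label] = s / len(pts)   # addition layer: class average
    return max(scores, key=scores.get) # output layer: argmax over classes
```

    For instance, `classify((2.2, 7.8))` falls in the `"good"` cluster. Weighting by class covariance lets elongated clusters score points along their spread higher than a plain Euclidean kernel would.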

  2. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  3. Probabilistic safety assessment - regulatory perspective

    International Nuclear Information System (INIS)

    Solanki, R.B.; Paul, U.K.; Hajra, P.; Agarwal, S.K.

    2002-01-01

    Full text: Nuclear power plants (NPPs) have been designed, constructed and operated mainly based on deterministic safety analysis philosophy. In this approach, a substantial amount of safety margin is incorporated in the design and operational requirements. Additional margin is incorporated by applying the highest quality engineering codes, standards and practices, and the concept of defence-in-depth in design and operating procedures, by including conservative assumptions and acceptance criteria in plant response analysis of postulated initiating events (PIEs). However, as the probabilistic approach has been improved and refined over the years, it is possible for the designer, operator and regulator to get a more detailed and realistic picture of the safety importance of plant design features, operating procedures and operational practices by using probabilistic safety assessment (PSA) along with the deterministic methodology. At present, many countries including the USA, UK and France are using PSA insights in their decision making along with the deterministic basis. India has also made substantial progress in the development of methods for carrying out PSA. However, consensus on the use of PSA in regulatory decision-making has not yet been achieved. This paper emphasises the requirements (e.g., level of detail, key modelling assumptions, data, modelling aspects, success criteria, sensitivity and uncertainty analysis) for improving the quality and consistency in performance and use of PSA that can facilitate meaningful use of the PSA insights in regulatory decision-making in India. This paper also provides relevant information on the international scenario and various application areas of PSA along with progress made in India. The PSA perspective presented in this paper may help in achieving consensus on the use of PSA for regulatory/utility decision-making in design and operation of NPPs

  4. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  5. Trait-Dependent Biogeography: (Re)Integrating Biology into Probabilistic Historical Biogeographical Models.

    Science.gov (United States)

    Sukumaran, Jeet; Knowles, L Lacey

    2018-04-20

    The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegration of biology into probabilistic historical biogeographical models, allowing a broader range of questions about biogeographical processes beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, as well as allowing us to assess how inferences about ancestral ranges themselves might be impacted by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    Science.gov (United States)

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  7. Integrated probabilistic assessment for DHC initiation, growth and leak-before-break of PHWR pressure tubes

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Young-Jin [Power Engineering Research Institute, KEPCO Engineering and Construction, 188 Gumi-dong, Bundang-gu, Seongnam-si, Gyeonggi-do 463-870 (Korea, Republic of); Chang, Yoon-Suk, E-mail: yschang@khu.ac.kr [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 446-701 (Korea, Republic of)

    2014-08-15

    Highlights: • We develop an integrated approach for probabilistic assessment of PHWR pressure tube. • We examine probabilities of DHC initiation, growth, penetration and LBB failure. • The proposed approach is helpful for calculating rupture probabilities of reactor flaws even in the case of very low rupture probability. - Abstract: A few hundred zirconium alloy pressure tubes in a pressurized heavy water reactor (PHWR) serve as the nuclear fuel channel, as well as the reactor coolant pressure boundary. The pressure tubes are inspected periodically and a fitness-for-service assessment (FFSA) must be conducted if any flaw is detected in the inspection. A Canadian standard provides FFSA procedures for PHWR pressure tubes, which include probabilistic assessment for flaws considering delayed hydride cracking (DHC) and leak-before-break (LBB). In the present study, an integrated approach with detailed stepwise calculation procedures and integration methodology for probabilistic assessment of pressure tubes was developed. In the first step of this approach, the probability of DHC initiation, growth and penetration for a single initial flaw is calculated. In the next step, the probability of LBB failure (i.e., tube rupture) for a single through-wall crack (TWC) is calculated. Finally, a rupture probability for all initial flaws in a reactor can be calculated using the penetration probability for a single flaw and the LBB failure probability for a single TWC, as well as the predicted total number of initial flaws in the reactor.

  8. Integrated probabilistic assessment for DHC initiation, growth and leak-before-break of PHWR pressure tubes

    International Nuclear Information System (INIS)

    Oh, Young-Jin; Chang, Yoon-Suk

    2014-01-01

    Highlights: • We develop an integrated approach for probabilistic assessment of PHWR pressure tube. • We examine probabilities of DHC initiation, growth, penetration and LBB failure. • The proposed approach is helpful for calculating rupture probabilities of reactor flaws even in the case of very low rupture probability. - Abstract: A few hundred zirconium alloy pressure tubes in a pressurized heavy water reactor (PHWR) serve as the nuclear fuel channel, as well as the reactor coolant pressure boundary. The pressure tubes are inspected periodically and a fitness-for-service assessment (FFSA) must be conducted if any flaw is detected in the inspection. A Canadian standard provides FFSA procedures for PHWR pressure tubes, which include probabilistic assessment for flaws considering delayed hydride cracking (DHC) and leak-before-break (LBB). In the present study, an integrated approach with detailed stepwise calculation procedures and integration methodology for probabilistic assessment of pressure tubes was developed. In the first step of this approach, the probability of DHC initiation, growth and penetration for a single initial flaw is calculated. In the next step, the probability of LBB failure (i.e., tube rupture) for a single through-wall crack (TWC) is calculated. Finally, a rupture probability for all initial flaws in a reactor can be calculated using the penetration probability for a single flaw and the LBB failure probability for a single TWC, as well as the predicted total number of initial flaws in the reactor

  9. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  10. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  11. Towards a probabilistic regional reanalysis system for Europe: evaluation of precipitation from experiments

    Directory of Open Access Journals (Sweden)

    Liselotte Bach

    2016-11-01

    A new development in the field of reanalyses is the incorporation of uncertainty estimation capabilities. We have developed a probabilistic regional reanalysis system for the CORDEX-EUR11 domain that is based on the numerical weather prediction model COSMO at a 12-km grid spacing. The lateral boundary conditions of all ensemble members are provided by the global reanalysis ERA-Interim. In the basic implementation of the system, uncertainties due to observation errors are estimated. Atmospheric assimilation of conventional observations perturbed by means of random samples of observation error yields estimates of the reanalysis uncertainty conditioned to observation errors. The data assimilation employed is a new scheme based on observation nudging that we denote ensemble nudging. The lower boundary of the atmosphere is regularly updated by external snow depth, sea surface temperature and soil moisture analyses. One of the most important purposes of reanalyses is the estimation of so-called essential climate variables. For regional reanalyses, precipitation has been identified as one of the essential climate variables that are potentially better represented than in other climate data sets. For that reason, we assess the representation of precipitation in our system in a pilot study. Based on two experiments, each of which extends over one month, we conduct a preliminary comparison to the global reanalysis ERA-Interim, a dynamical downscaling of the latter and the high-resolution regional reanalysis COSMO-REA6. In the next step, we assess our reanalysis system's probabilistic capabilities versus the ECMWF-EPS in terms of six-hourly precipitation sums. The added value of our probabilistic regional reanalysis system motivates the current production of a 5-year-long test reanalysis COSMO-EN-REA12 in the framework of the FP7-funded project Uncertainties in Ensembles of Regional Re-Analyses (UERRA).

  12. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
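
    To stay self-contained, the sketch below does not use PyMC3 itself; it shows the kind of MCMC machinery such frameworks automate, via a stdlib random-walk Metropolis sampler for the posterior of a normal mean (flat prior, known sigma = 1, invented data). PyMC3 replaces this hand-written loop with a declarative model and a tuned Hamiltonian sampler:

```python
import math
import random
import statistics

random.seed(5)

DATA = [1.2, 0.8, 1.5, 1.1, 0.9, 1.3]

def log_post(mu):
    # Log-posterior of the mean: flat prior + normal likelihood (sigma = 1).
    return -0.5 * sum((x - mu) ** 2 for x in DATA)

def metropolis(n=20_000, step=0.5):
    mu, lp = 0.0, log_post(0.0)
    chain = []
    for _ in range(n):
        prop = mu + random.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:   # accept/reject
            mu, lp = prop, lp_prop
        chain.append(mu)
    return chain[n // 2:]   # discard the first half as burn-in

chain = metropolis()
print(f"posterior mean ~ {statistics.mean(chain):.2f}")
```

    With a flat prior the posterior mean settles near the sample mean of the data, which is a convenient sanity check on any sampler.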

  13. Probabilistic design of fibre concrete structures

    Science.gov (United States)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation is now a well-established methodology for evaluation of the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or reliability index. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties obtained from material tests is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials such as carbonation of concrete, corrosion of reinforcement, etc. can be accounted for in order to analyze life-cycle structural performance and to enable prediction of the structural reliability and safety in time development. The results can serve as a rational basis for design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
The presented

  14. Probabilistically modeling lava flows with MOLASSES

    Science.gov (United States)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows through Cellular Automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
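
    The cellular-automaton idea can be sketched in a few lines. This is a minimal illustration in the spirit of the record, not the MOLASSES algorithm itself: lava is pulsed into a vent cell, and any thickness above a residual value spreads to lower neighbours. The grid, cone-shaped DEM and volumes are arbitrary illustration values; a probabilistic run would wrap this in a loop over sampled vent locations and volumes and accumulate an inundation-frequency map.

```python
W = 21
# Cone-shaped elevation model with the peak (and vent) at the centre.
dem = [[20.0 - (abs(x - 10) + abs(y - 10)) for x in range(W)] for y in range(W)]
lava = [[0.0] * W for _ in range(W)]

def spread(pulses=60, pulse=1.0, residual=0.5, max_steps=200_000):
    steps = 0
    for _ in range(pulses):
        lava[10][10] += pulse          # pulse lava into the vent cell
        active = [(10, 10)]
        while active and steps < max_steps:
            steps += 1
            y, x = active.pop()
            excess = lava[y][x] - residual
            if excess <= 0:
                continue
            here = dem[y][x] + lava[y][x]
            # Neighbours whose total surface (rock + lava) is lower.
            lower = [(y + dy, x + dx)
                     for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= y + dy < W and 0 <= x + dx < W
                     and dem[y + dy][x + dx] + lava[y + dy][x + dx] < here]
            if not lower:
                continue
            lava[y][x] = residual      # keep the residual, share the excess
            for ny, nx in lower:
                lava[ny][nx] += excess / len(lower)
                active.append((ny, nx))

spread()
inundated = sum(v > 0 for row in lava for v in row)
print(f"inundated cells: {inundated}")
```

    Because lava is only moved, never created or destroyed inside the spreading loop, total erupted volume is conserved, which is a useful invariant to check in any such simulator.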

  15. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  16. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.

    Science.gov (United States)

    Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin

    2007-07-01

    Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been successfully used in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principle approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending and supplementing these methods, we then propose a supplementary approach for quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the areal specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.

  17. Research on probabilistic assessment method based on the corroded pipeline assessment criteria

    International Nuclear Information System (INIS)

    Zhang Guangli; Luo, Jinheng; Zhao Xinwei; Zhang Hua; Zhang Liang; Zhang Yi

    2012-01-01

    Pipeline integrity assessments are performed using conventional deterministic approaches, even though there are many uncertainties in the parameters of the assessment. In this paper, a probabilistic assessment method is provided for gas pipelines with corrosion defects, based on the current corroded-pipe evaluation criteria, and the failure probability of corroded pipelines due to the uncertainties in loadings, material properties and measurement accuracy is estimated using the Monte Carlo technique. Furthermore, a sensitivity analysis approach is introduced to rank the influence of the various random variables on the safety of the pipeline, and a method to determine the critical defect size based on an acceptable failure probability is proposed. Highlights: ► The Folias factor in pipeline corrosion assessment methods was analyzed. ► The probabilistic method was applied in corrosion assessment methods. ► The influence of the assessment variables on the reliability of the pipeline was ranked. ► The acceptable failure probability was used to determine the critical defect size.
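The Monte Carlo estimate described in this record can be sketched as follows; the burst-pressure model is a simplified B31G-style expression with a Folias bulging factor, and every distribution and parameter value below is illustrative rather than taken from the paper:

```python
import math
import random

def burst_pressure(smts, t, D, d, L):
    """Simplified B31G-style burst pressure for a corrosion defect.
    smts: tensile strength (MPa), t: wall thickness (mm), D: diameter (mm),
    d: defect depth (mm), L: defect length (mm)."""
    M = math.sqrt(1 + 0.8 * (L ** 2) / (D * t))  # Folias (bulging) factor
    flow_stress = 1.1 * smts
    return (2 * flow_stress * t / D) * (1 - d / t) / (1 - d / (t * M))

def failure_probability(p_op=10.0, n=20_000, seed=1):
    """Estimate P(burst pressure < operating pressure) by Monte Carlo,
    sampling material strength, geometry and the measured defect size."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        smts = rng.gauss(480.0, 20.0)   # tensile strength scatter
        t = rng.gauss(8.0, 0.3)         # wall-thickness tolerance
        d = rng.gauss(3.0, 0.5)         # defect depth incl. measurement error
        L = rng.gauss(100.0, 10.0)      # defect length
        if burst_pressure(smts, t, 600.0, d, L) < p_op:
            failures += 1
    return failures / n

pf = failure_probability()
```

Reducing the standard deviation of one input at a time and re-running the loop is a quick way to reproduce the kind of sensitivity ranking the authors describe.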

  18. Probabilistic analysis of modernization options

    International Nuclear Information System (INIS)

    Wunderlich, W.O.; Giles, J.E.

    1991-01-01

    This paper reports on benefit-cost analysis for hydropower operations, a standard procedure for reaching planning decisions. Cost overruns and benefit shortfalls are common occurrences. One reason for the difficulty of predicting future benefits and costs is that they usually cannot be represented with sufficient reliability by single, precise values, because of the many uncertainties that enter the analysis through assumptions on inputs and system parameters. Therefore, ranges of variables need to be analyzed instead of single values. As a consequence, the decision criteria, such as net benefit and the benefit-cost ratio, also vary over some range. A probabilistic approach is demonstrated as a tool for assessing the reliability of the results.
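As a minimal sketch of such a probabilistic benefit-cost analysis (all monetary ranges hypothetical, not taken from the paper), a Monte Carlo loop over triangular ranges yields a distribution of the benefit-cost ratio rather than a single value:

```python
import random

def bc_ratio_distribution(n=10_000, seed=0):
    """Propagate ranges of benefits and costs through Monte Carlo and
    report the probability that the benefit-cost ratio exceeds 1."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n):
        benefit = rng.triangular(80.0, 150.0, 110.0)  # annual benefit (low, high, mode), M$
        cost = rng.triangular(70.0, 140.0, 95.0)      # annual cost (low, high, mode), M$
        ratios.append(benefit / cost)
    ratios.sort()
    return {
        "p_ratio_gt_1": sum(r > 1.0 for r in ratios) / n,  # chance the project pays off
        "median": ratios[n // 2],
        "p05": ratios[int(0.05 * n)],
        "p95": ratios[int(0.95 * n)],
    }

stats = bc_ratio_distribution()
```

The 5th-95th percentile band is exactly the "range of the decision criteria" the abstract refers to: a single-point estimate above 1 can coexist with a substantial probability of a ratio below 1.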

  19. Examining the global health arena: strengths and weaknesses of a convention approach to global health challenges.

    Science.gov (United States)

    Haffeld, Just Balstad; Siem, Harald; Røttingen, John-Arne

    2010-01-01

    The article presents a conceptual framework to analyze the strengths and weaknesses of a global health convention. The analyses are inspired by Lawrence Gostin's suggested Framework Convention on Global Health. The analytical model takes as its starting point events that tentatively follow a logical sequence: Input (global health funding), Processes (coordination, cooperation, accountability, allocation of aid), Output (definition of basic survival needs), Outcome (access to health services), and Impact (health for all). It then examines to what degree binding international regulations can create order in such a sequence of events. We conclude that a global health convention could be an appropriate instrument to deal with some of the problems of global health. We also show that some of the tasks preceding a convention approach might be to muster international support for supra-national health regulations, negotiate compromises between existing stakeholders in the global health arena, and utilize WHO as a platform for further discussions on a global health convention. © 2010 American Society of Law, Medicine & Ethics, Inc.

  20. Current status and future expectation concerning probabilistic risk assessment of NPPs. 1. Features and issues of probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Yamashita, Masahiro

    2012-01-01

    Probabilistic risk assessment (PRA) of Nuclear Power Plants (NPPs) can play an important role in assuring the safety of NPPs. However, PRA has not always been used effectively, as indicated in the Japanese government's report on the Fukushima Daiichi NPP accident. At the Risk Technical Committee (RTC) of the Standards Committee of the Atomic Energy Society of Japan, preparation of standards (implementing criteria) focusing on PRA methodology and investigation of the basic philosophy for the use of PRA have been in progress. Based on the activities of the RTC, a series of three articles, including this one, describes the current status of and future expectations for probabilistic risk assessment of NPPs. This article introduces features and issues of PRA methodology related to the use of PRA. The features of the PRA methodology are (1) systematic and comprehensive understanding of risk, (2) support of a graded approach, (3) identification of effective safety upgrade measures and (4) quantitative understanding of the effects of uncertainty. The issues of the PRA methodology are (1) extension of the PRA application area, (2) upgrading of the PRA methodology, (3) quality assurance of PRA, (4) treatment of uncertainty and (5) quantitative evaluation criteria. (T. Tanaka)

  1. Probabilistic analysis of fires in nuclear plants

    International Nuclear Information System (INIS)

    Unione, A.; Teichmann, T.

    1985-01-01

    The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) which maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependences) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs

  2. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  3. Development of a regional ensemble prediction method for probabilistic weather prediction

    International Nuclear Information System (INIS)

    Nohara, Daisuke; Tamura, Hidetoshi; Hirakuchi, Hiromaru

    2015-01-01

    A regional ensemble prediction method has been developed to provide probabilistic weather prediction using a numerical weather prediction model. To obtain perturbations consistent with the synoptic weather pattern, both initial and lateral boundary perturbations were given by differences between the control run and ensemble members of the Japan Meteorological Agency (JMA)'s operational one-week ensemble forecast. The method provides multiple ensemble members with a horizontal resolution of 15 km for 48 hours, based on a downscaling of the JMA's operational global forecast combined with the perturbations. The ensemble prediction was examined for the heavy snowfall event in the Kanto area on January 14, 2013. The results showed that the predictions represent different features of the high-resolution spatiotemporal distribution of precipitation, affected by the intensity and location of the extra-tropical cyclone in each ensemble member. Although the ensemble prediction has model biases in the mean values and variances of some variables, such as wind speed and solar radiation, it has the potential to add probabilistic information to a deterministic prediction. (author)

  4. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    Science.gov (United States)

    Seth J. Wenger; Nicholas A. Som; Daniel C. Dauwalter; Daniel J. Isaak; Helen M. Neville; Charles H. Luce; Jason B. Dunham; Michael K. Young; Kurt D. Fausch; Bruce E. Rieman

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing...

  5. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  6. Robust stabilisation of time-varying delay systems with probabilistic uncertainties

    Science.gov (United States)

    Jiang, Ning; Xiong, Junlin; Lam, James

    2016-09-01

    For robust stabilisation of time-varying delay systems, only sufficient conditions are available to date. A natural question is as follows: if the existing sufficient conditions are not satisfied, and hence no controllers can be found, what can one do to improve the stability performance of time-varying delay systems? This question is addressed in this paper when there is a probabilistic structure on the parameter uncertainty set. A randomised algorithm is proposed to design a state-feedback controller, which stabilises the system over the uncertainty domain in a probabilistic sense. The capability of the designed controller is quantified by the probability of stability of the resulting closed-loop system. The accuracy of the solution obtained from the randomised algorithm is also analysed. Finally, numerical examples are used to illustrate the effectiveness and advantages of the developed controller design approach.
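A toy version of the randomised idea, reduced to a delay-free second-order closed loop so that Routh's criterion gives an exact stability test per sample (the system, gains, and uncertainty range are invented for illustration, not taken from the paper):

```python
import random

def probability_of_stability(k1=2.0, k2=3.0, n=50_000, seed=7):
    """Randomised assessment: sample the uncertain plant parameter `a` and
    count the fraction of samples for which the closed loop
    x'' + k2*x' + (k1 + a)*x = 0 is stable (Routh: k2 > 0 and k1 + a > 0)."""
    rng = random.Random(seed)
    stable = 0
    for _ in range(n):
        a = rng.uniform(-3.0, 3.0)  # probabilistic uncertainty on the plant
        if k2 > 0 and k1 + a > 0:
            stable += 1
    return stable / n

p_stab = probability_of_stability()
```

Here the exact probability of stability is P(a > -k1) = 5/6, so the empirical fraction quantifies the designed controller's capability in exactly the sense the abstract describes: a controller accepted not because it stabilises the whole uncertainty set, but because it stabilises a sufficiently large measure of it.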

  7. Reliability of structures of industrial installations. Theory and applications of probabilistic mechanics

    International Nuclear Information System (INIS)

    Procaccia, H.; Morilhat, P.; Carle, R.; Menjon, G.

    1996-01-01

    The management of the service life of mechanical materials implies an evaluation of their risk of failure during use. To evaluate this risk, the following methods are used: classical frequentist statistics applied to experience feedback data concerning failures noticed during operation of active parts (pumps, valves, exchangers, circuit breakers, etc.); the Bayesian approach in the case of scarce statistical data, when experts are needed to compensate for the lack of information; and the structural reliability approach when no data are available and a theoretical model of degradation must be used, in particular for passive structures (pressure vessels, pipes, tanks, etc.). The aim of this book is to describe the principles and applications of this third approach to industrial installations. Chapter 1 recalls the historical aspects of the probabilistic approach to the reliability of structures and the existing codes. Chapter 2 presents the level 1 deterministic method applied so far for the design of passive structures. The Cornell reliability index, already used in civil engineering codes, is defined in chapter 3. The Hasofer and Lind reliability index, a generalization of the Cornell index, is defined in chapter 4. Chapter 5 concerns the application of probabilistic approaches to optimization studies, with the introduction of the economic variables linked to the risk and the possible actions to limit this risk (in-service inspection, maintenance, repair, etc.). Chapters 6 and 7 describe the Monte Carlo simulation and approximation methods for failure probability calculations, and recall the fracture mechanics basis and the models of loading and degradation of industrial installations. Some applications are given in chapter 9, with the quantification of the safety margins of a cracked pipe and the optimization of the in-service inspection policy of a steam generator. Chapter 10 raises the problem of the coupling between mechanical and reliability analyses.

  8. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real time; any faults in software or hardware, or their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
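The Bayesian-network reasoning an ISWHM performs can be illustrated, at a drastically reduced scale, by exact enumeration over a three-state fault node and two binary sensor nodes; all probability tables below are invented for illustration and do not come from the paper:

```python
# Priors and conditional probability tables (illustrative numbers only)
P_fault = {"ok": 0.96, "sw": 0.03, "hw": 0.01}
P_timing_given = {"ok": 0.01, "sw": 0.70, "hw": 0.10}  # P(timing overrun | fault)
P_range_given = {"ok": 0.01, "sw": 0.20, "hw": 0.80}   # P(value out of range | fault)

def posterior(timing_overrun, out_of_range):
    """Diagnose the fault state by exact enumeration over the tiny network:
    joint(f) = prior(f) * P(sensor1 | f) * P(sensor2 | f), then normalise."""
    joint = {}
    for f in P_fault:
        pt = P_timing_given[f] if timing_overrun else 1 - P_timing_given[f]
        pr = P_range_given[f] if out_of_range else 1 - P_range_given[f]
        joint[f] = P_fault[f] * pt * pr
    z = sum(joint.values())
    return {f: p / z for f, p in joint.items()}

diag = posterior(timing_overrun=True, out_of_range=False)
```

With a timing overrun but in-range values, the posterior mass shifts to the software fault. A compiled arithmetic circuit, as in the paper, computes the same quantities but with a fixed, bounded evaluation cost suitable for real-time embedded execution.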

  9. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    Science.gov (United States)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
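The surrogate idea can be sketched with a one-dimensional stand-in: replace a Paris-law crack-growth integrator (the "expensive" model) with a piecewise-linear interpolant trained on a small grid, then run the probabilistic remaining-useful-life loop against the cheap surrogate. All material constants and distributions below are illustrative:

```python
import bisect
import math
import random

def cycles_to_failure(a0, af=0.02, C=1e-11, m=3.0, dsig=100.0, steps=2000):
    """'High-fidelity' stand-in: integrate Paris-law crack growth
    da/dN = C*(dsig*sqrt(pi*a))^m from a0 to af (crack sizes in metres)."""
    n, a = 0.0, a0
    da = (af - a0) / steps
    for _ in range(steps):
        rate = C * (dsig * math.sqrt(math.pi * a)) ** m
        n += da / rate
        a += da
    return n

# Surrogate: piecewise-linear interpolation of the expensive model on a grid
grid = [0.001 + i * (0.01 - 0.001) / 30 for i in range(31)]
values = [cycles_to_failure(a) for a in grid]

def surrogate(a0):
    i = min(max(bisect.bisect(grid, a0) - 1, 0), len(grid) - 2)
    w = (a0 - grid[i]) / (grid[i + 1] - grid[i])
    return values[i] * (1 - w) + values[i + 1] * w

def rul_quantiles(n=5_000, seed=5):
    """Probabilistic remaining useful life: sample the diagnosed crack size
    and evaluate the cheap surrogate instead of re-running the integrator."""
    rng = random.Random(seed)
    lives = sorted(surrogate(min(max(rng.gauss(0.004, 0.001), 0.0011), 0.0099))
                   for _ in range(n))
    return lives[int(0.05 * n)], lives[n // 2]  # conservative 5th pct, median

p5_life, median_life = rul_quantiles()
```

The same pattern scales up: the paper's surrogate is trained on 3D finite element crack-growth runs, but the payoff is identical, thousands of posterior samples at interpolation cost instead of full-model cost.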

  10. Deterministic and Probabilistic Serviceability Assessment of Footbridge Vibrations due to a Single Walker Crossing

    Directory of Open Access Journals (Sweden)

    Cristoforo Demartino

    2018-01-01

    This paper presents a numerical study on the deterministic and probabilistic serviceability assessment of footbridge vibrations due to a single walker crossing. The dynamic response of the footbridge is analyzed by means of modal analysis, considering only the first lateral and vertical modes. Single-span footbridges with uniform mass distribution are considered, with different values of span length, natural frequencies, mass and structural damping, and with different support conditions. The load induced by a single walker crossing the footbridge is modeled as a moving sinusoidal force in either the lateral or the vertical direction. The variability of the characteristics of the load induced by walkers is modeled using probability distributions taken from the literature, defining a Standard Population of walkers. Deterministic and probabilistic approaches are adopted to assess the peak response. Based on the results of the simulations, deterministic and probabilistic vibration serviceability assessment methods that do not require numerical analyses are proposed. Finally, an example of the application of the proposed method to a truss steel footbridge is presented. The results highlight the advantages of the probabilistic procedure in terms of reliability quantification.
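A single-mode sketch of the moving-walker load model (modal force = force amplitude × pacing harmonic × mode shape at the walker's position) with a Monte Carlo loop over walker properties; every parameter value below is illustrative and not taken from the paper:

```python
import math
import random

def peak_modal_acceleration(f_n=2.0, zeta=0.005, m_modal=40_000.0,
                            P=280.0, f_w=2.0, v=1.5, L=50.0, dt=0.005):
    """Peak acceleration of one vertical mode under a walker crossing:
    modal force = P*sin(2*pi*f_w*t)*sin(pi*v*t/L), stepped with a
    central-difference scheme until the walker leaves the span."""
    w_n = 2 * math.pi * f_n
    c = 2 * zeta * w_n          # damping term per unit modal mass
    k = w_n ** 2                # stiffness term per unit modal mass
    q_prev, q = 0.0, 0.0
    peak, t, T = 0.0, 0.0, L / v
    while t < T:
        F = P * math.sin(2 * math.pi * f_w * t) * math.sin(math.pi * v * t / L)
        acc = F / m_modal - c * (q - q_prev) / dt - k * q
        q_next = 2 * q - q_prev + acc * dt * dt
        peak = max(peak, abs(acc))
        q_prev, q = q, q_next
        t += dt
    return peak

def prob_exceedance(limit=0.7, n=200, seed=3):
    """Probabilistic serviceability check: sample pacing frequency and force
    amplitude for each walker, return P(peak acceleration > limit)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        f_w = rng.gauss(1.9, 0.15)   # pacing-frequency scatter (Hz)
        P = rng.gauss(280.0, 70.0)   # dynamic load amplitude (N)
        if peak_modal_acceleration(P=P, f_w=f_w) > limit:
            hits += 1
    return hits / n

a_peak = peak_modal_acceleration()
p_exceed = prob_exceedance()
```

The deterministic check keeps only `peak_modal_acceleration` with characteristic values; the probabilistic one turns the same model into an exceedance probability, which is what the reliability quantification in the abstract refers to.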

  11. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  12. Integrated assessment of the global warming problem: A decision-analytical approach

    International Nuclear Information System (INIS)

    Van Lenthe, J.; Hendrickx, L.; Vlek, C.A.J.

    1994-12-01

    The multi-disciplinary character of the global warming problem calls for an integrated assessment approach for ordering and combining the various physical, ecological, economic, and sociological results. The Netherlands initiated its own National Research Program (NRP) on Global Air Pollution and Climate Change. The first phase (NRP-1) identified integration as one of five central research themes. The second phase (NRP-2) shows a growing concern for integrated assessment issues. The current two-year research project 'Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy options', which started in September 1993, comes under the integrated assessment part of the Dutch NRP. The first part of the interim report describes the search for an integrated assessment methodology. It starts by emphasizing the need for integrated assessment at a relatively high level of aggregation and from a policy point of view. The conclusion is that a decision-analytical approach may fit the purpose of a policy-oriented integrated modeling of the global warming problem. The discussion proceeds with an account of decision analysis and its explicit incorporation and analysis of uncertainty. Then influence diagrams, a relatively recent development in decision analysis, are introduced as a useful decision-analytical approach for integrated assessment. Finally, a software environment for creating and analyzing complex influence diagram models is discussed. The second part of the interim report provides a first, provisional integrated modeling of the global warming problem, emphasizing the illustration of the decision-analytical approach. Major problem elements are identified and an initial problem structure is developed. The problem structure is described in terms of hierarchical influence diagrams. In some places the qualitative structure is filled in with quantitative data.

  13. Probabilistic Modeling of High-Temperature Material Properties of a 5-Harness 0/90 Sylramic Fiber/ CVI-SiC/ MI-SiC Woven Composite

    Science.gov (United States)

    Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh

    1998-01-01

    An integrated probabilistic approach has been developed to assess composites for high-temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. The approach quantified the influences of uncertainties inherent in the constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 °F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive-variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.

  14. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    International Nuclear Information System (INIS)

    Rucker, D.F.

    2000-01-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences of accident scenarios, to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends both assessments be conducted to ensure that "an adequate level of safety has been achieved and that no major contributors to risk are overlooked" (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during the waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDFs) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000 iterations.
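The replacement of single-point factors by sampled PDFs can be sketched as below; the multiplicative structure follows the usual source-term five-factor formula, but all ranges, units and the final conversion constant are toy values, not those of the WIPP SAR:

```python
import math
import random

def cede_distribution(n=10_000, seed=42):
    """Monte Carlo dose assessment: each single-point factor of a
    deterministic source-term equation is replaced by a sampled PDF
    (all distributions illustrative, not taken from the WIPP SAR)."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        mar = rng.lognormvariate(math.log(50.0), 0.5)    # material at risk
        dr = rng.uniform(0.1, 1.0)                       # damage ratio
        arf = rng.uniform(1e-4, 1e-3)                    # airborne release fraction
        rf = rng.uniform(0.2, 0.7)                       # respirable fraction
        lpf = rng.uniform(0.01, 0.1)                     # leak path factor
        chi_q = rng.lognormvariate(math.log(5e-4), 0.7)  # dispersion factor (s/m^3)
        dose = mar * dr * arf * rf * lpf * chi_q * 330.0  # toy breathing-rate x dose-conversion factor
        doses.append(dose)
    doses.sort()
    return doses[n // 2], doses[int(0.95 * n)]  # median and 95th percentile

med_dose, p95_dose = cede_distribution()
```

Comparing the 95th percentile against the deterministic single-point result is the core of the probabilistic complement the report describes: it shows how far into the tail the conservative point estimate actually sits.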

  16. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach is described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changing the distribution type of the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of the assumed probability distributions. It was found that Monte Carlo simulation by a Latin hypercube sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. The change of distribution types of the input parameters showed that the values calculated by the deterministic method lay around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of the inputs, however, the values calculated by the deterministic method lay around the 85th percentile of the output distribution function. The results of the sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
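A stdlib-only sketch of the approach: convert three-estimates (low/mode/high) into triangular distributions, propagate them with a hand-rolled Latin hypercube sampler, and read off percentiles of the total cost. Component names and ranges are illustrative, not the paper's data:

```python
import math
import random

def lhs(n, dims, rng):
    """Latin hypercube sample of n points in [0,1]^dims: one value per
    stratum per dimension, strata randomly paired across dimensions."""
    cols = []
    for _ in range(dims):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

def fuel_cycle_cost(n=2_000, seed=11):
    """Propagate three-estimate (low/mode/high) unit costs through LHS.
    Component ranges are illustrative, in mills/kWh."""
    rng = random.Random(seed)
    components = {
        "uranium":     (1.5, 2.5, 5.0),
        "conversion":  (0.2, 0.3, 0.5),
        "enrichment":  (0.8, 1.2, 1.8),
        "fabrication": (0.5, 0.7, 1.0),
        "back_end":    (0.8, 1.2, 2.0),
    }
    names = list(components)
    totals = []
    for point in lhs(n, len(names), rng):
        total = 0.0
        for u, name in zip(point, names):
            lo, mode, hi = components[name]
            # inverse CDF of the triangular distribution
            fc = (mode - lo) / (hi - lo)
            if u < fc:
                total += lo + math.sqrt(u * (hi - lo) * (mode - lo))
            else:
                total += hi - math.sqrt((1 - u) * (hi - lo) * (hi - mode))
        totals.append(total)
    totals.sort()
    return totals[n // 2], totals[int(0.85 * n)]  # median and 85th percentile

med_cost, p85_cost = fuel_cycle_cost()
```

Because the triangular inputs are right-skewed, a deterministic sum of the modes falls below the distribution mean, which is the kind of percentile displacement the abstract reports for its deterministic estimate.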

  17. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    Science.gov (United States)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently further significantly improved the performance of the probabilistic model of ADAPT. While these experiments are obtained for an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.

  18. Probabilistic Approach to Provide Scenarios of Earthquake-Induced Slope Failures (PARSIFAL Applied to the Alcoy Basin (South Spain

    Directory of Open Access Journals (Sweden)

    Salvatore Martino

    2018-02-01

    The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake-Induced slope FAiLures) approach was applied in the basin of Alcoy (Alicante, South Spain) to provide a comprehensive scenario of earthquake-induced landslides. The basin of Alcoy is well known for several historical landslides, mainly earth-slides, that involve urban settlements as well as infrastructure (i.e., roads, bridges). PARSIFAL overcomes several limits of other approaches, allowing the concomitant analysis of: (i) first-time landslides (due to both rock-slope failures and shallow earth-slides) and reactivations of existing landslides; (ii) slope stability analyses of different failure mechanisms; (iii) comprehensive mapping of earthquake-induced landslide scenarios in terms of exceedance probability of critical threshold values of co-seismic displacements. Geotechnical data were used to constrain the slope stability analysis, while specific field surveys were carried out to measure the jointing and strength conditions of rock masses and to inventory already existing landslides. GIS-based susceptibility analyses were performed to assess the proneness to shallow earth-slides as well as to verify kinematic compatibility with planar or wedge rock-slides and with topples. The application of PARSIFAL to the Alcoy basin: (i) confirms the suitability of the approach at a municipality scale, (ii) highlights the main role of saturation in conditioning slope instabilities in this case study, and (iii) demonstrates the reliability of the obtained results with respect to the historical data.

  19. Probabilistic assessment of calcium carbonate export and dissolution in the modern ocean

    OpenAIRE

    Battaglia Gianna; Steinacher Marco; Joos Fortunat

    2016-01-01

    The marine cycle of calcium carbonate (CaCO3) is an important element of the carbon cycle and co-governs the distribution of carbon and alkalinity within the ocean. However, CaCO3 export fluxes and mechanisms governing CaCO3 dissolution are highly uncertain. We present an observationally constrained, probabilistic assessment of the global and regional CaCO3 budgets. Parameters governing pelagic CaCO3 export fluxes and dissolution rates are sampled using a Monte Carlo sche...

  20. Assessing performance and validating finite element simulations using probabilistic knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first assesses the probability of failure by simultaneously modeling all likely events: the probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram; Latin hypercube sampling is used to stochastically assess events, and the overall probability of failure is taken as the maximum probability of failure over all events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate the finite element predictions.