WorldWideScience

Sample records for poisson probabilistic approaches

  1. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    Science.gov (United States)

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
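
    The core statistical point is easy to reproduce: randomly thinning a Poisson spike train through an unreliable synapse leaves the counts Poisson, so the Fano factor (variance/mean) stays near 1. A minimal sketch of that property (parameters are illustrative, not the authors' network model):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    rate, T, p_release, n_trials = 50.0, 1.0, 0.5, 100_000  # Hz, s, release prob., trials

    # Presynaptic spike counts are Poisson; probabilistic transmission thins
    # each count binomially, which preserves Poisson statistics exactly.
    pre_counts = rng.poisson(rate * T, size=n_trials)
    post_counts = rng.binomial(pre_counts, p_release)

    fano = post_counts.var() / post_counts.mean()
    print(f"Fano factor of transmitted counts: {fano:.3f}")  # ~1, i.e. Poisson-like
    ```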

  2. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, the pitch of gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  3. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust…

  4. The probabilistic solution of stochastic oscillators with even nonlinearity under Poisson excitation

    Science.gov (United States)

    Guo, Siu-Siu; Er, Guo-Kang

    2012-06-01

    The probabilistic solutions of nonlinear stochastic oscillators with even nonlinearity driven by Poisson white noise are investigated in this paper. The stationary probability density function (PDF) of the oscillator responses governed by the reduced Fokker-Planck-Kolmogorov equation is obtained with the exponential-polynomial closure (EPC) method. Different types of nonlinear oscillators are considered. Monte Carlo simulation is conducted to examine the effectiveness and accuracy of the EPC method in this case. It is found that the PDF solutions obtained with EPC agree well with those obtained with Monte Carlo simulation, especially in the tail regions of the PDFs of oscillator responses. Numerical analysis shows that the mean of displacement is nonzero and the PDF of displacement is nonsymmetric about its mean when there is even nonlinearity in displacement in the oscillator. Numerical analysis further shows that the mean of velocity always equals zero and the PDF of velocity is symmetrically distributed about its mean.
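
    As a rough illustration of the setting (not the EPC solver itself), one can Monte Carlo simulate an oscillator with an assumed quadratic (even) nonlinearity under Poisson white noise and observe the nonzero mean and asymmetric displacement PDF described above; all parameters below are invented for the sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # x'' + 2*zeta*x' + x + eps*x**2 = dC/dt, with C a compound Poisson process.
    zeta, eps = 0.2, 0.5            # damping ratio, even-nonlinearity strength
    lam, sigma = 20.0, 0.3          # impulse arrival rate (1/s), impulse std
    dt, n_steps = 1e-3, 200_000

    x, v, samples = 0.0, 0.0, []
    for k in range(n_steps):
        K = rng.poisson(lam * dt)               # impulses arriving in [t, t+dt)
        dC = rng.normal(0.0, sigma, K).sum()    # summed impulse magnitudes
        v += (-2*zeta*v - x - eps*x*x) * dt + dC
        x += v * dt
        if k > n_steps // 2:                    # discard the transient
            samples.append(x)

    s = np.asarray(samples)
    print("mean displacement:", s.mean())       # nonzero, due to the even term
    print("skewness:", ((s - s.mean())**3).mean() / s.std()**3)  # asymmetric PDF
    ```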

  5. Analytic Bayesian solution of the two-stage Poisson-type problem in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Frohner, F.H.

    1985-01-01

    The basic purpose of probabilistic risk analysis is to make inferences about the probabilities of various postulated events, with an account of all relevant information such as prior knowledge and operating experience with the specific system under study, as well as experience with other similar systems. Estimation of the failure rate of a Poisson-type system leads to an especially simple Bayesian solution in closed form if the prior probability implied by the invariance properties of the problem is properly taken into account. This basic simplicity persists if a more realistic prior, representing order of magnitude knowledge of the rate parameter, is employed instead. Moreover, the more realistic prior allows direct incorporation of experience gained from other similar systems, without the need to postulate a statistical model for an underlying ensemble. The analytic formalism is applied to actual nuclear reactor data.
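
    The closed-form solution alluded to here is the standard gamma-Poisson conjugate update; a minimal sketch under an assumed gamma prior (hyperparameters and operating record are invented, not Fröhner's invariance-based prior):

    ```python
    from scipy import stats

    # Hypothetical operating record: n failures over total exposure time T.
    n_failures, T_hours = 2, 50_000.0

    # Gamma prior on the Poisson failure rate (conjugate family); alpha0, beta0
    # would encode order-of-magnitude prior knowledge or experience with
    # similar systems.
    alpha0, beta0 = 0.5, 1_000.0

    # Closed-form posterior: Gamma(alpha0 + n, beta0 + T).
    posterior = stats.gamma(a=alpha0 + n_failures, scale=1.0 / (beta0 + T_hours))
    print("posterior mean rate [1/h]:", posterior.mean())
    print("95% credible upper bound:", posterior.ppf(0.95))
    ```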

  6. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihood obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the…
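
    For reference, the probability gain mentioned above is simply the event probability conditional on the precursor divided by the unconditional (background) probability; a toy calculation with invented counts:

    ```python
    # Probability gain of a precursor: conditional event probability given the
    # precursor, divided by the unconditional (background) probability.
    # All counts are illustrative.
    n_events_in_alarm_windows = 8
    n_alarm_windows = 20
    p_background = 0.05                # per-window background probability

    p_conditional = n_events_in_alarm_windows / n_alarm_windows
    print("probability gain G =", p_conditional / p_background)  # G > 1: informative
    ```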

  7. A dictionary learning approach for Poisson image deblurring.

    Science.gov (United States)

    Ma, Liyan; Moisan, Lionel; Yu, Jian; Zeng, Tieyong

    2013-07-01

    The restoration of images corrupted by blur and Poisson noise is a key issue in medical and biological image processing. While most existing methods are based on variational models, generally derived from a maximum a posteriori (MAP) formulation, sparse representations of images have recently been shown to be efficient approaches for image recovery. Following this idea, we propose in this paper a model containing three terms: a patch-based sparse representation prior over a learned dictionary, the pixel-based total variation regularization term and a data-fidelity term capturing the statistics of Poisson noise. The resulting optimization problem can be solved by an alternating minimization technique combined with variable splitting. Extensive experimental results suggest that in terms of visual quality, peak signal-to-noise ratio value and the method noise, the proposed algorithm outperforms state-of-the-art methods.
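
    A plausible reconstruction of the kind of three-term objective described, with notation and weights of my own choosing (not copied from the paper): a patch-wise sparse-coding prior over a learned dictionary D, total variation, and the Poisson negative log-likelihood as data fidelity,

    $$\min_{u,\{\alpha_i\}}\ \sum_i \left( \|R_i u - D\alpha_i\|_2^2 + \lambda \|\alpha_i\|_0 \right) \;+\; \mu\,\mathrm{TV}(u) \;+\; \sum_x \big[ (Hu)(x) - f(x)\log (Hu)(x) \big],$$

    where R_i extracts the i-th patch, H is the blur operator, f the observed counts, and λ, μ are regularization weights. The alternating minimization mentioned above would then cycle between sparse coding of the patches and a TV-Poisson image update.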

  8. An approach to numerically solving the Poisson equation

    Science.gov (United States)

    Feng, Zhichen; Sheng, Zheng-Mao

    2015-06-01

    We introduce an approach for numerically solving the Poisson equation by using a physical model, which is a way to solve a partial differential equation without the finite difference method. This method is especially useful for obtaining the solutions in very many free-charge neutral systems with open boundary conditions. It can be used for arbitrary geometry and mesh style and is more efficient compared with the widely used iterative algorithms with multigrid methods. It is especially suitable for parallel computing. This method can also be applied to numerically solving other partial differential equations whose Green functions exist in analytic expression.
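
    Since the abstract leans on analytic Green functions with open boundary conditions, the flavour of the computation can be sketched as direct superposition of the free-space Green function over source charges (a toy stand-in, not the authors' physical-model algorithm):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    eps0 = 8.854e-12
    charges = rng.normal(size=100)                  # q_i (toy values)
    positions = rng.uniform(-1, 1, size=(100, 3))   # r_i

    def potential(r):
        """phi(r) = sum_i q_i / (4*pi*eps0*|r - r_i|), open boundary conditions."""
        d = np.linalg.norm(positions - r, axis=1)
        return np.sum(charges / (4 * np.pi * eps0 * d))

    print(potential(np.array([2.0, 0.0, 0.0])))     # evaluate outside the cloud
    ```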

  9. Probabilistic approach to pattern selection

    Science.gov (United States)

    Yukalov, V. I.

    2001-03-01

    The problem of pattern selection arises when the evolution equations have many solutions, whereas observed patterns constitute a much more restricted set. An approach is advanced for treating the problem of pattern selection by defining the probability distribution of patterns. Then the most probable pattern naturally corresponds to the largest probability weight. This approach provides the ordering principle for the multiplicity of solutions explaining why some of them are more preferable than others. The approach is applied to solving the problem of turbulent photon filamentation in resonant media.

  10. Probabilistic solution of nonlinear oscillators excited by combined Gaussian and Poisson white noises

    Science.gov (United States)

    Zhu, H. T.; Er, G. K.; Iu, V. P.; Kou, K. P.

    2011-06-01

    The stationary probability density function (PDF) solution of the stochastic response of nonlinear oscillators is investigated in this paper. The external excitation is assumed to be a combination of Gaussian and Poisson white noises. The PDF solution is governed by the generalized Kolmogorov equation which is solved by the exponential-polynomial closure (EPC) method. In order to evaluate the effectiveness of the EPC method, different nonlinear oscillators are considered in numerical analysis. Nonlinearity exists either in displacement or in velocity for these nonlinear oscillators. The impulse arrival rate, mono-modal PDF and bi-modal PDF are also considered in this study. Compared to the PDF given by Monte Carlo simulation, the EPC method presents good agreement with the simulated result, which can also be observed in the tail region of the PDF solution.

  11. A probabilistic approach to controlling crevice chemistry

    International Nuclear Information System (INIS)

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-01-01

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved.

  12. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. This book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. The approaches and the software package are readily available to geotechnical practitioners, sparing them the underlying reliability computation algorithms. Readers, including non-specialists, will find useful information for determining project-specific statistics of geotechnical properties and for performing probabilistic analyses of slope stability.

  13. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    DEFF Research Database (Denmark)

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

    This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV…
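
    The abstract is truncated, but prediction intervals of this kind are commonly built from paired lower/upper quantile models; a hedged sketch of that idea on synthetic PV-like data using statsmodels' QuantReg (related in spirit, though the paper's linear programming formulation may differ; data and model form are invented):

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.regression.quantile_regression import QuantReg

    rng = np.random.default_rng(3)

    # Toy stand-in for PV data: power as a noisy function of irradiance.
    irradiance = rng.uniform(0.0, 1.0, 500)
    power = 0.8 * irradiance + 0.1 * irradiance * rng.standard_normal(500)

    X = sm.add_constant(irradiance)
    lower = QuantReg(power, X).fit(q=0.05).predict(X)   # 5% quantile model
    upper = QuantReg(power, X).fit(q=0.95).predict(X)   # 95% quantile model
    print("empirical coverage of the 90% PI:",
          np.mean((power >= lower) & (power <= upper)))
    ```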

  14. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
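
    One standard way to realize such guidance (possibly differing from the paper's synthesis in detail) is to give every agent the same column-stochastic transition matrix whose stationary distribution equals the desired density, e.g. built by a Metropolis-Hastings recipe; damage to the distribution then decays automatically as agents keep transitioning:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    pi_target = np.array([0.4, 0.3, 0.2, 0.1])   # desired density over 4 regions
    n = len(pi_target)

    # Metropolis-Hastings synthesis: column j holds the transition probabilities
    # out of region j; pi_target is stationary by detailed balance.
    M = np.zeros((n, n))
    for j in range(n):
        for i in range(n):
            if i != j:
                M[i, j] = (1.0 / n) * min(1.0, pi_target[i] / pi_target[j])
        M[j, j] = 1.0 - M[:, j].sum()

    # Every agent transitions independently; no communication is needed.
    agents = rng.integers(0, n, size=100_000)    # arbitrary initial spread
    cdf = np.cumsum(M, axis=0)                   # column-wise CDFs
    for _ in range(50):
        u = rng.random(agents.size)
        agents = (u[:, None] > cdf.T[agents]).sum(axis=1)

    print(np.bincount(agents, minlength=n) / agents.size)  # ~pi_target
    ```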

  15. Overview of the probabilistic risk assessment approach

    International Nuclear Information System (INIS)

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table

  16. A probabilistic approach to crack instability

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  17. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. For a further, improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  18. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donations" and "number of blood deferrals": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excessive zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimating the parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Eventually the double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted to the data and were compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
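
    A common construction of a correlated bivariate zero-inflated Poisson pair is trivariate reduction: a shared Poisson component induces the correlation, and a common zero indicator produces the excess (0, 0) pairs. A simulation sketch with invented parameters (illustrating the structure, not the paper's fitted model):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    n, p_zero = 10_000, 0.3            # donors, zero-inflation probability
    lam0, lam1, lam2 = 0.5, 2.0, 1.0   # shared and outcome-specific Poisson means

    returner = rng.random(n) >= p_zero          # zero pair for never-returners
    shared = rng.poisson(lam0, n)               # shared component -> correlation
    donations = returner * (shared + rng.poisson(lam1, n))
    deferrals = returner * (shared + rng.poisson(lam2, n))

    print("corr:", np.corrcoef(donations, deferrals)[0, 1])
    print("fraction of (0, 0) pairs:", np.mean((donations == 0) & (deferrals == 0)))
    ```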

  19. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for a Poisson observable at a given confidence level, with inclusion of systematic uncertainties in the background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.
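
    The structure of such a calculation can be sketched numerically: marginalise the Poisson likelihood over priors on background and efficiency, then integrate the posterior up to the desired confidence level (an illustrative re-implementation in the spirit of BPULE, not the FORTRAN code itself; priors and numbers are invented):

    ```python
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(6)

    n_obs = 5                                                  # observed counts
    b = np.clip(rng.normal(2.0, 0.5, 20_000), 0.0, None)       # background prior
    eff = np.clip(rng.normal(0.9, 0.05, 20_000), 1e-3, None)   # efficiency prior

    s_grid = np.linspace(0.0, 20.0, 401)
    # Marginal likelihood L(s) = E_{b,eff}[ Poisson(n_obs; eff*s + b) ].
    L = np.array([poisson.pmf(n_obs, s * eff + b).mean() for s in s_grid])

    ds = s_grid[1] - s_grid[0]
    post = L / (L.sum() * ds)                  # flat prior on s, normalised
    s_up = s_grid[np.searchsorted(np.cumsum(post) * ds, 0.90)]
    print(f"90% CL upper limit: s < {s_up:.2f}")
    ```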

  20. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno

    2013-02-26

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.
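
    The Poisson-Gamma model is the negative binomial, which captures exactly the overdispersion the authors report; a toy sketch of the detection idea on synthetic coverage (moment-fit NB, flag extreme-tail windows; the actual method is a full hierarchical fit):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Synthetic overdispersed coverage (Poisson-Gamma = negative binomial),
    # with one amplified and one deleted region planted.
    n_win, mean_cov, overdisp = 1000, 50.0, 5.0
    lam = rng.gamma(shape=mean_cov / overdisp, scale=overdisp, size=n_win)
    cov = rng.poisson(lam)
    cov[300:310] *= 3          # amplification
    cov[700:710] //= 10        # deletion

    # Moment-fit NB (scipy's (r, p) parameterisation), flag extreme tails.
    m, v = cov.mean(), cov.var()
    p = m / v                  # valid because v > m (overdispersion)
    r = m * p / (1.0 - p)
    lo, hi = stats.nbinom.ppf([0.001, 0.999], r, p)
    print("flagged windows:", np.where((cov < lo) | (cov > hi))[0])
    ```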

  1. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  2. Information fusion in signal and image processing major probabilistic and non-probabilistic numerical approaches

    CERN Document Server

    Bloch, Isabelle

    2010-01-01

    The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this book offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory and belief functions).

  3. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.
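
    A minimal PRM construction, for orientation (the obstacle, sampling density and connection rule are all invented; real planners use more careful collision checking and neighbour search):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    centre, radius, k = np.array([0.5, 0.5]), 0.2, 6   # one disc obstacle

    def is_free(q):
        return np.linalg.norm(q - centre) > radius

    def edge_free(a, b, steps=20):                     # naive edge check
        return all(is_free(a + t * (b - a)) for t in np.linspace(0.0, 1.0, steps))

    nodes = [q for q in rng.random((300, 2)) if is_free(q)]
    edges = set()
    for i, q in enumerate(nodes):
        dists = [np.linalg.norm(q - p) for p in nodes]
        for j in np.argsort(dists)[1:k + 1]:           # k nearest neighbours
            if edge_free(q, nodes[j]):
                edges.add((min(i, j), max(i, j)))
    print(len(nodes), "nodes,", len(edges), "edges")
    ```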

  4. Poisson regression approach for modeling fatal injury rates amongst Malaysian workers

    International Nuclear Information System (INIS)

    Kamarulzaman Ibrahim; Heng Khai Theng

    2005-01-01

    Many safety studies are based on the analysis carried out on injury surveillance data. The injury surveillance data gathered for the analysis include information on the number of employees at risk of injury in each of several strata, where the strata are defined in terms of a series of important predictor variables. Further insight into the relationship between fatal injury rates and predictor variables may be obtained by the Poisson regression approach. Poisson regression is widely used in analyzing count data. In this study, Poisson regression is used to model the relationship between fatal injury rates and predictor variables, which are year (1995-2002), gender, recording system and industry type. Data for the analysis were obtained from PERKESO and Jabatan Perangkaan Malaysia. It is found that the assumption that the data follow a Poisson distribution is violated. After correction for the problem of overdispersion, the predictor variables that are found to be significant in the model are gender, system of recording, industry type, and two interaction effects (between recording system and industry type and between year and industry type).
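
    The modelling step described here corresponds to a Poisson GLM with the log of the number of workers as an offset; a hedged sketch with a made-up data frame (not the PERKESO data), including the overdispersion check that motivated the authors' correction:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Made-up strata, NOT the PERKESO data: deaths per stratum with the
    # number of workers as exposure.
    df = pd.DataFrame({
        "deaths":   [12, 30, 7, 19],
        "workers":  [10_000, 25_000, 8_000, 15_000],
        "industry": ["manuf", "constr", "manuf", "constr"],
        "year":     ["1995", "1995", "1996", "1996"],
    })
    X = sm.add_constant(
        pd.get_dummies(df[["industry", "year"]], drop_first=True).astype(float))

    res = sm.GLM(df["deaths"], X, family=sm.families.Poisson(),
                 offset=np.log(df["workers"])).fit()
    print(res.params)
    # Overdispersion check: a ratio well above 1 motivates quasi-Poisson or
    # negative binomial, as in the abstract's correction.
    print("Pearson chi2 / df:", res.pearson_chi2 / res.df_resid)
    ```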

  5. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-07-01

    Purpose: The presented research results are aimed at improving the theoretical basics of computer vision and artificial intelligence of dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation and computer-based simulation at the verification stage of development. Results: The proposed approach for object detection and recognition in video stream processing has shown several advantages in comparison with existing methods due to its simple realization and short data processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be a part of artificial intelligence in navigation and control systems.

  6. Application of probabilistic risk based optimization approaches in environmental restoration

    International Nuclear Information System (INIS)

    Goldammer, W.

    1995-01-01

    The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to consequences of low-probability events such as earthquakes or major floods. Risks and financial costs are combined into an overall figure of detriment, allowing one to distinguish between the benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding reclamation planning, using an example from the German reclamation program for uranium mining and milling sites.

  7. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  8. A Poisson regression approach for modelling spatial autocorrelation between geographically referenced observations.

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Jolley, Damien

    2011-10-03

    Analytic methods commonly used in epidemiology do not account for spatial correlation between observations. In regression analyses, omission of that autocorrelation can bias parameter estimates and yield incorrect standard error estimates. We used age standardised incidence ratios (SIRs) of esophageal cancer (EC) from the Babol cancer registry from 2001 to 2005, and extracted socioeconomic indices from the Statistical Centre of Iran. The following models for SIR were used: (1) Poisson regression with agglomeration-specific nonspatial random effects; (2) Poisson regression with agglomeration-specific spatial random effects. Distance-based and neighbourhood-based autocorrelation structures were used for defining the spatial random effects and a pseudolikelihood approach was applied to estimate model parameters. The Bayesian information criterion (BIC), Akaike's information criterion (AIC) and adjusted pseudo R2, were used for model comparison. A Gaussian semivariogram with an effective range of 225 km best fit spatial autocorrelation in agglomeration-level EC incidence. The Moran's I index was greater than its expected value indicating systematic geographical clustering of EC. The distance-based and neighbourhood-based Poisson regression estimates were generally similar. When residual spatial dependence was modelled, point and interval estimates of covariate effects were different to those obtained from the nonspatial Poisson model. The spatial pattern evident in the EC SIR and the observation that point estimates and standard errors differed depending on the modelling approach indicate the importance of accounting for residual spatial correlation in analyses of EC incidence in the Caspian region of Iran. Our results also illustrate that spatial smoothing must be applied with care.

  9. A Poisson regression approach for modelling spatial autocorrelation between geographically referenced observations

    Directory of Open Access Journals (Sweden)

    Jolley Damien

    2011-10-01

    Background: Analytic methods commonly used in epidemiology do not account for spatial correlation between observations. In regression analyses, omission of that autocorrelation can bias parameter estimates and yield incorrect standard error estimates. Methods: We used age standardised incidence ratios (SIRs) of esophageal cancer (EC) from the Babol cancer registry from 2001 to 2005, and extracted socioeconomic indices from the Statistical Centre of Iran. The following models for SIR were used: (1) Poisson regression with agglomeration-specific nonspatial random effects; (2) Poisson regression with agglomeration-specific spatial random effects. Distance-based and neighbourhood-based autocorrelation structures were used for defining the spatial random effects and a pseudolikelihood approach was applied to estimate model parameters. The Bayesian information criterion (BIC), Akaike's information criterion (AIC) and adjusted pseudo R2 were used for model comparison. Results: A Gaussian semivariogram with an effective range of 225 km best fit spatial autocorrelation in agglomeration-level EC incidence. The Moran's I index was greater than its expected value indicating systematic geographical clustering of EC. The distance-based and neighbourhood-based Poisson regression estimates were generally similar. When residual spatial dependence was modelled, point and interval estimates of covariate effects were different to those obtained from the nonspatial Poisson model. Conclusions: The spatial pattern evident in the EC SIR and the observation that point estimates and standard errors differed depending on the modelling approach indicate the importance of accounting for residual spatial correlation in analyses of EC incidence in the Caspian region of Iran. Our results also illustrate that spatial smoothing must be applied with care.

  10. A new approach to probabilistic belief change

    CSIR Research Space (South Africa)

    Rens, G

    2015-05-01

    …and there are various probability distributions one could attach to the eight worlds. We shall use this cellphone scenario as a running example throughout the paper. We shall refer to the operations of belief revision, expansion, contraction and update collectively... contained in the KB is represented, as long as one can derive which worlds are possible and their assigned probabilities. In this paper, we suggest a new unified approach to redistribute the probability mass of worlds in a KB resulting from any belief...

  11. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
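
    A stripped-down illustration of the two steps on synthetic body-weight data (the "survey" numbers, the subgroups and the lognormal choice are all assumptions for the sketch):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)

    # Step 1: archetypal body-weight distributions for two subpopulations
    # (stand-in survey samples; lognormal is an assumed parametric form).
    bw_adult = np.clip(rng.normal(80, 15, 2000), 40, None)
    bw_child = np.clip(rng.normal(30, 8, 2000), 10, None)
    archetypes = {name: stats.lognorm(*stats.lognorm.fit(x, floc=0))
                  for name, x in [("adult", bw_adult), ("child", bw_child)]}

    # Step 2: scenario-specific input distribution, sampling the archetypes
    # according to an assumed site demographic mix (70% adults).
    is_adult = rng.random(10_000) < 0.7
    bw = np.where(is_adult,
                  archetypes["adult"].rvs(size=10_000, random_state=rng),
                  archetypes["child"].rvs(size=10_000, random_state=rng))
    print("scenario mean body weight:", bw.mean())
    ```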

  12. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  13. A probabilistic approach to Radiological Environmental Impact Assessment

    International Nuclear Information System (INIS)

    Avila, Rodolfo; Larsson, Carl-Magnus

    2001-01-01

    Since a radiological environmental impact assessment typically relies on limited data and poorly based extrapolation methods, point estimations, as implied by a deterministic approach, do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks to non-human biota of a radioactive contamination. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined

  14. A Multifaceted Approach for Safety Design and Probabilistic Optimization

    Directory of Open Access Journals (Sweden)

    Po Ting Lin

    2015-01-01

    Safety design and probabilistic optimization are fields that are widely subject to uncertainty, making traditional deterministic methods highly unreliable for these two fields. Popular design optimization methods widely used for safety design and probabilistic optimization are the performance measure approach (PMA) and the reliability index approach (RIA). A modified reliability index approach (MRIA) is employed to address the inability of the traditional RIA to find optimal solutions when the analysis starts from an infeasible design space. The PMA uses an inverse reliability analysis, which is more computationally efficient at finding the most probable design points but has been reported to have numerical instabilities in some cases. In this paper, three benchmark examples were thoroughly studied with various initial points to examine the stability and efficiency of the MRIA and PMA. A hybrid reliability approach (HRA) was then presented after determining a selection factor from the optimum conditions. The proposed HRA aims to determine which of the two optimization methods would be more appropriate during the optimization processes.

  15. Probabilistic Fuzzy Approach to Evaluation of Logistics Service Effectiveness

    Directory of Open Access Journals (Sweden)

    Rudnik Katarzyna

    2014-12-01

    Logistics service providers offer a whole or partial logistics business service over a certain time period. Between such companies, the effectiveness of specific logistics services can vary. Logistics service providers seek the effective performance of logistics service. The purpose of this paper is to present a new approach for the evaluation of logistics service effectiveness, along with a specific computer system implementing the proposed approach – a sophisticated inference system, an extension of the Mamdani probabilistic fuzzy system. The paper presents specific knowledge concerning the relationships between effectiveness indicators in the form of fuzzy rules which contain marginal and conditional probabilities of fuzzy events. An inference diagram is also shown. A family of Yager's parameterized t-norms is proposed as inference operators. It facilitates the optimization of system parameters and enables flexible adjustment of the system to empirical data. A case study was used to illustrate the new approach for the evaluation of logistics service effectiveness. The approach is demonstrated on logistics services in a logistics company. We deem the analysis of a probabilistic fuzzy knowledge base to be useful for the evaluation of effectiveness of logistics services in a logistics company over a given time period.

  16. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
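
    The flavour of the moment-bounding step can be sketched with a local optimiser over the hyper-rectangle of interval endpoints (a crude stand-in: the paper derives efficient polynomial-time algorithms, and the variance maximum is attained at a vertex, which a local method is not guaranteed to find):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Each observation is known only as an interval [lo_i, hi_i] (toy data).
    lo = np.array([1.0, 2.5, 0.5, 3.0])
    hi = np.array([2.0, 4.0, 2.0, 3.5])

    bounds = list(zip(lo, hi))
    mean_bounds = (lo.mean(), hi.mean())     # bounds on the mean are immediate

    # Bounds on the variance require optimising over the endpoint hyper-rectangle.
    var_min = minimize(np.var, x0=(lo + hi) / 2, bounds=bounds).fun
    var_max = -minimize(lambda x: -np.var(x), x0=hi, bounds=bounds).fun
    print("mean in", mean_bounds, " variance in", (var_min, var_max))
    ```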

  17. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    Science.gov (United States)

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ≈ WH. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Rényi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.
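
    For the special case of the generalized KL (Poisson) divergence, the multiplicative updates take the familiar Lee-Seung form, which the paper's Rényi-divergence family generalizes; a compact sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    def nmf_kl(V, k, n_iter=200, eps=1e-9):
        """Multiplicative updates minimising the generalized KL (Poisson)
        divergence D(V || WH)."""
        n, m = V.shape
        W, H = rng.random((n, k)) + eps, rng.random((k, m)) + eps
        for _ in range(n_iter):
            H *= (W.T @ (V / (W @ H + eps))) / W.sum(axis=0)[:, None]
            W *= ((V / (W @ H + eps)) @ H.T) / H.sum(axis=1)[None, :]
        return W, H

    V = rng.poisson(5.0, size=(40, 30)).astype(float)  # count data, e.g. term-doc
    W, H = nmf_kl(V, k=4)
    print("mean absolute residual:", np.abs(V - W @ H).mean())
    ```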

  18. A probabilistic approach for RIA fuel failure criteria

    International Nuclear Information System (INIS)

    Carlo Vitanza, Dr.

    2008-01-01

    Substantial experimental data have been produced in support of the definition of the RIA safety limits for water reactor fuels at high burnup. Based on these data, fuel failure enthalpy limits can be derived using methods of varying complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non-failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper, where in addition to a best-estimate evaluation of the failure enthalpy, a RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best-estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non-failure data and accounting for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will be sharply increasing within a narrow band around the best-estimate value. For deterministic predictions of lower quality, the resulting probability distribution will instead be broad and coarse.

  19. An approach for obtaining integrable Hamiltonians from Poisson-commuting polynomial families

    Science.gov (United States)

    Leyvraz, F.

    2017-07-01

    We discuss a general approach permitting the identification of a broad class of sets of Poisson-commuting Hamiltonians, which are integrable in the sense of Liouville. It is shown that all such Hamiltonians can be solved explicitly by a separation of variables ansatz. The method leads in particular to a proof that the so-called "goldfish" Hamiltonian is maximally superintegrable and leads to an elementary identification of a full set of integrals of motion. The Hamiltonians in involution with the "goldfish" Hamiltonian are also explicitly integrated. New integrable Hamiltonians are identified, among which some have the property of being isochronous, that is, all their orbits have the same period. Finally, a peculiar structure is identified in the Poisson brackets between the elementary symmetric functions and the set of Hamiltonians commuting with the "goldfish" Hamiltonian: these can be expressed as products between elementary symmetric functions and Hamiltonians. The structure displays an invariance property with respect to one element and has both a symmetry and a closure property. The meaning of this structure is not altogether clear to the author, but it turns out to be a powerful tool.

  1. A dynamic probabilistic safety margin characterization approach in support of Integrated Deterministic and Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    Di Maio, Francesco; Rai, Ajit; Zio, Enrico

    2016-01-01

    The challenge of Risk-Informed Safety Margin Characterization (RISMC) is to develop a methodology for estimating system safety margins in the presence of stochastic and epistemic uncertainties affecting the system dynamic behavior. This is useful to support decision-making for licensing purposes. In the present work, safety margin uncertainties are handled by Order Statistics (OS) (with both Bracketing and Coverage approaches) to jointly estimate percentiles of the distributions of the safety parameter and of the time required for it to reach these percentile values during its dynamic evolution. The novelty of the proposed approach consists in the integration of dynamic aspects (i.e., timing of events) into the definition of a dynamic safety margin for a probabilistic Quantification of Margin and Uncertainties (QMU). The system considered here for demonstration purposes is the Lead-Bismuth Eutectic eXperimental Accelerator Driven System (LBE-XADS). - Highlights: • We integrate dynamic aspects into the definition of a safety margin. • We consider stochastic and epistemic uncertainties affecting the system dynamics. • Uncertainties are handled by Order Statistics (OS). • We estimate the system grace time during accidental scenarios. • We apply the approach to an LBE-XADS accidental scenario.
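
    The Order Statistics machinery referred to here is typically Wilks-style: with enough independent code runs, an empirical order statistic bounds a chosen percentile at a chosen confidence. A generic illustration of the one-sided 95/95 case (the thermal numbers are invented, not the LBE-XADS model):

    ```python
    import numpy as np

    # One-sided Wilks bound: taking the maximum of n runs covers the 95th
    # percentile with confidence 1 - 0.95**n; n = 59 gives ~95%.
    n = 59
    print("confidence:", 1 - 0.95**n)          # ~0.952

    rng = np.random.default_rng(11)
    # Stand-in "safety parameter" from n code runs with uncertain inputs:
    peak_temp = rng.normal(600.0, 25.0, n) + rng.gamma(2.0, 10.0, n)
    print("95/95 one-sided bound on peak temperature:", peak_temp.max())
    ```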

  2. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
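
    The computation amounts to integrating a bivariate Gaussian density over a disk that is generally off-center from the error ellipse; a Monte Carlo stand-in for that integral (geometry and covariance invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(12)

    mu = np.array([0.0, 0.0])                    # reported stroke location (km)
    cov = np.array([[0.16, 0.024],
                    [0.024, 0.09]])              # error-ellipse covariance (km^2)
    facility = np.array([0.5, 0.2])              # facility offset from mu (km)
    R = 0.5                                      # radius of concern (km)

    pts = rng.multivariate_normal(mu, cov, size=1_000_000)
    p = np.mean(np.linalg.norm(pts - facility, axis=1) <= R)
    print(f"P(stroke within {R} km of the facility) = {p:.3f}")
    ```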

  3. A hybrid approach for probabilistic forecasting of electricity price

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Wang, Yelei

    2014-01-01

    The electricity market plays a key role in realizing the economic prophecy of smart grids. Accurate and reliable electricity market price forecasting is essential to facilitate various decision making activities of market participants in the future smart grid environment. However, due to the nons... electricity price forecasting is proposed in this paper. ... probabilistic interval forecasts can be of great importance to quantify the uncertainties of potential forecasts, thus effectively supporting the decision making activities against uncertainties and risks ahead. This paper proposes a hybrid approach to construct prediction intervals of MCPs with a two... The effectiveness of the proposed hybrid method has been validated through comprehensive tests using real price data from the Australian electricity market.

  4. Transmission capacity assessment by probabilistic planning. An approach

    International Nuclear Information System (INIS)

    Lammintausta, M.

    2002-01-01

    The Finnish electricity markets participate in the Scandinavian markets, Nord Pool. The Finnish market is free for marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid, which has full responsibility for the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach for transfer limit and risk assessment. The method is based on economic benefit and risk predictions. It also uses the existing results of deterministic data, and it could be used side by side with the deterministic method. The basic concept and necessary equations for the expected risks of various market players have been derived for further development. The outage costs, and thereby the risks of the market participants, depend on how the system operator reacts to faults. In the Finnish power system, consumers will usually experience no costs due to faults because of the meshed network and the counter-trade method preferred by the system operator. The costs to producers and dealers are also low because of the counter-trade method. The network company will bear the cost of repairs, additional losses and the cost of regulation power because of counter trades. In case power flows are rearranged drastically because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is that the network operator buys regulation power for short-term problems and reinforces the network in long-term situations. The reinforcement is done if the network cannot be…

  5. Probabilistic evaluation approach for nonlinear vehicle-bridge dynamic performances

    Science.gov (United States)

    Jin, Zhibin; Pei, Shiling; Li, Xiaozhen; Qiang, Shizhong

    2015-03-01

    Railroad vehicle and bridge coupled lateral vibration problems are traditionally solved through detailed nonlinear models in time domain using limited samples to represent rail irregularity. Ideally, a random vibration and reliability based approach should be implemented because of the random nature of the excitation process. In this study, vehicle-bridge coupled dynamic equation was derived using the principle of virtual work utilizing a linearized wheel-rail contact equation. This simplification enables the calculation of the system random lateral responses through the pseudo-excitation method. By applying rail irregularity as random excitations to the system, this study utilized an explicit linearization method to avoid iterative solution at each time step of the integration. The results from the linearized method were validated through comparison with results obtained from Monte-Carlo simulations. By applying the linearized approach to probabilistic assessment of the vehicle-bridge system reliability, it was shown that system probability of exceedance of admissible limits increases with train speed and reduces with increased bridge self-weight. It is concluded that the proposed approach provides a viable efficient alternative to investigate the random dynamic characteristics of vehicle-bridge system especially in the lateral direction, which is dominated by the random rail irregularities.
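
    The pseudo-excitation method mentioned here replaces a stationary random input of PSD S_xx(w) with the deterministic harmonic excitation sqrt(S_xx(w))·e^{iwt}, so the response PSD is just the squared magnitude of the harmonic response; a single-DOF sketch with an invented irregularity spectrum (not the vehicle-bridge model):

    ```python
    import numpy as np

    wn, zeta = 2 * np.pi * 1.5, 0.05        # natural frequency (rad/s), damping
    w = np.linspace(0.1, 30.0, 2000)
    S_xx = 1.0 / (1.0 + (w / 10.0)**4)      # stand-in irregularity PSD

    H = 1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)   # SDOF frequency response
    S_yy = np.abs(H * np.sqrt(S_xx))**2     # response PSD from pseudo-excitation

    var_y = np.sum(S_yy) * (w[1] - w[0])    # integrate the response PSD
    print("RMS response:", np.sqrt(var_y))
    ```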

  6. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    The extraction of the breast boundary is crucial for performing further analysis of a mammogram. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to automatically find the optimal threshold value using histogram information. On the other hand, active contour models require defining a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method reaches 38% and 50% improvement with respect to the results obtained by the active contour model and threshold-based methods, respectively, and it increases the stability of the boundary extraction process up to 86%.

  7. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary-structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.

  8. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and geographical information systems. The proposed framework comprises several modules: A module on the probabilistic description of potential future earthquake shaking intensity, a module on the probabilistic assessment of spatial variability of soil liquefaction, a module on damage assessment of buildings...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...

  9. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    Science.gov (United States)

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  10. A probabilistic approach to quantifying spatial patterns of flow regimes and network-scale connectivity

    Science.gov (United States)

    Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca

    2017-04-01

    The temporal variability of the river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams, with notable implications for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (e.g., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially explicit, quantitative assessment of hydrologic connectivity at the network scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson process for the stochastic generation of rainfall is used to evaluate the impact of the climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of
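
    As a toy version of the ingredients described here (Poisson rainfall forcing, a water-balance/recession model, and the non-exceedance probability of a flow threshold), the following sketch generates a synthetic daily streamflow series for one reach and estimates its fragmentation probability. All parameter values are invented, and the single linear reservoir is a stand-in for the catchment-scale model used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Poisson arrivals of effective rainfall (frequency lam, exponential depths
# with mean alpha) feed a linear reservoir with recession constant k.
lam, alpha, k = 0.2, 10.0, 0.05        # 1/day, mm, 1/day (illustrative)
days = 20 * 365
s = alpha * lam / k                    # start storage near its long-term mean
q = np.empty(days)
for t in range(days):
    rain = rng.exponential(alpha) if rng.random() < lam else 0.0
    s += rain - k * s                  # daily water balance, linear recession
    q[t] = k * s                       # streamflow proportional to storage

q_min = 0.5                            # ecologically meaningful threshold (mm/day)
print(f"P(Q < Q_min) = {np.mean(q < q_min):.3f}")  # reach fragmentation measure
```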

  11. Probabilistic model-based approach for heart beat detection.

    Science.gov (United States)

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity.

  12. A Finite Element Procedure with Poisson Iteration Method Adopting Pattern Approach Technique for Near-Incompressible Rubber Problems

    Directory of Open Access Journals (Sweden)

    Young-Doo Kwon

    2014-08-01

    Full Text Available A finite element procedure is presented for the analysis of rubber-like hyperelastic materials. The volumetric incompressibility condition of rubber deformation is included in the formulation using the penalty method, while the principle of virtual work is used to derive a nonlinear finite element equation for the large-displacement problem, presented in a total-Lagrangian description. The behavior of rubber deformation is represented by hyperelastic constitutive relations based on a generalized Mooney-Rivlin model. The proposed finite element procedure using analytic differentiation produced results that agree very well with those from the well-known commercial packages NISA II and ABAQUS. However, the convergence of the equilibrium iteration is quite slow, and frequently fails, in the case of near-incompressible rubber. To prevent this even when Poisson's ratio is very close to 0.5, a Poisson's ratio of 0.49000 is used first to obtain an approximate solution without difficulty; the applied load is then maintained and Poisson's ratio is increased to 0.49999 following a proposed pattern, adopting a relaxation technique that monitors the convergence rate. With this approach, the number of substeps required for a given Poisson's ratio near 0.5 could be reduced considerably.
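
    The pattern/relaxation technique can be read as a continuation loop in Poisson's ratio. The sketch below is a hypothetical driver, not the authors' code: `solve` stands for one Newton run of a penalty-method finite element solver and is assumed to return the solution, a convergence flag, and the iteration count.

```python
# Continuation in Poisson's ratio with adaptive step control (hypothetical
# driver; `solve(nu, u_start)` is an assumed interface, not a real API).
def continuation_in_nu(solve, u0, nu_start=0.49000, nu_target=0.49999):
    u, ok, _ = solve(nu_start, u0)            # easy near-incompressible case
    assert ok, "even the relaxed case failed to converge"
    nu, step = nu_start, 0.005
    while nu < nu_target:
        nu_try = min(nu + step, nu_target)
        u_try, ok, iters = solve(nu_try, u)   # restart from the last solution
        if ok:
            u, nu = u_try, nu_try
            if iters < 5:                     # converging fast: be bolder
                step *= 2.0
        else:
            step *= 0.5                       # relax: retry with a smaller step
    return u
```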

  13. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  14. Estimated Quality of Multistage Process on the Basis of Probabilistic Approach with Continuous Response Functions

    Directory of Open Access Journals (Sweden)

    Yuri B. Tebekin

    2011-11-01

    Full Text Available The article is devoted to the problem of quality management for multiphase processes on the basis of a probabilistic approach. A method with continuous response functions, based on the method of Lagrange multipliers, is proposed.

  15. Metal induced inhalation exposure in urban population: A probabilistic approach

    Science.gov (United States)

    Widziewicz, Kamila; Loska, Krzysztof

    2016-03-01

    The paper was aimed at assessing the health risk in the populations of three Silesian cities, Bielsko-Biała, Częstochowa and Katowice, exposed to the inhalation intake of cadmium, nickel and arsenic present in airborne particulate matter. A probabilistic risk assessment framework was used to establish how the exposure parameters affect the risk. The risk model was based on the results of annual measurements of As, Cd and Ni concentrations in PM2.5 and on the data sets on the concentrations of those elements in PM10 collected by the Voivodship Inspectorate of Environmental Protection over the 2012-2013 period. The risk was calculated as an incremental lifetime risk of cancer (ILCR) in particular age groups (infants, children, adults) following a Monte Carlo approach. To depict the effect the variability of exposure parameters exerts on the risk, the initial parameters of the risk model (metal concentrations, their infiltration into the indoor environment, exposure duration, exposure frequency, lung deposition efficiency, daily lung ventilation and body weight) were modeled as random variables. The distribution of inhalation cancer risk due to exposure to ambient metal concentrations was LN(1.80 × 10⁻⁶ ± 2.89 × 10⁻⁶) and LN(6.17 × 10⁻⁷ ± 1.08 × 10⁻⁶) for PM2.5- and PM10-bound metals, respectively, and did not exceed the permissible limit of acceptable risk. The highest probability of contracting cancer was observed for Katowice residents exposed to PM2.5: LN(2.01 × 10⁻⁶ ± 3.24 × 10⁻⁶). Across the tested age groups, adults were at approximately one order of magnitude higher risk than infants. Sensitivity analysis showed that exposure duration (ED) and body weight (BW) were the two variables that contributed most to the ILCR.
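
    A Monte Carlo ILCR computation of this kind has the general shape sketched below, using a standard lifetime-average-daily-dose formulation. All distributions and the slope factor are illustrative stand-ins, not the values fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative input distributions (assumed, not the study's fitted values):
C    = rng.lognormal(np.log(2.0), 0.5, N)   # metal in PM2.5, ng/m3
InhR = rng.normal(15.0, 2.0, N)             # daily lung ventilation, m3/day
DEP  = rng.uniform(0.3, 0.6, N)             # lung deposition fraction
EF   = rng.uniform(330, 365, N)             # exposure frequency, days/year
ED   = rng.uniform(20, 40, N)               # exposure duration, years
BW   = rng.normal(70.0, 12.0, N)            # body weight, kg
AT   = 70 * 365                             # averaging time, days
SF   = 15.0                                 # assumed slope factor, (mg/kg/day)^-1

dose = C * 1e-6 * InhR * DEP * EF * ED / (BW * AT)   # LADD, mg/kg/day
ilcr = dose * SF
print(f"median ILCR: {np.median(ilcr):.2e}, "
      f"95th percentile: {np.percentile(ilcr, 95):.2e}")
```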

  16. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    to criteria a) and b) the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or significant parts of it is obtained. Therefore the structure can be considered......A probabilistic-based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety...... of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code, the timber structure has to be evaluated with respect to the following criteria, where at least one shall

  17. A probabilistic approach for determining the control mode in CREAM

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun; Hollnagel, Erik

    2006-01-01

    The control mode is the core concept for the prediction of human performance in CREAM. In this paper, we propose a probabilistic method for determining the control mode as a substitute for the existing deterministic method. The new method is based on a probabilistic model, a Bayesian network. This paper describes the mathematical procedure for developing the Bayesian network for determining the control mode. The Bayesian network developed in this paper is an extension of the existing deterministic method. Using the Bayesian network, we expect to obtain the best estimate of the control mode given the available data and information about the context. The mathematical background and the procedure for developing equivalent Bayesian networks for given discrete functions, as provided in this paper, can be applied to other discrete functions to develop probabilistic models
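
    For a discrete Bayesian network of this kind, the posterior over the control mode is obtained by summing the conditional probability table over the unobserved context variables. The toy model below uses two binary context conditions and CREAM's four control modes, with invented probabilities, purely to illustrate the inference step; it is not the network developed in the paper.

```python
import numpy as np

# Toy discrete Bayesian network: two binary context conditions C1, C2
# (1 = adequate) and a control-mode node M. All probabilities are invented.
modes = ("scrambled", "opportunistic", "tactical", "strategic")
P_C2 = {1: 0.6, 0: 0.4}                      # prior on the unobserved condition
P_M = {                                      # P(M | C1, C2)
    (1, 1): [0.02, 0.08, 0.30, 0.60],
    (1, 0): [0.10, 0.20, 0.45, 0.25],
    (0, 1): [0.10, 0.25, 0.40, 0.25],
    (0, 0): [0.40, 0.35, 0.20, 0.05],
}

# Posterior over the control mode when C1 = 1 is observed and C2 is unknown:
post = sum(P_C2[c2] * np.array(P_M[(1, c2)]) for c2 in (0, 1))
print(dict(zip(modes, post.round(3))))
```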

  18. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...

  19. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional...... variance, implying an interpretation as an integer valued GARCH process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and a nonlinear function of past observations. As a particular example an exponential autoregressive Poisson model for time...... series is considered. Under geometric ergodicity the maximum likelihood estimators of the parameters are shown to be asymptotically Gaussian in the linear model. In addition we provide a consistent estimator of the asymptotic covariance, which is used in the simulations and the analysis of some...
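
    The linear model described here is the INGARCH-type recursion lambda_t = d + a*lambda_{t-1} + b*X_{t-1} with X_t | past ~ Poisson(lambda_t). A minimal simulation sketch with illustrative parameters (not values from the paper), satisfying a + b < 1 for stationarity:

```python
import numpy as np

rng = np.random.default_rng(0)

d, a, b = 0.5, 0.4, 0.3             # illustrative parameters, a + b < 1
T = 1000
lam = np.empty(T)
X = np.empty(T, dtype=int)
lam[0] = d / (1.0 - a - b)          # start at the stationary mean
X[0] = rng.poisson(lam[0])
for t in range(1, T):
    lam[t] = d + a * lam[t - 1] + b * X[t - 1]   # conditional mean recursion
    X[t] = rng.poisson(lam[t])

print("sample mean:", X.mean(), "theoretical mean:", d / (1 - a - b))
```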
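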

  20. Implementation of upper limit calculation for a poisson variable by bayesian approach

    International Nuclear Information System (INIS)

    Zhu Yongsheng

    2008-01-01

    The calculation of the Bayesian confidence upper limit for a Poisson variable including both signal and background, with and without systematic uncertainties, has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and in the signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, log-Gaussian or flat probability density function (pdf). Some technical details of BPULE have been discussed. (authors)
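
    BPULE itself is Fortran 77 and not reproduced here, but the core calculation without systematics, the Bayesian credible upper limit for a Poisson signal with known background under a flat prior, can be sketched in a few lines (numerical normalization plus root finding; systematic uncertainties would add an extra integration over the nuisance pdf):

```python
import numpy as np
from scipy import stats, optimize, integrate

def bayes_upper_limit(n_obs, b, cl=0.90, s_max=50.0):
    """Bayesian credible upper limit on a Poisson signal s with known
    background b, assuming a flat prior on s >= 0 (no systematics)."""
    like = lambda s: stats.poisson.pmf(n_obs, s + b)
    norm = integrate.quad(like, 0.0, s_max)[0]
    # Find s_up such that the posterior probability below it equals cl:
    f = lambda s_up: integrate.quad(like, 0.0, s_up)[0] / norm - cl
    return optimize.brentq(f, 0.0, s_max)

print(bayes_upper_limit(n_obs=3, b=1.0))   # 90% upper limit, 3 observed, b = 1
```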

  1. Probabilistic rainfall thresholds for landslide occurrence using a Bayesian approach

    Science.gov (United States)

    Berti, M.; Martina, M.; Franceschini, S.; Pignone, S.; Simoni, A.; Pizziolo, M.

    2012-04-01

    Landslide rainfall thresholds are commonly defined as the critical value of two combined variables (e.g. rainfall duration and rainfall intensity) responsible for the occurrence of landslides in a given area. Various methods have been proposed in the literature to predict the rainfall conditions that are likely to trigger landslides, using for instance physically based models or statistical analysis of historical catalogues. Most of these methods share an implicitly deterministic view: the occurrence of landslides can be predicted by comparing the input value (rainfall conditions) with the threshold, and only a single output (landslide or no landslide) is possible for a given input. In practical applications, however, a deterministic approach is not always applicable. Failure conditions are often achieved through a unique combination of many relevant factors (hydrologic response, weathering, changes in field stress, anthropic activity) and landslide triggering cannot be predicted by rainfall alone. When different outputs (landslide or no landslide) can be obtained for the same input (rainfall conditions), a deterministic approach is no longer applicable and a probabilistic model is preferable. In this study we propose a new method to evaluate rainfall thresholds based on Bayes probability. The method is simple, statistically rigorous, and provides a way to define thresholds in complex cases, when conventional approaches become highly subjective. Bayes' theorem is a direct application of conditional probabilities, and it allows one to compute the conditional probability of a landslide (A) when a rainfall event of a given magnitude (B) is expected. The fundamental aspect of the Bayes approach is that the landslide probability P(A|B) depends not only on the observed probability of the triggering rainfall P(B|A), but also on the marginal probability of the expected rainfall event P(B). Therefore, both the rainfall that resulted in landslides and the rainfall that did not
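
    The Bayes computation at the heart of such a method is a one-liner once the catalogue counts are available. A minimal sketch with invented counts (not the authors' catalogue):

```python
# Counts from a hypothetical historical catalogue (illustrative numbers):
n_events = 2000          # rainfall events recorded
n_landslides = 60        # events that triggered landslides
n_B = 150                # events exceeding the given magnitude
n_B_and_A = 45           # triggering events that exceeded that magnitude

P_A = n_landslides / n_events           # marginal landslide probability
P_B = n_B / n_events                    # marginal rainfall probability
P_B_given_A = n_B_and_A / n_landslides  # rainfall magnitude given a landslide

P_A_given_B = P_B_given_A * P_A / P_B   # Bayes' theorem
print(f"P(landslide | rainfall of this magnitude) = {P_A_given_B:.2f}")
```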

  2. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then analyzed quantitatively and probabilistically in terms of their failure mechanisms, including common-cause/common-mode effects and time-dependent failure and/or repair rate effects, including synergistic and propagational mechanisms. The modeling procedure results in a set of first-order, linear, inhomogeneous differential equations in the state vector, describing the time-dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems
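
    A state-variable formulation of this kind reduces to integrating a linear ODE system dP/dt = A P for the state probabilities. A minimal two-state (working/failed) example with illustrative failure and repair rates, not the paper's model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-state component: state 0 = working, state 1 = failed, with failure
# rate lam and repair rate mu (illustrative values, per hour).
lam, mu = 1e-3, 1e-2
A = np.array([[-lam,   mu],
              [ lam,  -mu]])

sol = solve_ivp(lambda t, P: A @ P, (0.0, 2000.0), [1.0, 0.0],
                t_eval=[0.0, 500.0, 1000.0, 2000.0])
print("P(failed):", sol.y[1])   # tends to lam / (lam + mu) ~ 0.091
```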

  3. Intermediate probabilistic safety assessment approach for safety critical digital systems

    International Nuclear Information System (INIS)

    Taeyong, Sung; Hyun Gook, Kang

    2001-01-01

    Even though conventional probabilistic safety assessment methods are immature for application to microprocessor-based digital systems, practical needs force their application. In Korea, the UCN 5 and 6 units are being constructed and the Korean Next Generation Reactor is being designed with digital instrumentation and control equipment for the safety-related functions. The Korean regulatory body requires probabilistic safety assessment. This paper analyzes the difficulties in assessing digital systems and suggests an intermediate framework for evaluating their safety using fault tree models. The framework deals with several important characteristics of digital systems, including software modules and fault-tolerant features. We expect that the analysis results will provide valuable design feedback. (authors)

  4. A probabilistic approach of sum rules for heat polynomials

    International Nuclear Information System (INIS)

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)

  5. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  6. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    2009-01-01

    In this article we consider geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregression. In the linear case, the conditional mean is linked linearly to its past values, as well as to the observed values of the Poisson process. This also applies...... to the conditional variance, making possible interpretation as an integer-valued generalized autoregressive conditional heteroscedasticity process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and past observations. As a particular example, we consider...... ergodicity proceeds via Markov theory and irreducibility. Finding transparent conditions for proving ergodicity turns out to be a delicate problem in the original model formulation. This problem is circumvented by allowing a perturbation of the model. We show that as the perturbations can be chosen...

  7. A random matrix approach to the crossover of energy-level statistics from Wigner to Poisson

    International Nuclear Information System (INIS)

    Datta, Nilanjana; Kunz, Herve

    2004-01-01

    We analyze a class of parametrized random matrix models, introduced by Rosenzweig and Porter, which is expected to describe the energy level statistics of quantum systems whose classical dynamics varies from regular to chaotic as a function of a parameter. We compute the generating function for the correlations of energy levels, in the limit of infinite matrix size. The crossover between Poisson and Wigner statistics is measured by a renormalized coupling constant. The model is exactly solved in the sense that, in the limit of infinite matrix size, the energy-level correlation functions and their generating function are given in terms of a finite set of integrals

  8. Analysis on Poisson and Gamma spaces

    OpenAIRE

    Kondratiev, Yuri; Silva, Jose Luis; Streit, Ludwig; Us, Georgi

    1999-01-01

    We study the spaces of Poisson, compound Poisson and Gamma noises as special cases of a general approach to non-Gaussian white noise calculus, see [KSS96]. We use a known unitary isomorphism between Poisson and compound Poisson spaces in order to transport analytic structures from Poisson space to compound Poisson space. Finally we study a Fock type structure of chaos decomposition on Gamma space.

  9. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbæk, Anders; Tjøstheim, Dag

    This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional...... proceeds via Markov theory and irreducibility. Finding transparent conditions for proving ergodicity turns out to be a delicate problem in the original model formulation. This problem is circumvented by allowing a perturbation of the model. We show that as the perturbations can be chosen to be arbitrarily...

  10. Optimization-Based Approaches to Control of Probabilistic Boolean Networks

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2017-02-01

    Full Text Available Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, the control theory of Boolean networks (BNs), a well-known model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods for optimal control of probabilistic Boolean networks (PBNs) are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained, in which the finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
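
    For readers unfamiliar with the model class, a PBN updates each node with one of several candidate Boolean functions, selected at random at every step. The toy two-gene network below (all functions and selection probabilities invented) simulates this dynamics and estimates the empirical state occupancy; the optimal control formulations reviewed in the paper act on exactly this kind of Markov chain.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy PBN with two genes; each gene has two candidate update functions.
f1 = [lambda x: x[1], lambda x: x[0] and x[1]]       # candidates for gene 1
f2 = [lambda x: not x[0], lambda x: x[0] or x[1]]    # candidates for gene 2
p1, p2 = [0.7, 0.3], [0.6, 0.4]                      # selection probabilities

x = (1, 0)
visits = {}
for _ in range(10_000):
    g1 = f1[rng.choice(2, p=p1)]                     # pick functions at random
    g2 = f2[rng.choice(2, p=p2)]
    x = (int(g1(x)), int(g2(x)))                     # synchronous update
    visits[x] = visits.get(x, 0) + 1

print({s: v / 10_000 for s, v in visits.items()})    # empirical state occupancy
```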

  11. xLPR - a probabilistic approach to piping integrity analysis

    International Nuclear Information System (INIS)

    Harrington, C.; Rudland, D.; Fyfitch, S.

    2015-01-01

    The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015

  12. Impact of external events on site evaluation: a probabilistic approach

    International Nuclear Information System (INIS)

    Jaccarino, E.; Giuliani, P.; Zaffiro, C.

    1975-01-01

    A probabilistic method is proposed for defining the reference external events for nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event, it is possible to perform statistical analyses to determine the probability of occurrence on site of events with given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of the areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed, and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, a statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant to be installed on the site. Thus the reference event could be chosen such that the risk to safety and environmental protection is the same over the whole of the national territory. (author)

  13. Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges to ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  14. All-possible-couplings approach to measuring probabilistic context.

    Directory of Open Access Journals (Sweden)

    Ehtibar N Dzhafarov

    Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

  15. A simplified probabilistic approach to manage and prioritise dams

    International Nuclear Information System (INIS)

    Nortje, J. H.

    1997-01-01

    A probabilistic method which enables engineers to compare dams that vary in size, type and location, and establish priorities with regard to the urgency of rectifying observed deficiencies, was described. The method uses a simple algorithm that determines, using only three factors, the total expected losses that are likely to occur during the life of the dam: (1) the probability of failure of the dam, (2) the consequences of theoretical failure, and (3) a reduction factor depending on the standard of operation, maintenance and emergency preparedness. From these elements one or more indices can be calculated which express the hazard potential of the dam in question either in terms of human lives that would be lost, or in terms of economic losses that would occur during a dam break event. 7 refs., 5 tabs., 2 appendices
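
    The three-factor algorithm amounts to simple arithmetic once the factors are estimated. The sketch below uses invented values purely to show the index construction; it is not the reference's calibrated procedure.

```python
# Expected-loss indices = P(failure) x consequences x reduction factor
# (all numbers invented for illustration).
p_failure = 1.0e-4        # annual probability of dam failure
lives_at_risk = 120       # potential life loss in a dam-break event
economic_loss = 5.0e7     # potential economic loss (currency units)
reduction = 0.6           # credit for surveillance, maintenance, emergency plans

life_index = p_failure * lives_at_risk * reduction   # expected lives lost per year
econ_index = p_failure * economic_loss * reduction   # expected loss per year
print(f"life index: {life_index:.2e} /yr, economic index: {econ_index:.2e} /yr")
```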

  16. Unit commitment with probabilistic reserve: An IPSO approach

    International Nuclear Information System (INIS)

    Lee, Tsung-Ying; Chen, Chun-Lung

    2007-01-01

    This paper presents a new algorithm for the solution of the nonlinear optimal scheduling problem, named iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve solution quality and computational efficiency. IPSO is applied to solve the unit commitment with probabilistic reserve problem of a power system. The outage cost as well as the fuel cost of thermal units were considered in the unit commitment program to evaluate the level of spinning reserve. The optimal scheduling of on-line generation units was reached while minimizing the sum of fuel cost and outage cost. A 48-unit power system was used as a numerical example to test the new algorithm. In the test results, the optimal scheduling of on-line generation units could be reached while satisfying the requirement of the objective function

  17. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes a spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objective of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to the journey of exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, and ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5, Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects, to formulate the probabilistic risk assessment (PRA). As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform the PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success in an efficient manner, at low cost and on a tight schedule, traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform the PRA for the large LRO system, with more than 14,000 piece parts and over 120 purchased or contractor

  18. Decision making in probabilistic studies - comparison of frequentist and bayesian approaches

    International Nuclear Information System (INIS)

    Aupied, J.

    1992-06-01

    This technical note is a critical comparison of the classical approach and the Bayesian approach, as applied to probabilistic studies of operational reliability. Hypothesis tests and confidence intervals have been analyzed, with particular emphasis on problems of application. Bayesian techniques applied to the estimation of reliability parameters bring new solutions to the processing of experience feedback by taking expert judgments into consideration. A number of case studies illustrate the two approaches

  19. A probabilistic approach to uncertainty quantification with limited information

    International Nuclear Information System (INIS)

    Red-Horse, J.R.; Benjamin, A.S.

    2004-01-01

    Many safety assessments depend upon models that rely on probabilistic characterizations about which there is incomplete knowledge. For example, a system model may depend upon the time to failure of a piece of equipment for which no failures have actually been observed. The analysts in this case are faced with the task of developing a failure model for the equipment in question, having very limited knowledge about either the correct form of the failure distribution or the statistical parameters that characterize the distribution. They may assume that the process conforms to a Weibull or log-normal distribution or that it can be characterized by a particular mean or variance, but those assumptions impart more knowledge to the analysis than is actually available. To address this challenge, we propose a method where random variables comprising equivalence classes constrained by the available information are approximated using polynomial chaos expansions (PCEs). The PCE approximations are based on rigorous mathematical concepts developed from functional analysis and measure theory. The method has been codified in a computational tool, AVOCET, and has been applied successfully to example problems. Results indicate that it should be applicable to a broad range of engineering problems that are characterized by both irreducible and reducible uncertainty
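
    To make the PCE idea concrete, the sketch below approximates a lognormal random variable by a probabilists' Hermite chaos expansion, for which the coefficients are known in closed form (c_k = exp(1/2)/k! for Y = exp(Z), Z standard normal). This is a generic textbook example, not AVOCET's algorithm.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from math import factorial

# Approximate Y = exp(Z), Z ~ N(0,1), by Y ~ sum_k c_k He_k(Z),
# with closed-form coefficients c_k = E[Y He_k(Z)] / k! = exp(1/2) / k!.
order = 6
c = np.array([np.exp(0.5) / factorial(k) for k in range(order + 1)])

z = np.random.default_rng(0).normal(size=100_000)
y_pce = hermeval(z, c)            # evaluate the Hermite (He_k) series
y = np.exp(z)
print("exact mean:", y.mean(), "PCE mean:", y_pce.mean())  # both ~ exp(0.5)
```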

  20. An Approach for Long-lead Probabilistic Forecast of Droughts

    Science.gov (United States)

    Madadgar, S.; Moradkhani, H.

    2013-12-01

    A spatio-temporal analysis of historical droughts across the Gunnison River Basin in Colorado, USA, is performed and the probability distribution of future droughts is obtained. The Standardized Runoff Index (SRI) is employed to analyze the drought status across the spatial extent of the basin. To apply the SRI in drought forecasting, the Precipitation Runoff Modeling System (PRMS) is used to estimate the runoff generated in the spatial units of the basin. A recently developed multivariate forecast technique is then used to model the joint behavior of the correlated variables of accumulated runoff over the forecast and predicting periods. The probability of future droughts in the forecast season, given the observed drought in the last season, is evaluated from the conditional probabilities derived from the forecast model. Using these conditional probabilities, the runoff variation over the basin with a particular chance of occurrence is obtained as well. The forecast model also provides the uncertainty bound of future runoff produced at each spatial unit across the basin. Our results indicate that the statistical method developed in this study is a useful procedure for the probabilistic forecasting of droughts given the spatio-temporal characteristics of droughts in the past.

  1. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop an advanced mean field method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP, where the knowledge of the distribution of couplings between the random variables

  2. A probabilistic approach for mapping free-text queries to complex web forms

    NARCIS (Netherlands)

    Tjin-Kam-Jet, Kien; Trieschnigg, Rudolf Berend; Hiemstra, Djoerd

    Web applications with complex interfaces consisting of multiple input fields should understand free-text queries. We propose a probabilistic approach to map parts of a free-text query to the fields of a complex web form. Our method uses token models rather than only static dictionaries to create

  3. A probabilistic modeling approach in thermal inactivation: estimation of postprocess Bacillus cereus spore prevalence and concentration

    NARCIS (Netherlands)

    Membre, J.M.; Amezquita, A.; Bassett, J.; Giavedoni, P.; Blackburn, W.; Gorris, L.G.M.

    2006-01-01

    The survival of spore-forming bacteria is linked to the safety and stability of refrigerated processed foods of extended durability (REPFEDs). A probabilistic modeling approach was used to assess the prevalence and concentration of Bacillus cereus spores surviving heat treatment for a semiliquid

  4. An approach to handle Real Time and Probabilistic behaviors in e-commerce

    DEFF Research Database (Denmark)

    Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.

    2005-01-01

    In this work we describe an approach to deal with systems having both probabilistic and real-time behaviors. The main goal of the paper is to show the automatic translation from a real-time model based on the UPPAAL tool, which performs automatic verification of real-time systems, to the R...

  5. A probabilistic approach to emission-line galaxy classification

    Science.gov (United States)

    de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.

    2017-12-01

    We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and WH α versus [N II]/H α (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/H β, log [N II]/H α and log EW(H α) optical parameters. The best-fitting GMM, based on several statistical criteria, suggests a solution of around four Gaussian components (GCs), which are capable of explaining up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to its respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence, based on four GCs, for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not contribute significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, without losing the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
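
    The clustering step can be reproduced with an off-the-shelf GMM. The sketch below uses scikit-learn's GaussianMixture with BIC-based model selection on stand-in data; the actual study uses the three SDSS/SEAGal optical parameters and several statistical criteria, not just BIC.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Stand-in for the three optical parameters per galaxy:
# log([O III]/Hbeta), log([N II]/Halpha), log EW(Halpha).
X = rng.normal(size=(5000, 3))            # replace with real catalogue data

# Select the number of Gaussian components by BIC (the paper favours ~4):
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 8)}
bic = {k: m.bic(X) for k, m in models.items()}
best_k = min(bic, key=bic.get)

probs = models[best_k].predict_proba(X)   # soft (probabilistic) memberships
print("chosen k:", best_k)
```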

  6. Topological risk mapping of runway overruns: A probabilistic approach

    International Nuclear Information System (INIS)

    The paper presents a topological risk mapping for aircraft overruns. The proposed procedure is based on the study published in 2008 by Hall et al. (Analysis of aircraft overruns and undershoots for runway safety areas. Airport Cooperative Research Program. Washington, DC: Transportation Research Board; 2008). In that study the authors performed an analysis of aircraft overruns and undershoots for runway safety areas, proposing the ACRP hazard probability model. In the present study the model was integrated into a two-step Monte Carlo simulation procedure to assess the risk of overrun accidents and to provide a topological risk map for a specific airport area. The model was modified to utilize traffic-related and weather-related factors described by statistical distributions of historical data for the airport under analysis. The probability distribution of overrun events was then combined with the longitudinal and lateral location models of Hall et al. (Analysis of aircraft overruns and undershoots for runway safety areas. Airport Cooperative Research Program. Washington, DC: Transportation Research Board; 2008) to obtain a two-dimensional grid assessing the probability of each area being the end point of a runway overrun. The expected kinetic energy of the aircraft at a given point of the grid is used as a severity index. The procedure is suitable for generalisation and allows more detailed planning of Airport Safety Areas (ASA), improving the correct implementation of ICAO recommendations. Results are also useful for land planning and structural analyses in airport areas. - Highlights: • Two-step probabilistic procedure for the topological characterisation of overrun risk in airports. • Monte Carlo simulation applied to existing overrun probability and location models. • Proposed topological severity index: Iso-Kinetic Energy Areas (KEA). • Expected kinetic energy almost constant for about 1000 m beyond the runway end

  7. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable, observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known......-independent, reliable approach to delineating regions that can be identified only by functional imaging, here exemplified by the raphe nuclei. This approach can be used in future studies to create functional VOI maps based on neuroreceptor fingerprints retrieved through in vivo brain imaging. Publication date: 2009/6...

  8. A Probabilistic Approach to Tropical Cyclone Conditions of Readiness (TCCOR)

    National Research Council Canada - National Science Library

    Wallace, Kenneth A

    2008-01-01

    Tropical Cyclone Conditions of Readiness (TCCOR) are set at DoD installations in the Western Pacific to convey the risk associated with the onset of destructive winds from approaching tropical cyclones...

  9. Probabilistic Fuzzy Approach to Evaluation of Logistics Service Effectiveness

    OpenAIRE

    Rudnik Katarzyna; Pisz Iwona

    2014-01-01

    Logistics service providers offer a whole or partial logistics business service over a certain time period. The effectiveness of specific logistics services can vary between such companies. Logistics service providers seek the effective performance of logistics services. The purpose of this paper is to present a new approach for the evaluation of logistics service effectiveness, along with a specific computer system implementing the proposed approach – a sophisticated inference system, an ext...

  10. A probabilistic approach to quantum mechanics based on 'tomograms'

    International Nuclear Information System (INIS)

    Caponigro, M.; Mancini, S.; Man'ko, V.I.

    2006-01-01

    It is usually believed that a picture of Quantum Mechanics in terms of true probabilities cannot be given due to the uncertainty relations. Here we discuss a tomographic approach to quantum states that leads to a probability representation of quantum states. This can be regarded as a classical-like formulation of quantum mechanics which avoids the counterintuitive concepts of wave function and density operator. The relevant concepts of quantum mechanics are then reconsidered and the epistemological implications of such approach discussed. (Abstract Copyright [2006], Wiley Periodicals, Inc.)

  11. Probabilistic approach for validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2005-01-01

    This paper presents a methodological approach for validation of advanced driver assistance systems. The methodology relies on the use of randomized algorithms that are more efficient than conventional validation that uses simulations and field tests, especially with increasing complexity of the

  12. A probabilistic approach for determining railway infrastructure capacity

    NARCIS (Netherlands)

    Heidergott, B.F.; de Kort, A.; Ayhan, H.

    2003-01-01

    We consider the problem of determining the capacity of a planned railway infrastructure layout under uncertainties. In order to address the long-term nature of the problem, in which the exact (future) demand of service is unknown, we develop a "timetable"-free approach to avoid the specification of

  13. A probabilistic approach to the drag-based model

    Science.gov (United States)

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

    The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) at Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach for the computation of the ToA using the drag-based model, by introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average of the absolute differences between measurement and forecast is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation that ingests the outputs of automated CME tracking algorithms as inputs to create a database of events useful for further validation of the approach.
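
    A minimal Monte Carlo version of the probabilistic drag-based idea: sample the inputs from assumed distributions, propagate each draw through the analytic drag-based-model solution, and read the ToA spread off the resulting sample. All distributions and values below are illustrative stand-ins, not the paper's fitted inputs.

```python
import numpy as np
from scipy.optimize import brentq

AU = 1.496e8                                   # km
rng = np.random.default_rng(7)
N = 5000

# Illustrative input distributions (stand-ins for the measured PDFs):
r0  = 20.0 * 6.96e5                            # launch distance: 20 solar radii, km
v0  = rng.normal(800.0, 50.0, N)               # initial CME speed, km/s
w   = rng.normal(400.0, 50.0, N)               # ambient solar-wind speed, km/s
gam = rng.lognormal(np.log(2e-7), 0.5, N)      # drag parameter, km^-1

def r(t, v0i, wi, gi):
    """Analytic drag-based-model distance (decelerating or accelerating CME)."""
    return np.sign(v0i - wi) / gi * np.log1p(gi * abs(v0i - wi) * t) + wi * t + r0

toa = np.array([brentq(lambda t: r(t, v, s, g) - AU, 1.0, 3e6)
                for v, s, g in zip(v0, w, gam)]) / 3600.0   # hours

q16, q50, q84 = np.percentile(toa, [16, 50, 84])
print(f"ToA = {q50:.1f} h (68% interval {q16:.1f}-{q84:.1f} h)")
```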

  14. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
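
    The sampling step can be illustrated with a plain random-walk Metropolis sampler on a toy one-parameter damage problem; DRAM adds delayed rejection and proposal adaptation on top of this. The surrogate response g and all numbers are invented, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy stand-in: infer a single damage-size parameter a from noisy "strain"
# data using a cheap surrogate g(a) and random-walk Metropolis.
g = lambda a: 100.0 / (1.0 + a)          # invented surrogate strain response
a_true, sigma = 2.0, 1.0
data = g(a_true) + rng.normal(0, sigma, size=20)

def log_post(a):
    if not 0.0 < a < 10.0:               # uniform prior on (0, 10)
        return -np.inf
    return -0.5 * np.sum((data - g(a)) ** 2) / sigma**2

samples, a = [], 5.0
lp = log_post(a)
for _ in range(20_000):
    a_prop = a + rng.normal(0, 0.3)      # random-walk proposal
    lp_prop = log_post(a_prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        a, lp = a_prop, lp_prop
    samples.append(a)

post = np.array(samples[5000:])          # discard burn-in
print(f"a = {post.mean():.2f} +/- {post.std():.2f} (true {a_true})")
```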

  15. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    OpenAIRE

    Kaufman, Leyla V.; Wright, Mark G.

    2017-01-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in H...

  16. A probabilistic fracture mechanics approach for structural reliability assessment of space flight systems

    Science.gov (United States)

    Sutharshana, S.; Creager, M.; Ebbeler, D.; Moore, N.

    1992-01-01

    A probabilistic fracture mechanics approach for predicting the failure life distribution due to subcritical crack growth is presented. A state-of-the-art crack propagation method is used in a Monte Carlo simulation to generate a distribution of failure lives. The crack growth failure model expresses failure life as a function of stochastic parameters including environment, loads, material properties, geometry, and model specification errors. A stochastic crack growth rate model that considers the uncertainties due to scatter in the data and model misspecification is proposed. The rationale for choosing a particular type of probability distribution for each stochastic input parameter and for specifying the distribution parameters is presented. The approach is demonstrated through a probabilistic crack growth failure analysis of a welded tube in the Space Shuttle Main Engine. A discussion of the results from this application of the methodology is given.

  17. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    Full Text Available To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel approach for combining conflicting evidence with different weighting factors using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypothesis that two BBAs support. Finally, the weighting factors used to reassign conflicts on the BBAs are developed and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
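
    For orientation, a minimal Python sketch of discounting and Dempster's rule on the classical two-source conflict example is given below; the weighting factors are fixed by hand here, whereas the paper derives them from its probabilistic dissimilarity measure.

      def discount(bba, alpha, frame):
          # Shafer discounting: scale masses by alpha and move the rest to
          # the full frame (total ignorance).
          out = {fs: alpha * m for fs, m in bba.items()}
          out[frozenset(frame)] = out.get(frozenset(frame), 0.0) + (1.0 - alpha)
          return out

      def dempster(m1, m2):
          # Dempster's rule: conjunctive combination, renormalized by 1 - K.
          combined, conflict = {}, 0.0
          for A, mA in m1.items():
              for B, mB in m2.items():
                  C = A & B
                  if C:
                      combined[C] = combined.get(C, 0.0) + mA * mB
                  else:
                      conflict += mA * mB
          return {A: v / (1.0 - conflict) for A, v in combined.items()}, conflict

      frame = {"a", "b", "c"}
      m1 = {frozenset({"a"}): 0.9, frozenset({"c"}): 0.1}  # highly conflicting
      m2 = {frozenset({"b"}): 0.9, frozenset({"c"}): 0.1}  # pair of sources
      fused, K = dempster(discount(m1, 0.7, frame), discount(m2, 0.9, frame))
      print("conflict K =", round(K, 3))
      for A, v in sorted(fused.items(), key=lambda kv: -kv[1]):
          print(set(A), round(v, 3))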

  18. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    2003-11-01

    The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations include a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10⁻⁶ for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and more generally as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment

  19. Approaches to probabilistic model learning for mobile manipulation robots

    CERN Document Server

    Sturm, Jürgen

    2013-01-01

    Mobile manipulation robots are envisioned to provide many useful services both in domestic environments as well as in the industrial context. Examples include domestic service robots that implement large parts of the housework, and versatile industrial assistants that provide automation, transportation, inspection, and monitoring services. The challenge in these applications is that the robots have to function under changing, real-world conditions, be able to deal with considerable amounts of noise and uncertainty, and operate without the supervision of an expert. This book presents novel learning techniques that enable mobile manipulation robots, i.e., mobile platforms with one or more robotic manipulators, to autonomously adapt to new or changing situations. The approaches presented in this book cover the following topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating,...

  20. Analytic and probabilistic approaches to dynamics in negative curvature

    CERN Document Server

    Peigné, Marc; Sambusetti, Andrea

    2014-01-01

    The work of E. Hopf and G.A. Hedlund, in the 1930s, on transitivity and ergodicity of the geodesic flow for hyperbolic surfaces, marked the beginning of the investigation of the statistical properties and stochastic behavior of the flow. The first central limit theorem for the geodesic flow was proved in the 1960s by Y. Sinai for compact hyperbolic manifolds. Since then, strong relationships have been found between the fields of ergodic theory, analysis, and geometry. Different approaches and new tools have been developed to study the geodesic flow, including measure theory, thermodynamic formalism, transfer operators, Laplace operators, and Brownian motion. All these different points of view have led to a deep understanding of more general dynamical systems, in particular the so-called Anosov systems, with applications to geometric problems such as counting, equirepartition, mixing, and recurrence properties of the orbits. This book comprises two independent texts that provide a self-contained introduction t...

  1. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  2. Simulation approaches to probabilistic structural design at the component level

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    In this paper, structural failure of large nuclear components is viewed as a random process with a low probability of occurrence. Therefore, a statistical interpretation of probability does not apply and statistical inferences cannot be made due to the sparsity of actual structural failure data. In such cases, analytical estimates of the failure probabilities may be obtained from stress-strength interference theory. Since the majority of real design applications are complex, numerical methods are required to obtain solutions. Monte Carlo simulation appears to be the best general numerical approach. However, meaningful applications of simulation methods suggest research activities in three categories: methods development, failure mode models development, and statistical data models development. (Auth.)
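
    The stress-strength interference calculation reduces to estimating P(stress > strength). A minimal Monte Carlo sketch with invented lognormal distributions, cross-checked against the closed form available in this special case:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      n = 1_000_000

      stress = rng.lognormal(np.log(300.0), 0.10, n)    # load effect [MPa]
      strength = rng.lognormal(np.log(450.0), 0.05, n)  # resistance [MPa]
      pf_mc = np.mean(stress > strength)                # simulated failure rate

      # For lognormal stress S and strength R, ln R - ln S is normal, so the
      # failure probability is also available analytically as a cross-check.
      beta = (np.log(450.0) - np.log(300.0)) / np.hypot(0.10, 0.05)
      pf_exact = stats.norm.sf(beta)

      print(f"Monte Carlo P_f = {pf_mc:.2e}, closed form = {pf_exact:.2e}")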

  3. Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures

    Energy Technology Data Exchange (ETDEWEB)

    Caracciolo, Paola, E-mail: paola.caracciolo@airbus.com [AIRBUS INDUSTRIES Germany, Department of Airframe Architecture and Integration-Research and Technology-Kreetslag, 10, D-21129, Hamburg (Germany)

    2014-05-15

    In this work, a reliability-based approach to the design and analysis of composite structures is developed and its feasibility demonstrated. Robust and reliability-based designs are compared with the traditional design to demonstrate the gain that can be achieved with a probabilistic approach. The use of a stochastic description of the uncertain parameters in combination with multi-scale analysis is the main objective of this paper. The work analyzes the uncertainties in the design, tests, manufacturing process, and key gates such as material characteristics.

  4. A Kullback-Leibler approach for 3D reconstruction of spectral CT data corrupted by Poisson noise

    Science.gov (United States)

    Hohweiller, Tom; Ducros, Nicolas; Peyrin, Françoise; Sixou, Bruno

    2017-09-01

    While standard computed tomography (CT) data do not depend on energy, spectral computed tomography (SPCT) acquires energy-resolved data, which allows material decomposition of the object of interest. Decompositions in the projection domain allow creating a projection mass density (PMD) per material. From the decomposed projections, a tomographic reconstruction creates a 3D material density volume. The decomposition is made possible by minimizing a cost function. The variational approach is preferred since this is an ill-posed non-linear inverse problem. Moreover, noise plays a critical role when decomposing data. That is why, in this paper, a new data fidelity term is used to take the photonic noise into account. In this work two data fidelity terms were investigated: a weighted least squares (WLS) term, adapted to Gaussian noise, and the Kullback-Leibler distance (KL), adapted to Poisson noise. A regularized Gauss-Newton algorithm minimizes the cost function iteratively. Both methods decompose materials from a numerical phantom of a mouse. Soft tissues and bones are decomposed in the projection domain; then a tomographic reconstruction creates a 3D material density volume for each material. Comparing relative errors, KL is shown to outperform WLS for low photon counts, in 2D and 3D. This new method could be of particular interest when low-dose acquisitions are performed.
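
    The two data fidelity terms are easy to compare on a toy problem. The Python sketch below uses a linear two-material forward model with Poisson-corrupted counts and a crude grid search in place of the regularized Gauss-Newton iterations; all dimensions and values are illustrative.

      import numpy as np

      rng = np.random.default_rng(3)

      A = rng.uniform(0.5, 1.5, (50, 2))  # toy projection operator, 2 materials
      x_true = np.array([3.0, 1.5])       # "true" projected mass densities
      lam = 20.0 * (A @ x_true)           # expected photon counts (low dose)
      y = rng.poisson(lam)                # Poisson-corrupted measurements

      def wls(x):  # weighted least squares, adapted to Gaussian noise
          f = 20.0 * (A @ np.asarray(x))
          return np.sum((y - f) ** 2 / f)

      def kl(x):   # Kullback-Leibler distance, adapted to Poisson noise
          f = 20.0 * (A @ np.asarray(x))
          return np.sum(f - y + y * np.log(np.maximum(y, 1) / f))

      grid = np.linspace(0.5, 5.0, 200)   # grid search stands in for Gauss-Newton
      best_wls = min((wls((u, v)), u, v) for u in grid for v in grid)
      best_kl = min((kl((u, v)), u, v) for u in grid for v in grid)
      print("true:", x_true, "WLS:", best_wls[1:], "KL:", best_kl[1:])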

  5. A self-consistent phase-field approach to implicit solvation of charged molecules with Poisson-Boltzmann electrostatics.

    Science.gov (United States)

    Sun, Hui; Wen, Jiayi; Zhao, Yanxiang; Li, Bo; McCammon, J Andrew

    2015-12-28

    Dielectric boundary based implicit-solvent models provide efficient descriptions of coarse-grained effects, particularly the electrostatic effect, of aqueous solvent. Recent years have seen the initial success of a new such model, variational implicit-solvent model (VISM) [Dzubiella, Swanson, and McCammon Phys. Rev. Lett. 96, 087802 (2006) and J. Chem. Phys. 124, 084905 (2006)], in capturing multiple dry and wet hydration states, describing the subtle electrostatic effect in hydrophobic interactions, and providing qualitatively good estimates of solvation free energies. Here, we develop a phase-field VISM for the solvation of charged molecules in aqueous solvent to include more flexibility. In this approach, a stable equilibrium molecular system is described by a phase field that takes one constant value in the solute region and a different constant value in the solvent region, and smoothly changes its value on a thin transition layer representing a smeared solute-solvent interface or dielectric boundary. Such a phase field minimizes an effective solvation free-energy functional that consists of the solute-solvent interfacial energy, solute-solvent van der Waals interaction energy, and electrostatic free energy described by the Poisson-Boltzmann theory. We apply our model and methods to the solvation of single ions, two parallel plates, and protein complexes BphC and p53/MDM2 to demonstrate the capability and efficiency of our approach at different levels. With a diffuse dielectric boundary, our new approach can describe the dielectric asymmetry in the solute-solvent interfacial region. Our theory is developed based on rigorous mathematical studies and is also connected to the Lum-Chandler-Weeks theory (1999). We discuss these connections and possible extensions of our theory and methods.

  6. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    Abstract The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie

  7. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    Science.gov (United States)

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.
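
    In outline, the full probabilistic approach propagates distributions through both the exposure and hazard legs and reports an interval rather than one number. A toy Python sketch with invented lognormal uncertainties:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 200_000

      # Uncertain exposure (intake) and uncertain cancer potency; the
      # distributions and parameter values are illustrative only.
      intake = rng.lognormal(np.log(0.02), 0.8, n)    # µg/kg bw/day
      potency = rng.lognormal(np.log(1e-3), 1.2, n)   # extra risk per µg/kg bw/day

      risk = intake * potency                         # lifetime extra cancer risk
      lo, med, hi = np.percentile(risk, [5, 50, 95])
      print(f"median risk {med:.1e}, 90% CI [{lo:.1e}, {hi:.1e}]")

      # A single point estimate hides this spread and can fall well below
      # the upper confidence limit of the probabilistic estimate:
      print(f"point estimate {0.02 * 1e-3:.1e} vs upper limit {hi:.1e}")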

  8. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.

  9. Dietary Exposure Assessment of Danish Consumers to Dithiocarbamate Residues in Food: a Comparison of the Deterministic and Probabilistic Approach

    DEFF Research Database (Denmark)

    Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette

    2008-01-01

    Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish population to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using the probabilistic approach, the daily acute intakes at the 99.9% percentile for adults and children were 11.2 and 28.2 µg kg⁻¹ body weight day⁻¹, representing 5.6% and 14.1% of the ARfD for maneb, respectively. When comparing the point estimate approach with the probabilistic approach, the outcome
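
    The acute Monte Carlo calculation can be sketched for a single commodity as below; the consumption and residue distributions, the non-detect fraction, and the reference value are invented stand-ins for the MCRA inputs.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 500_000  # simulated person-days

      consumption = rng.lognormal(np.log(2.0), 0.7, n)  # g food per kg bw per day
      residue = np.where(rng.uniform(size=n) < 0.8, 0.0,  # 80% of samples residue-free
                         rng.lognormal(np.log(0.05), 1.0, n))  # mg/kg otherwise

      intake = consumption * residue  # g/kg bw x mg/kg food = µg/kg bw/day
      p999 = np.percentile(intake, 99.9)
      arfd = 100.0                    # hypothetical ARfD [µg/kg bw/day]
      print(f"99.9th percentile: {p999:.1f} µg/kg bw/day "
            f"({100 * p999 / arfd:.1f}% of the ARfD)")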

  10. AN APPROACH TO THE PROBABILISTIC CORROSION RATE ESTIMATION MODEL FOR INNER BOTTOM PLATES OF BULK CARRIERS

    Directory of Open Access Journals (Sweden)

    Romeo Meštrović

    2017-01-01

    Full Text Available This paper gives an approach to the probabilistic corrosion rate estimation model for inner bottom plates of bulk carriers. Firstly, using the data from thickness measurements of inner bottom plates for the 25 bulk carriers considered, the best-fitting linear model for corrosion wastage is obtained as a function of ship's age, under the assumption that the life of the coating is 4 years; the corresponding corrosion rate (in mm/year) follows from this model. The linear model is a particular case of a power model proposed in some earlier investigations. In view of the fact that the corrosion rate of ship hull structures is influenced by many factors, many of an uncertain nature, in recent studies several authors investigated a probabilistic model as more appropriate to describe the expected corrosion. Motivated by these investigations, and using 2926 thickness measurements for corrosion wastage of inner bottom plates from 38 special surveys of the ships considered, this paper examines the cumulative distribution function of the corrosion rate involved in the mentioned linear model, treated here as a continuous random variable. The statistical, numerical and graphical results obtained show that a logistic or normal distribution is well suited to the probabilistic corrosion rate estimation model for inner bottom plates of bulk carriers. It is believed that this finding will be confirmed with greater statistical reliability in future investigations including many more data collected on the considered corrosion.
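
    A compact sketch of the two-stage idea follows, with synthetic survey data standing in for the thickness measurements: wastage grows linearly once the coating fails, and the implied corrosion rate is then fitted with candidate distributions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)

      # Linear wastage model d(t) = r * (t - t0) after coating life t0 = 4 years
      t0 = 4.0
      age = rng.uniform(5.0, 25.0, 2000)  # ship age at survey [years]
      r = np.maximum(rng.logistic(0.10, 0.02, age.size), 0.0)  # rate [mm/year]
      wastage = r * (age - t0)            # synthetic thickness loss [mm]

      # Treat the corrosion rate as a continuous random variable and fit
      # the two candidate models named in the abstract.
      rate = wastage / (age - t0)
      loc_l, sc_l = stats.logistic.fit(rate)
      mu_n, sd_n = stats.norm.fit(rate)
      print(f"logistic fit: loc = {loc_l:.3f}, scale = {sc_l:.3f} mm/year")
      print(f"normal fit:   mean = {mu_n:.3f}, sd = {sd_n:.3f} mm/year")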

  11. Development of a new adaptive ordinal approach to continuous-variable probabilistic optimization.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente José; Chen, Chun-Hung (George Mason University, Fairfax, VA)

    2006-11-01

    A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks "Is that alternative better or worse than this one?", not "HOW MUCH better or worse is that alternative to this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. However, in this report we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and uniquely has mechanisms for quantifying and controlling error likelihood in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so could be useful as a reference standard against which the efficiency and robustness of other methods can be compared, analogous to the role that Monte Carlo simulation plays in uncertainty propagation.

  12. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  13. Poisson-Lie T-duality of string effective actions: A new approach to the dilaton puzzle

    Czech Academy of Sciences Publication Activity Database

    Jurčo, B.; Vysoký, Jan

    2018-01-01

    Roč. 130, August (2018), s. 1-26 ISSN 0393-0440 Institutional support: RVO:67985840 Keywords : Poisson-Lie T-duality * string effective actions * dilaton field Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.819, year: 2016 https://www.sciencedirect.com/science/article/pii/S0393044018301748?via%3Dihub

  14. A Probabilistic Approach to Fitting Period–luminosity Relations and Validating Gaia Parallaxes

    Energy Technology Data Exchange (ETDEWEB)

    Sesar, Branimir; Fouesneau, Morgan; Bailer-Jones, Coryn A. L.; Gould, Andy; Rix, Hans-Walter [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Price-Whelan, Adrian M., E-mail: bsesar@mpia.de [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2017-04-01

    Pulsating stars, such as Cepheids, Miras, and RR Lyrae stars, are important distance indicators and calibrators of the “cosmic distance ladder,” and yet their period–luminosity–metallicity (PLZ) relations are still constrained using simple statistical methods that cannot take full advantage of available data. To enable optimal usage of data provided by the Gaia mission, we present a probabilistic approach that simultaneously constrains parameters of PLZ relations and uncertainties in Gaia parallax measurements. We demonstrate this approach by constraining PLZ relations of type ab RR Lyrae stars in near-infrared W1 and W2 bands, using Tycho-Gaia Astrometric Solution (TGAS) parallax measurements for a sample of ≈100 type ab RR Lyrae stars located within 2.5 kpc of the Sun. The fitted PLZ relations are consistent with previous studies, and in combination with other data, deliver distances precise to 6% (once various sources of uncertainty are taken into account). To a precision of 0.05 mas (1σ), we do not find a statistically significant offset in TGAS parallaxes for this sample of distant RR Lyrae stars (median parallax of 0.8 mas and distance of 1.4 kpc). With only minor modifications, our probabilistic approach can be used to constrain PLZ relations of other pulsating stars, and we intend to apply it to Cepheid and Mira stars in the near future.

  15. Simplified probabilistic approach to determine safety factors in deterministic flaw acceptance criteria

    International Nuclear Information System (INIS)

    Barthelet, B.; Ardillon, E.

    1997-01-01

    The flaw acceptance rules for nuclear components rely on deterministic criteria intended to ensure the safe operation of plants. The interest in having a reliable method of evaluating the safety margins and the integrity of components led Electricite de France to launch a study linking safety factors with target reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in fracture mechanics. Assuming lognormal distributions of the main random variables, it is possible, considering a simple linear elastic fracture mechanics model, to determine the failure probability as a function of mean values and logarithmic standard deviations. The 'design' failure point can be calculated analytically. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation to reliability target values. The approach is generalized to elastic-plastic fracture mechanics (piping) by fitting J as a power-law function of stress, crack size and yield strength. The simplified approach is validated by detailed probabilistic computations with the PROBAN computer program. Assuming reasonable coefficients of variation (logarithmic standard deviations), the method helps to calibrate safety factors for different components, taking into account reliability target values in normal, emergency and faulted conditions. Statistical data for the mechanical properties of the main basic materials complement the study. The work involves laboratory results and manufacturing data. The results of this study are discussed within a working group of the French in-service inspection code RSE-M. (authors)
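
    The analytics are straightforward to reproduce: with lognormal stress, crack size, and toughness, the logarithm of the linear elastic margin is normally distributed and the failure probability has a closed form. The Python sketch below uses invented medians and logarithmic standard deviations, not the RSE-M calibration values.

      import numpy as np
      from scipy import stats

      # Failure when K = Y * S * sqrt(pi * a) exceeds the toughness Kc.
      S_med, s_S = 150.0, 0.10    # stress [MPa], log-std
      a_med, s_a = 2e-3, 0.30     # crack depth [m], log-std
      Kc_med, s_Kc = 80.0, 0.15   # toughness [MPa*sqrt(m)], log-std
      Y = 1.12                    # geometry factor (deterministic here)

      # ln Kc - ln(Y * S * sqrt(pi * a)) is normal with mean mu and std sd:
      mu = np.log(Kc_med) - np.log(Y * S_med * np.sqrt(np.pi * a_med))
      sd = np.sqrt(s_Kc ** 2 + s_S ** 2 + (0.5 * s_a) ** 2)
      beta = mu / sd              # reliability index
      print(f"beta = {beta:.2f}, P_f = {stats.norm.sf(beta):.2e}")

      # FORM-style partial safety factor on stress for a target reliability:
      beta_t = 3.7                # e.g. target P_f of about 1e-4
      gamma_S = np.exp((s_S / sd) * beta_t * s_S)
      print(f"partial safety factor on stress: {gamma_S:.3f}")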

  16. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    International Nuclear Information System (INIS)

    Pecha, P.; Hofman, R.; Kuca, P.

    2008-01-01

    We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulation of uncertainty propagation through the model is a basic and inevitable task, providing data for advanced techniques of probabilistic consequence assessment and further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the section of information systems in the institute NRPI. The subject of investigation concerns evaluation of the consequences of radioactivity propagation after an accidental radioactivity release from a nuclear facility. Transport of activity is studied from initial atmospheric propagation, through deposition of radionuclides on terrain, and spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food chain activity transport are modeled. In the final step, a hazard estimation based on doses to the population is integrated into the software system HARP. Extension to the probabilistic approach has increased the complexity substantially, but offers a much more informative background for modern methods of estimation accounting for the inherent stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on an uncertainty analysis of the input parameters of the SGPM model; the predicted background field of Cs-137 deposition is labelled with the index p, as ᵖX^SGPM. The final goal is estimation of a certain unknown true background vector χ^true, which also accounts for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must bear in mind that even if we knew the true values of all input parameters θ_m^true (m = 1, ..., M) of the SGPM model, χ^true would still remain uncertain. One possibility how to approach reality insists

  17. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely ranges, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency. This work is available under an Open Access model and licensed under a CC BY-NC 3.0 PL license.
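
    A minimal sketch of the Monte Carlo step, with three assessment factors given invented lognormal distributions in place of the defaults set from the literature review:

      import numpy as np

      rng = np.random.default_rng(7)
      n = 100_000

      pod = 50.0  # hypothetical point of departure, e.g. a NOAEL [mg/m3]
      af_inter = rng.lognormal(np.log(2.5), 0.4, n)  # inter-species AF
      af_intra = rng.lognormal(np.log(3.0), 0.4, n)  # intra-species AF
      af_time = rng.lognormal(np.log(2.0), 0.3, n)   # duration-extrapolation AF

      oel = pod / (af_inter * af_intra * af_time)    # OEL as a random variable
      lo, med, hi = np.percentile(oel, [5, 50, 95])
      print(f"OEL: median {med:.2f} mg/m3, 90% interval [{lo:.2f}, {hi:.2f}]")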

  18. Poisson integrators for Lie-Poisson structures on R3

    International Nuclear Information System (INIS)

    Song Lina

    2011-01-01

    This paper is concerned with the study of Poisson integrators. We are interested in Lie-Poisson systems on R3. First, we focus on Poisson integrators for constant Poisson systems and the transformations used for transforming Lie-Poisson structures to constant Poisson structures. Then, we construct local Poisson integrators for Lie-Poisson systems on R3. Finally, we present the results of numerical experiments for two Lie-Poisson systems and compare our Poisson integrators with other known methods.
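
    A concrete instance is easy to write down for the free rigid body, the standard Lie-Poisson system on R3 with H = sum_i m_i^2/(2*I_i): splitting H into its three terms gives sub-flows that are exact axis rotations, each of them a Poisson map, so their composition is a Poisson integrator that preserves the Casimir |m|^2 to round-off. The Python sketch below illustrates the idea and is not one of the integrators constructed in the paper.

      import numpy as np

      I = np.array([1.0, 2.0, 3.0])  # principal moments of inertia

      def axis_flow(m, i, h):
          # Exact flow of H_i = m_i^2 / (2 I_i): m_i stays constant while the
          # other two components rotate with angular velocity m_i / I_i.
          j, k = (i + 1) % 3, (i + 2) % 3
          c, s = np.cos(h * m[i] / I[i]), np.sin(h * m[i] / I[i])
          out = m.copy()
          out[j], out[k] = c * m[j] + s * m[k], -s * m[j] + c * m[k]
          return out

      def step(m, h):
          # Strang-type symmetric composition of the three exact sub-flows.
          for i in (0, 1, 2):
              m = axis_flow(m, i, h / 2)
          for i in (2, 1, 0):
              m = axis_flow(m, i, h / 2)
          return m

      m = np.array([1.0, 0.5, -0.3])
      c0 = m @ m  # Casimir |m|^2
      for _ in range(10_000):
          m = step(m, 0.01)
      print("Casimir drift after 10000 steps:", abs(m @ m - c0))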

  1. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  2. A fourth order PDE based fuzzy c-means approach for segmentation of microscopic biopsy images in presence of Poisson noise for cancer detection.

    Science.gov (United States)

    Kumar, Rajesh; Srivastava, Subodh; Srivastava, Rajeev

    2017-07-01

    For cancer detection from microscopic biopsy images, the image segmentation step used for segmentation of cells and nuclei plays an important role, and the accuracy of the segmentation approach dominates the final results. Microscopic biopsy images also carry intrinsic Poisson noise, and if it is present the segmentation results may not be accurate. The objective is to propose an efficient fuzzy c-means based segmentation approach which can also handle the noise present in the image during the segmentation process itself, i.e. noise removal and segmentation are combined in one step. To address the above issues, this paper proposes a fourth-order partial differential equation (FPDE) based nonlinear filter adapted to Poisson noise, combined with a fuzzy c-means segmentation method. This approach effectively handles the segmentation problem of blocky artifacts while achieving a good tradeoff between Poisson noise removal and edge preservation in microscopic biopsy images during the segmentation process for cancer detection from cells. The proposed approach is tested on a breast cancer microscopic biopsy data set with region of interest (ROI) segmented ground truth images. The data set contains 31 benign and 27 malignant images of size 896 × 768; the selected region-of-interest ground truth is available for all 58 images. Finally, the result obtained from the proposed approach is compared with the results of popular segmentation algorithms: fuzzy c-means, color k-means, texture based segmentation, and total variation fuzzy c-means. The experimental results show that the proposed approach provides better results in terms of various performance measures such as Jaccard coefficient, dice index, Tanimoto coefficient, area under curve, accuracy, true positive rate, true negative rate, false positive rate, false negative rate, random index, global consistency error, and variance of information as compared to other

  3. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. The Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated, in addition to the qualitative visual comparison. Following this, potential development and improvement of the prognostics-based metrics are discussed in detail.

  4. Deterministic and probabilistic approaches in the supervision of Swiss nuclear power plants

    International Nuclear Information System (INIS)

    Theiss, K.; Schoen, G.; Schmocker, U.

    2002-01-01

    The report presents an overview of the planned approach to be taken by the Central Department for the Safety of Nuclear Facilities (HSK) in systematically integrating probabilistic safety analyses (PSA) into the supervision of Swiss nuclear power plants which, so far, has been largely deterministic in character. On the basis of a description of the present rank of PSA, the principles of risk-informed supervision as sought by HSK are outlined. In addition, practical applications of PSA are shown by a number of examples implemented along-side the development of the concept. HSK recognized the importance of PSA early on and required all Swiss nuclear power plants to conduct level-1 and level-2 PSA studies for full-power operation as early as in 1987. In early 1990, level-1 PSA was extended to cover also analyses of startup, shutdown, and outage modes of operation. PSA has become more important also as a consequence of the periodic safety reviews (PSR) carried out since 1995 within the framework of ongoing supervisory activities. It has become a permanent part in a holistic system evaluating the safety status of nuclear power plants. The next step planned by HSK is the introduction of risk-informed supervision in which probabilistic considerations will be employed side by side with deterministic approaches. (orig.) [de

  5. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    International Nuclear Information System (INIS)

    Krajnc, B.; Mavko, B.

    1996-01-01

    Severe accident sequences resulting in loss of the core geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences to public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Due to the fact that there is a large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner due to the large uncertainty of results as a consequence of a lack of detailed knowledge. This paper discusses the approach used in many IPEs, which assures that the assigned probability for a certain question in the event tree represents the probability that the event will or will not happen and that this probability also includes its uncertainty, which is mainly a result of lack of knowledge. (author)

  6. Assessing dynamic postural control during exergaming in older adults: A probabilistic approach.

    Science.gov (United States)

    Soancatl Aguilar, V; Lamoth, C J C; Maurits, N M; Roerdink, J B T M

    2018-02-01

    Digital games controlled by body movements (exergames) have been proposed as a way to improve postural control among older adults. Exergames are meant to be played at home in an unsupervised way. However, only few studies have investigated the effect of unsupervised home-exergaming on postural control. Moreover, suitable methods to dynamically assess postural control during exergaming are still scarce. Dynamic postural control (DPC) assessment could be used to provide both meaningful feedback and automatic adjustment of exergame difficulty. These features could potentially foster unsupervised exergaming at home and improve the effectiveness of exergames as tools to improve balance control. The main aim of this study is to investigate the effect of six weeks of unsupervised home-exergaming on DPC as assessed by a recently developed probabilistic model. High probability values suggest 'deteriorated' postural control, whereas low probability values suggest 'good' postural control. In a pilot study, ten healthy older adults (average 77.9, SD 7.2 years) played an ice-skating exergame at home half an hour per day, three times a week during six weeks. The intervention effect on DPC was assessed using exergaming trials recorded by Kinect at baseline and every other week. Visualization of the results suggests that the probabilistic model is suitable for real-time DPC assessment. Moreover, linear mixed model analysis and parametric bootstrapping suggest a significant intervention effect on DPC. In conclusion, these results suggest that unsupervised exergaming for improving DPC among older adults is indeed feasible and that probabilistic models could be a new approach to assess DPC. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
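
    The flavor of the computation can be shown in a few lines of Python, with the endogeneity represented by correlating overnight cost and construction time through a Gaussian copula; every figure below is a placeholder rather than the paper's data.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 100_000

      # Correlated inputs via a Cholesky factor of the correlation matrix
      L = np.linalg.cholesky(np.array([[1.0, 0.6], [0.6, 1.0]]))
      z = rng.standard_normal((n, 2)) @ L.T
      capex = np.exp(np.log(5000.0) + 0.20 * z[:, 0])  # overnight cost [$/kW]
      build = np.clip(5.0 + 2.0 * z[:, 1], 3.0, None)  # construction time [yr]

      rate, life, cf, om_fuel = 0.07, 50, 0.90, 25.0   # rate, yrs, cap. factor, $/MWh
      years = np.arange(1, life + 1)
      disc_mwh = np.sum(8.766 * cf / (1 + rate) ** years)  # discounted MWh per kW
      idc = capex * ((1 + rate) ** build - 1.0)            # interest during build

      lcoe = (capex + idc) / disc_mwh + om_fuel
      lo, hi = np.percentile(lcoe, [5, 95])
      print(f"LCOE: mean {lcoe.mean():.1f} $/MWh, 5-95% [{lo:.1f}, {hi:.1f}]")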

  8. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    Science.gov (United States)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes caused many fatalities and damages to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and Zagros, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate
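
    To make the chain of formulae concrete, the sketch below uses the functional form of Jibson's (1998) regression with his published California coefficients as placeholders (the study fits separate coefficients for Alborz - Central Iran and for Zagros), combined with an invented Arias-intensity hazard curve.

      import numpy as np

      def newmark_disp_cm(Ia, ac):
          # Jibson (1998)-type fit: log10 Dn = 1.521 log10 Ia - 1.993 log10 ac - 1.546
          # with Dn in cm, Arias intensity Ia in m/s, critical acceleration ac in g.
          return 10.0 ** (1.521 * np.log10(Ia) - 1.993 * np.log10(ac) - 1.546)

      Ia = np.array([0.5, 1.0, 2.0, 4.0])         # Arias intensity levels [m/s]
      p_exc = np.array([0.40, 0.20, 0.08, 0.02])  # illustrative 50-yr exceedance

      ac = 0.05                                   # slope critical acceleration [g]
      Dn = newmark_disp_cm(Ia, ac)                # increases monotonically with Ia
      p_fail = np.interp(10.0, Dn, p_exc)         # P(Dn > 10 cm in 50 years)
      print("Dn [cm]:", Dn.round(1), "-> P(Dn > 10 cm) ~", round(float(p_fail), 2))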

  9. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Kazimierz Banasik

    2014-04-01

    Full Text Available The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service (SCS-CN) method in a small agricultural, lowland watershed (23.4 km² to the gauging station) in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of the goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV), normal and general logistic (GLO) distributions for 100-CN and GLO, lognormal and GEV distributions for S. The characteristics estimated from theoretical distribution (median, quantiles were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median and between the antecedent runoff conditions (ARCs of Hawkins and Hjelmfelt, which certified a good calibration of the model. However, the division of the CN sample due to heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
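
    The SCS-CN relations behind the analysis fit in a few lines of Python; the event values below are invented, and the inversion uses the standard Ia = 0.2S assumption (Hawkins' solution).

      import numpy as np

      def runoff_scs(P, CN, lam=0.2):
          # Direct runoff Q [mm] from rainfall P [mm]; S = 25400/CN - 254 [mm]
          S = 25400.0 / CN - 254.0
          Ia = lam * S  # initial abstraction
          return np.where(P > Ia, (P - Ia) ** 2 / (P - Ia + S), 0.0)

      def cn_from_event(P, Q):
          # Empirical CN from an observed (P, Q) pair, inverting the runoff
          # equation with Ia = 0.2 S (standard quadratic solution).
          S = 5.0 * (P + 2.0 * Q - np.sqrt(4.0 * Q ** 2 + 5.0 * P * Q))
          return 25400.0 / (S + 254.0)

      P, Q = 45.0, 6.0  # event rainfall and direct runoff [mm]
      cn = cn_from_event(P, Q)
      print(f"empirical CN = {cn:.1f}, back-check Q = {float(runoff_scs(P, cn)):.1f} mm")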

  10. Simple probabilistic approach to evaluate radioiodine behavior at severe accidents: application to Phebus test FPT1

    International Nuclear Information System (INIS)

    Rydl, A.

    2007-01-01

    The contribution of radioiodine to the risk from a severe accident is recognized to be one of the highest among all the fission products. In the long term (e.g. several days), volatile species of iodine are the most important forms of iodine from the safety point of view. These volatile forms ('volatile iodine') are mainly molecular iodine, I2, and various types of organic iodides, RI. A certain controversy exists today among the international research community about the relative importance of the processes leading to volatile iodine formation in containment under severe accident conditions. The amount of knowledge, coming from experiments, of the phenomenology of iodine behavior is enormous, and it is embedded in specialized mechanistic or empirical codes. An exhaustive description of the processes governing iodine behavior in containment is given in reference 1. Yet all this knowledge is still not enough to resolve some important questions. Moreover, the results of different codes, when applied to relatively simple experiments such as RTF or CAIMAN, vary widely. Thus, as a complement (or maybe even as an alternative in some instances) to deterministic analyses of iodine behavior, a simple probabilistic approach is proposed in this work which could help to see the whole problem in a different perspective. The final goal of using this approach should be the characterization of the uncertainties in the description of the various processes in question. This would allow identification of the processes which contribute most significantly to the overall uncertainty of predictions of iodine volatility in containment. In this work we constructed a dedicated, small event tree to describe iodine behavior during an accident and used that tree for a simple sensitivity study. For the evaluation of the tree, the US NRC code EVNTRE was used. To test the proposed probabilistic approach we analyzed results of the integral PHEBUS FPT1 experiment, which comprises most of the important

  11. Assessment of Landslide-Tsunami Hazard for the Gulf of Mexico Using a Probabilistic Approach

    Science.gov (United States)

    Pampell, A.; Horrillo, J. J.; Parambath, L.; Shigihara, Y.

    2014-12-01

    The devastating consequences of recent tsunami events in Indonesia (2004) and Japan (2011) have prompted a scientific response in assessing tsunami hazard even in regions where an apparent low risk or/and lack of complete historical tsunami record exists. Although a great uncertainty exists regarding the recurrence rate of large-scale tsunami events in the Gulf of Mexico (GOM) due to sparsity of data, geological and historical evidences indicate that the most likely tsunami hazard could come from a submarine landslide triggered by a moderate earthquake. Under these circumstances, the assessment of the tsunami hazard in the region could be better accomplished by means of a probabilistic approach to identify tsunami sources. This study aims to customize for the GOM a probabilistic hazard assessment based on recurrence rates of tsunamigenic submarine mass failures (SMFs). The Monte Carlo Simulation (MCS) technique is employed utilizing matrix correlations for landslide parameters to incorporate the uncertainty related to location/water-depth and landslide dimension based on lognormal/normal distributions obtained from observed data. Along fixed transects over the continental slope of the GOM, slide angle of failure, sediment properties and seismic peak horizontal accelerations (PHA) are determined by publicly available data. These parameter values are used to perform slope stability analyses in randomly generated translational SMFs obtained from the MCS technique. Once the SMF is identified as tsunamigenic for a given PHA recurrence rate, a preliminary tsunami amplitude can be estimated using empirical formulations. Thus, the annual probability of a tsunamigenic SMF is determined by the joint probability of failure with the annual PHA. By using the probabilistic approach, we identified tsunami sources with recurrence rates from few thousands to 10,000 years which produce extreme wave amplitudes for each transect. The most likely extreme tsunamigenic SMF events for a

  12. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  13. A tractable DDN-POMDP Approach to Affective Dialogue Modeling for General Probabilistic Frame-based Dialogue Systems

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob

    2006-01-01

    We propose a new approach to developing a tractable affective dialogue model for general probabilistic frame-based dialogue systems. The dialogue model, based on the Partially Observable Markov Decision Process (POMDP) and the Dynamic Decision Network (DDN) techniques, is composed of two main parts,

  14. A tractable DDN-POMDP Approach to Affective Dialogue Modeling for General Probabilistic Frame-based Dialogue Systems

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob; Traum, D.; Alexandersson, J.; Jonsson, A.; Zukerman, I.

    2007-01-01

    We propose a new approach to developing a tractable affective dialogue model for general probabilistic frame-based dialogue systems. The dialogue model, based on the Partially Observable Markov Decision Process (POMDP) and the Dynamic Decision Network (DDN) techniques, is composed of two main parts,

  15. An in silico comparison between margin-based and probabilistic target-planning approaches in head and neck cancer patients

    NARCIS (Netherlands)

    Fontanarosa, Davide; van der Laan, Hans Paul; Witte, Marnix; Shakirin, Georgy; Roelofs, Erik; Langendijk, Johannes A.; Lambin, Philippe; van Herk, Marcel

    2013-01-01

    To apply a target probabilistic planning (TPP) approach to intensity modulated radiotherapy (IMRT) plans for head and neck cancer (HNC) patients. Twenty plans of HNC patients were re-planned, replacing the simultaneous integrated boost IMRT optimization objectives for minimum dose on the boost target

  16. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Science.gov (United States)

    2012-05-17

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...

  17. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess internal radiation exposure following the intake of radioisotopes. The approach consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. It has been applied to Pu-induced internal dose assessment, with a multi-compartment dosimetric model used for internal transport. In this approach, surrogate models of the original system are constructed using response surfaces and neural networks, and their results are compared with those of the original model; each surrogate approximates the original model well. Uncertainty and sensitivity analyses of the model parameters are carried out in this process. Dominant contributors to the dose in each organ are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure
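
    As an illustration of steps (2)-(3), the sketch below fits a quadratic response surface to a toy stand-in for the dosimetric model and propagates assumed lognormal input uncertainties through the cheap surrogate. The toy model and all distribution parameters are assumptions, not the paper's Pu dosimetry.

```python
# Illustrative sketch of the surrogate-model idea: fit a quadratic response
# surface to a toy "dosimetric model" and propagate input uncertainty through
# it. The toy model and the input distributions are assumptions.
import numpy as np

rng = np.random.default_rng(1)

def dose_model(x):
    """Stand-in for the multi-compartment dosimetric model (toy function)."""
    intake, transfer = x[..., 0], x[..., 1]
    return intake * transfer / (1.0 + 0.1 * transfer)

# Step 2: quantify input uncertainties (assumed lognormal parameters).
def sample_inputs(n):
    return np.stack([rng.lognormal(0.0, 0.3, n),          # intake
                     rng.lognormal(0.5, 0.2, n)], axis=-1)  # transfer coeff.

# Fit a quadratic response surface on a modest training design.
train = sample_inputs(200)
y = dose_model(train)
def features(x):
    a, b = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(a), a, b, a * b, a**2, b**2], axis=-1)
coef, *_ = np.linalg.lstsq(features(train), y, rcond=None)

# Step 3: propagate uncertainties through the cheap surrogate.
test = sample_inputs(50_000)
dose_surrogate = features(test) @ coef
dose_exact = dose_model(test)   # only possible here because the toy is cheap
print("surrogate mean/95th: %.3f / %.3f" %
      (dose_surrogate.mean(), np.percentile(dose_surrogate, 95)))
print("original  mean/95th: %.3f / %.3f" %
      (dose_exact.mean(), np.percentile(dose_exact, 95)))
```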

  18. Probabilistic approach to the prediction of radioactive contamination of agricultural production

    International Nuclear Information System (INIS)

    Fesenko, S.F.; Chernyaeva, L.G.; Sanzharova, N.I.; Aleksakhin, R.M.

    1993-01-01

    The organization of agricultural production on territory contaminated as a result of the Chernobyl reactor disaster requires prediction of the radionuclide content in agro-industrial products. Traditional methods of predicting contamination in these products do not give sufficient agreement with actual data, and as a result it is difficult to make the necessary decisions about eliminating the consequences of the disaster in the agro-industrial complex. In many ways this is because the available methods are based on data on the radionuclide content in soils, plants, and plant and animal products, and the parameters of the prediction models are also evaluated from these results. Even when obtained from a single field or herd of livestock, however, such indicators have substantial coefficients of variation due to factors such as the spatial structure of the fallout, the variability of soil properties, sampling error, errors in processing and measuring the samples, as well as the data-averaging error. Consequently, the parameters of radionuclide transfer along agricultural chains are highly variable, considerably reducing the reliability of predicted values. The reliability of predictions of radioactive contamination of agricultural products can be increased substantially by taking a probabilistic approach that uses information about the random laws governing contamination of farmland and the statistical properties of the parameters of radionuclide migration along food chains. With this in mind, a comparative analysis is made of results obtained with the traditional treatment (deterministic in its simplest form) and its probabilistic analogue

  19. Sequence-based discrimination of protein-RNA interacting residues using a probabilistic approach.

    Science.gov (United States)

    Pai, Priyadarshini P; Dash, Tirtharaj; Mondal, Sukanta

    2017-04-07

    Protein interactions with ribonucleic acids (RNA) are well known to be crucial for a wide range of cellular processes such as transcriptional regulation, protein synthesis or translation, and post-translational modifications. Identification of RNA-interacting residues can provide insights into these processes and aid in relevant biotechnological manipulations. Owing to their potential in combating diseases and in industrial production, several computational attempts have been made over the years using sequence- and structure-based information. Recent comparative studies suggest that, despite these developments, many problems remain with respect to the usability, prerequisites, and accessibility of the various tools, calling for an alternative approach and a supplementary perspective in the prediction scenario. With this motivation, in this paper we propose a simple yet efficient conditional probabilistic approach, based on the local occurrence of amino acids in the interacting region in a non-numeric sequence feature space, for discriminating between RNA-interacting and non-interacting residues. The proposed method has been meticulously tested for robustness using a cross-estimation method, showing an MCC of 0.341 and an F-measure of 66.84%. In large-scale applications using benchmark datasets available to date, the approach showed encouraging performance comparable with the state of the art. The software is available at https://github.com/ABCgrp/DORAEMON. Copyright © 2017 Elsevier Ltd. All rights reserved.
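
    A minimal sketch of this kind of conditional-probability scoring follows: per-position amino-acid frequencies in a local sequence window are turned into a smoothed log-odds score for interacting versus non-interacting residues. The window size, toy training data and smoothing are assumptions; this is not the published DORAEMON implementation.

```python
# Conditional-probability scoring of residues from local sequence windows.
# Toy data; the window size and add-one smoothing are assumptions.
import math
from collections import Counter

WIN = 2  # residues on each side of the centre position (assumed)

def windows(seq, labels):
    pad = "X" * WIN
    s = pad + seq + pad
    for i, lab in enumerate(labels):
        yield s[i:i + 2 * WIN + 1], lab

def train(data):
    counts = {0: Counter(), 1: Counter()}
    totals = {0: 0, 1: 0}
    for seq, labels in data:
        for win, lab in windows(seq, labels):
            for pos, aa in enumerate(win):
                counts[lab][(pos, aa)] += 1
            totals[lab] += 1
    return counts, totals

def score(win, counts, totals):
    """Log-odds of interacting vs non-interacting, with add-one smoothing."""
    s = math.log(totals[1] + 1) - math.log(totals[0] + 1)
    for pos, aa in enumerate(win):
        s += math.log((counts[1][(pos, aa)] + 1) / (totals[1] + 20))
        s -= math.log((counts[0][(pos, aa)] + 1) / (totals[0] + 20))
    return s

# Toy training set: sequence plus 0/1 interaction labels per residue.
data = [("MKRLRSSTK", [0, 0, 1, 1, 0, 0, 1, 0, 0]),
        ("GAVLKRKRG", [0, 0, 0, 0, 1, 1, 1, 1, 0])]
counts, totals = train(data)
for win, lab in windows("AKRKA", [0, 1, 1, 1, 0]):
    print(win, lab, round(score(win, counts, totals), 2))
```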

  20. Probabilistic and Fuzzy Arithmetic Approaches for the Treatment of Uncertainties in the Installation of Torpedo Piles

    Directory of Open Access Journals (Sweden)

    Breno Pinheiro Jacob

    2008-06-01

    The “torpedo” pile is a foundation system that has recently been considered for anchoring mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free-fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of the analysis methods used to evaluate its final penetration depth. This paper therefore deals with methodologies for assessing the sensitivity of the response to variation of the uncertain parameters and, mainly, for incorporating into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and “possibilistic” approaches are considered, involving, respectively, the Monte Carlo method (MC) and concepts of fuzzy arithmetic (FA). The results and performance of both approaches are compared, stressing the ability of the latter to deal with the uncertainties of the model with outstanding computational efficiency and, therefore, to constitute an effective design tool.
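
    The contrast between the two treatments can be sketched on a toy penetration model, as below. The depth formula, parameter ranges and fuzzy numbers are assumptions chosen only to illustrate Monte Carlo sampling versus alpha-cut interval propagation.

```python
# Sketch contrasting the two uncertainty treatments on a toy penetration
# model depth = f(soil strength, pile energy). All numbers are assumptions.
import numpy as np

def depth(su, energy):
    """Toy final-penetration model: depth grows with impact energy and
    decreases with undrained shear strength su."""
    return energy / (3.0 * su)

# --- Probabilistic approach: Monte Carlo with assumed normal inputs -------
rng = np.random.default_rng(2)
su = rng.normal(5.0, 0.8, 100_000)       # kPa-scale, assumed
en = rng.normal(60.0, 6.0, 100_000)      # kJ-scale, assumed
d_mc = depth(su, en)
print("MC: mean %.2f, 5-95%% band [%.2f, %.2f]" %
      (d_mc.mean(), *np.percentile(d_mc, [5, 95])))

# --- Possibilistic approach: fuzzy arithmetic via alpha-cuts --------------
# Triangular fuzzy numbers (min, mode, max); each alpha-cut is an interval.
def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

su_f, en_f = (3.0, 5.0, 7.0), (45.0, 60.0, 75.0)
for alpha in (0.0, 0.5, 1.0):
    su_lo, su_hi = alpha_cut(su_f, alpha)
    en_lo, en_hi = alpha_cut(en_f, alpha)
    # depth is increasing in energy, decreasing in su -> endpoints suffice
    print(f"alpha={alpha}: depth in [{depth(su_hi, en_lo):.2f}, "
          f"{depth(su_lo, en_hi):.2f}]")
```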

  1. A probabilistic approach for optimal sensor allocation in structural health monitoring

    Science.gov (United States)

    Azarbayejani, M.; El-Osery, A. I.; Choi, K. K.; Reda Taha, M. M.

    2008-10-01

    Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial in determining the optimal number and locations of sensors. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network trained from simulations, using a priori knowledge about damage locations and damage severities, to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure.
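
    A minimal sketch of the allocation idea, under stated assumptions: random numbers stand in for the trained network's aggregated weights, which are normalized into a probability distribution over candidate locations and then sampled; a 'leave one sensor out' pass gives a crude redundancy check.

```python
# Turn importance weights into a normalized probability distribution over
# candidate sensor locations, then draw an allocation. The "weights" here are
# random stand-ins for trained neural-network weights.
import numpy as np

rng = np.random.default_rng(3)
n_locations, n_sensors = 30, 8

# Stand-in for |weights| aggregated per candidate location after training.
importance = np.abs(rng.normal(size=n_locations)) ** 2

pdf = importance / importance.sum()   # normalized probability distribution
chosen = rng.choice(n_locations, size=n_sensors, replace=False, p=pdf)
print("sensor locations:", sorted(chosen.tolist()))

# 'Leave one sensor out' redundancy check: probability mass retained when
# each selected sensor is removed (a crude proxy for detection coverage).
for s in sorted(chosen.tolist()):
    remaining = [c for c in chosen if c != s]
    print(f"without sensor {s:2d}: coverage mass = {pdf[remaining].sum():.3f}")
```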

  2. A probabilistic approach for optimal sensor allocation in structural health monitoring

    International Nuclear Information System (INIS)

    Azarbayejani, M; Reda Taha, M M; El-Osery, A I; Choi, K K

    2008-01-01

    Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial in determining the optimal number and locations of sensors. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network trained from simulations, using a priori knowledge about damage locations and damage severities, to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure

  3. Representative sets of design hydrographs for ungauged catchments: A regional approach using probabilistic region memberships

    Science.gov (United States)

    Brunner, Manuela Irene; Seibert, Jan; Favre, Anne-Catherine

    2018-02-01

    Traditional design flood estimation approaches have focused on peak discharges and have often neglected other hydrograph characteristics such as hydrograph volume and shape. Synthetic design hydrograph estimation procedures overcome this deficiency by jointly considering peak discharge, hydrograph volume, and shape. Such procedures have recently been extended to allow for the consideration of process variability within a catchment by a flood-type specific construction of design hydrographs. However, they depend on observed runoff time series and are not directly applicable in ungauged catchments where such series are not available. To obtain reliable flood estimates, there is a need for an approach that allows for the consideration of process variability in the construction of synthetic design hydrographs in ungauged catchments. In this study, we therefore propose an approach that combines a bivariate index flood approach with event-type specific synthetic design hydrograph construction. First, regions of similar flood reactivity are delineated and a classification rule that enables the assignment of ungauged catchments to one of these reactivity regions is established. Second, event-type specific synthetic design hydrographs are constructed using the pooled data divided by event type from the corresponding reactivity region in a bivariate index flood procedure. The approach was tested and validated on a dataset of 163 Swiss catchments. The results indicated that 1) random forest is a suitable classification model for the assignment of an ungauged catchment to one of the reactivity regions, 2) the combination of a bivariate index flood approach and event-type specific synthetic design hydrograph construction enables the consideration of event types in ungauged catchments, and 3) the use of probabilistic class memberships in regional synthetic design hydrograph construction helps to alleviate the problem of misclassification. Event-type specific synthetic design
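
    The probabilistic-membership step can be sketched with scikit-learn, as below: a random forest supplies class probabilities for an ungauged catchment, which then weight region-specific design-flood statistics. All catchment attributes, regions and index-flood values are synthetic assumptions.

```python
# Probabilistic region membership for an ungauged catchment via a random
# forest, with memberships weighting regional design-flood statistics.
# Synthetic data throughout; not the paper's Swiss dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

# Synthetic gauged catchments: attributes (area, elevation, ...) + region id.
X = rng.normal(size=(163, 4))
y = rng.integers(0, 3, size=163)          # three reactivity regions
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Region-specific index-flood peaks (synthetic, m^3/s).
region_peak = np.array([120.0, 240.0, 400.0])

x_ungauged = rng.normal(size=(1, 4))
member = clf.predict_proba(x_ungauged)[0]         # probabilistic memberships
blended_peak = member @ region_peak[clf.classes_]  # membership-weighted value
print("memberships:", np.round(member, 2),
      "-> design peak %.0f m3/s" % blended_peak)
```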

  4. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    Directory of Open Access Journals (Sweden)

    S. Raia

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering and adopt an infinite-slope geometry to balance the resisting and driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is generally not available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs
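
    A single-cell sketch of the TRIGRS-P idea, under stated assumptions: geotechnical parameters are sampled from assumed distributions and an infinite-slope factor of safety with a pore-pressure term is evaluated per sample, yielding a probability of failure instead of one deterministic value.

```python
# One grid cell, many parameter samples: probability of failure instead of a
# single deterministic factor of safety. Distributions and the simplified
# infinite-slope formula (with a pore-pressure term) are assumptions.
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

beta = np.radians(35.0)          # slope angle of one grid cell
z = 2.0                          # depth of the potential failure surface, m
gamma_s, gamma_w = 19e3, 9.81e3  # soil / water unit weight, N/m^3

# Randomly sampled material properties (assumed distributions).
c = rng.normal(5e3, 1.5e3, n).clip(min=0.0)   # cohesion, Pa
phi = np.radians(rng.normal(32.0, 3.0, n))    # friction angle
psi = rng.uniform(0.0, z, n)                  # pressure head from infiltration

# Infinite-slope factor of safety with transient pore-pressure term.
fs = np.tan(phi) / np.tan(beta) + \
     (c - psi * gamma_w * np.tan(phi)) / (gamma_s * z
                                          * np.sin(beta) * np.cos(beta))

print("mean FS = %.2f, P(FS < 1) = %.3f" % (fs.mean(), (fs < 1.0).mean()))
```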

  5. A probabilistic approach for the interpretation of RNA profiles as cell type evidence.

    Science.gov (United States)

    de Zoete, Jacob; Curran, James; Sjerps, Marjan

    2016-01-01

    DNA profiles can be used as evidence to distinguish between possible donors of a crime stain. In some cases, both the prosecution and the defence claim that the cell material was left by the suspect but they dispute which cell type was left behind. For example, in sexual offense cases the prosecution could claim that the sample contains semen cells where the defence argues that the sample contains skin cells. In these cases, traditional methods (e.g. a phosphatase test) can be used to examine the cell type contained in the sample. However, there are some drawbacks when using these methods. For instance, many of these techniques need to be carried out separately for each cell type, and each of them requires part of the available sample, which reduces the amount that can be used for DNA analysis. Another option is messenger RNA (mRNA) evidence. mRNA expression levels vary among cell types and can be used to make (probability) statements about the cell type(s) present in a sample. Existing methods for the interpretation of RNA profiles as evidence for the presence of certain cell types aim at making categorical statements. Such statements limit the possibility to report the associated uncertainty. Some of these existing methods are discussed, most notably a method based on an 'n/2' scoring rule (Lindenbergh et al.) and a method using marker values and cell-type scoring thresholds (Roeder et al.). From a statistical point of view, a probabilistic approach is the most obvious choice. Two approaches (multinomial logistic regression and naïve Bayes) are suggested. All methods are compared, using two different datasets and several criteria regarding their ability to assess the evidential value of RNA profiles. We conclude that both the naïve Bayes method and a method based on multinomial logistic regression, which produce a probabilistic statement as a measure of the evidential value, are an important improvement on the existing methods. Besides a better performance

  6. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Science.gov (United States)

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
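
    The exceedance computation can be sketched as follows on synthetic data: a log-normal PDF is fitted to measured concentrations and p(PEC/PNEC > 1) is estimated by Monte Carlo. The sample data and the PNEC distribution are assumptions; in the article the bioavailability-based PNEC comes from BLM tools such as Bio-Met.

```python
# Estimate p(PEC/PNEC > 1) by Monte Carlo from a log-normal fit to synthetic
# station data. All numbers are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(6)

# Synthetic dissolved Cu measurements (ug/L) standing in for station data.
pec_obs = rng.lognormal(mean=np.log(1.2), sigma=0.6, size=60)

# Fit a log-normal PDF to the observations (moments on the log scale).
mu, sigma = np.log(pec_obs).mean(), np.log(pec_obs).std()

n = 200_000
pec = rng.lognormal(mu, sigma, n)
pnec = rng.lognormal(np.log(2.0), 0.4, n)  # assumed bioavailability-based PNEC

risk = (pec / pnec > 1.0).mean()
print(f"p(PEC/PNEC > 1) = {risk:.3f}")
```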

  7. A probabilistic scenario approach for developing improved Reduced Emissions from Deforestation and Degradation (REDD+) baselines

    Directory of Open Access Journals (Sweden)

    Malika Virah-Sawmy

    2015-07-01

    By generating robust probabilistic baseline scenarios, exponential smoothing models can facilitate the effectiveness of REDD+ payments, support a more efficient allocation of scarce conservation resources, and improve our understanding of effective forest conservation investments, also beyond REDD+.

  8. Spatial probabilistic approach on landslide susceptibility assessment from high resolution sensors derived parameters

    International Nuclear Information System (INIS)

    Aman, S N A; Latif, Z Abd; Pradhan, B

    2014-01-01

    Landslide occurrence depends on various interrelated factors that can culminate in massive movements of soil and rock debris downhill under the action of gravity. LiDAR offers a progressive approach to landslide mitigation by permitting the generation of more accurate DEMs than other active spaceborne and airborne remote sensing techniques. The objective of this research is to assess landslide susceptibility in the Ulu Klang area by investigating the correlation between past landslide events and geo-environmental factors. A high-resolution LiDAR DEM was constructed to produce topographic attributes such as slope, curvature and aspect. These data were used to derive secondary landslide-related parameters such as the topographic wetness index (TWI), surface area ratio (SAR) and stream power index (SPI), as well as the NDVI generated from IKONOS imagery. Subsequently, a probabilistic frequency ratio model was applied to establish the spatial relationship between the landslide locations and each landslide-related factor. Factor ratings were summed to obtain the Landslide Susceptibility Index (LSI) and construct the landslide susceptibility map
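
    The frequency-ratio step can be sketched as below on a toy raster: for each class of each factor, FR is the share of landslide cells in the class divided by the share of all cells in the class, and the cell-wise LSI is the sum of FRs. The factor maps and landslide mask are synthetic assumptions.

```python
# Frequency-ratio model on a toy raster: FR per class, LSI per cell.
import numpy as np

rng = np.random.default_rng(7)
n_cells = 10_000

# Two categorical factor maps (e.g. slope class, TWI class) + landslide mask.
slope_cls = rng.integers(0, 4, n_cells)
twi_cls = rng.integers(0, 3, n_cells)
landslide = rng.random(n_cells) < 0.03 * (slope_cls + 1)  # steeper -> more

def frequency_ratio(cls_map, mask):
    fr = {}
    for c in np.unique(cls_map):
        in_class = cls_map == c
        pct_slides = mask[in_class].sum() / max(mask.sum(), 1)
        pct_cells = in_class.mean()
        fr[c] = pct_slides / pct_cells
    return fr

fr_slope = frequency_ratio(slope_cls, landslide)
fr_twi = frequency_ratio(twi_cls, landslide)

# Landslide Susceptibility Index: sum of factor ratings per cell.
lsi = np.vectorize(fr_slope.get)(slope_cls) + np.vectorize(fr_twi.get)(twi_cls)
print("LSI range: %.2f - %.2f" % (lsi.min(), lsi.max()))
```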

  9. Implementation of probabilistic approach in solving inverse problems as a grid-backed web service.

    Science.gov (United States)

    Kholodkov, K. I.; Aleshin, I. M.; Koryagin, V. N.; Shogin, A. N.; Sukhoroslov, O. V.

    2012-04-01

    In this work, a probabilistic approach to the inverse problem was adopted. It leads to the definition and sampling of an a posteriori probability density function (APDF), which combines a priori system information with information derived from observation data. Use of the APDF implies significant computational resource consumption, even for a moderate number of model parameters. However, the computation of the APDF value at different points is carried out completely independently, so the problem is well suited to a loosely coupled distributed computing system. Globus Toolkit middleware was used, including GridFTP for data transfer and GRAM for execution control, as well as the TORQUE resource manager on each computing node. To reduce hardware cost, all grid services except GridFTP run as virtual guests on the execution nodes; owing to their very low resource utilization, the guests leave no footprint on a node's computational power. To hide the complex middleware interface from scientific users, a user-friendly web interface was created, which provides a restricted but sufficient tool set. Determination of seismic anisotropy by waveform inversion was implemented as a model problem. The interface allows the user to edit model parameters, estimate the execution time for a specified parameter set, run the calculation and visualize the results. The details of start-up, management and result acquisition are hidden from the user. This work was supported by the Russian Foundation for Basic Research, grants 10-07-00491-a, 11-05-00988-a and 11-07-12045-ofi-m-2011

  10. A robust probabilistic approach for variational inversion in shallow water acoustic tomography

    International Nuclear Information System (INIS)

    Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P

    2009-01-01

    This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance–covariance matrix associated with the above weighted quadratic background, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimal reduced number of uncorrelated latent variables η, which determine a new control vector and a new regularization term, expressed as η^T η. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method

  11. Environmental risk assessment of white phosphorus from the use of munitions - a probabilistic approach.

    Science.gov (United States)

    Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager

    2010-03-15

    White phosphorus (P4) is a highly toxic compound used in various pyrotechnic products. Munitions containing P4 are widely used in military training areas, where the unburned products of P4 contaminate soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P4 in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P4 by ingestion, for example. The current approach to assessing the environmental risk from the use of P4 suggests a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P4. Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool, where the risk of P4 poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P4 was detected in several craters on the range at concentrations up to 5.7 g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.
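
    The flavour of such a Bayesian-network calculation can be shown with a toy three-node model, evaluated by direct enumeration. The structure and all probabilities below are illustrative assumptions, not the published network for the Norwegian range.

```python
# Toy Bayesian network evaluated by enumeration.
# Nodes: C = animal grazes in a contaminated crater,
#        I = ingests P4 with soil, T = dose exceeds toxic threshold.
p_c = 0.05                                  # chance of grazing in a crater
p_i_given_c = {True: 0.30, False: 0.001}    # ingestion depends on location
p_t_given_i = {True: 0.40, False: 0.0}      # toxic dose requires ingestion

# Marginal query: P(T) = sum_{c,i} P(c) P(i|c) P(T|i).
p_toxic = 0.0
for c in (True, False):
    pc = p_c if c else 1.0 - p_c
    for i in (True, False):
        pi = p_i_given_c[c] if i else 1.0 - p_i_given_c[c]
        p_toxic += pc * pi * p_t_given_i[i]
print(f"P(toxic dose) = {p_toxic:.5f}")

# Diagnostic query by Bayes' rule: P(crater | toxic dose observed).
p_t_and_c = sum(p_c * (p_i_given_c[True] if i else 1 - p_i_given_c[True])
                * p_t_given_i[i] for i in (True, False))
print(f"P(crater | toxic dose) = {p_t_and_c / p_toxic:.3f}")
```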

  12. Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

    International Nuclear Information System (INIS)

    Ali, S A; Kim, D-H; Cafaro, C; Giffin, A

    2012-01-01

    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.

  13. Human and management factors in probabilistic risk analysis: the SAM approach and observations from recent applications

    Energy Technology Data Exchange (ETDEWEB)

    Elisabeth Pate-Cornell, M.; Murphy, Dean M

    1996-08-01

    Most severe industrial accidents have been shown to involve one or more human errors and these are generally rooted in management problems. The objective of this paper is to draw some conclusions from the experience that we have acquired from three different studies of this phenomenon: (1) the Piper Alpha accident including problems of operations management and fire risks on-board offshore platforms, (2) the management of the heat shield of the NASA space shuttle orbiter, and (3) the roots of patient risks in anaesthesia. This paper describes and illustrates the SAM approach (System-Action-Management) that was developed and used in these studies to link the probabilities of system failures to human and management factors. This SAM model includes: first, a probabilistic risk analysis of the physical system, second, an analysis of the decisions and actions that affect the probabilities of its basic events, and third, a study of the management factors that influence those decisions and actions. In the three initial studies, the analytical links (conditional probabilities) among these three submodels were coarsely quantified based on statistical data whenever available, or most often, on expert opinions. This paper describes some observations that were made across these three studies, for example, the importance of the informal reward system, the difficulties in the communication of uncertainties, the problems of managing resource constraints, and the safety implications of the short cuts that they often imply.

  14. Probabilistic approaches applied to damage and embrittlement of structural materials in nuclear power plants

    International Nuclear Information System (INIS)

    Vincent, L.

    2012-01-01

    The present study deals with the long-term mechanical behaviour and damage of structural materials in nuclear power plants. An experimental approach is first followed to study the thermal fatigue of austenitic stainless steels, with a focus on the effects of mean stress and biaxiality. Furthermore, the measurement of displacement fields by Digital Image Correlation techniques has been successfully used to detect early crack initiation during high-cycle fatigue tests. A probabilistic model based on the shielding zones surrounding existing cracks is proposed to describe the development of crack networks. A more numerical approach is then followed to study the embrittlement caused by irradiation hardening of the bainitic steel constituting nuclear pressure vessels. A crystalline plasticity law, developed in agreement with lower-scale results (Dislocation Dynamics), is introduced into a Finite Element code in order to run simulations on aggregates and obtain the distributions of the maximum principal stress inside a Representative Volume Element. These distributions are then used to improve the classical Local Approach to Fracture, which estimates the probability for a microstructural defect to be loaded up to a critical level. (author) [fr]

  15. Diagnostic efficacy of optimised evaluation of planar MIBI myocardium perfusion scintigraphy: a probabilistic approach

    International Nuclear Information System (INIS)

    Kusmierek, J.; Plachcinska, A.

    1999-01-01

    Background: The Bayesian (probabilistic) approach to the results of a diagnostic test appears to be more informative than an interpretation of results in binary terms (having disease or not). The aim of our study was the analysis of the effect of an optimised evaluation of myocardium perfusion scintigrams on the probability of CAD in individual patients. Methods: 197 patients (132 males and 65 females) suspected of CAD, with no history of myocardial infarction were examined. Scintigraphic images were evaluated applying two methods of analysis: visual (semiquantitative) and quantitative, and the combination of both. The sensitivity and specificity of both methods (and their combination) in the detection of CAD were determined and optimal methods of scintigram evaluation, separately for males and females, were selected. All patients were subjected to coronary angiography. The pre-test probability of CAD was assessed according to Diamond (1) and the post-test probability was evaluated in accordance with Bayes's theorem. Patients were divided, according to a pre-test probability of CAD, into 3 groups: with low, medium and high probability of the disease. The same subdivision was made in relation to post-test probability of CAD. The numbers of patients in respective subgroups, before and after the test, were compared. Moreover, in order to test the reliability of post-test probability, its values were compared with real percentages of CAD occurrence among the patients under study, as demonstrated by the angiography. Results: The combination of visual and quantitative methods was accepted as the optimal method of male scintigram evaluation (with sensitivity and specificity equalling 95% and 82%, respectively) and a sole quantitative analysis as the optimal method of female scintigram evaluation (sensitivity and specificity amounted to 81% and 84%, respectively). In the subgroup of males the percentage of individuals with medium pre-test CAD probability equalled 52 and
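
    The Bayesian step is easy to make concrete. The sketch below turns pre-test probabilities into post-test probabilities with Bayes' theorem, using the sensitivity and specificity reported above for the combined method in males (95%/82%); the pre-test values are illustrative.

```python
# Post-test probability of CAD from a binary scan result via Bayes' theorem.
def post_test(prior, sens, spec, positive):
    """Bayes' theorem for a binary diagnostic test result."""
    if positive:
        return sens * prior / (sens * prior + (1 - spec) * (1 - prior))
    return (1 - sens) * prior / ((1 - sens) * prior + spec * (1 - prior))

sens, spec = 0.95, 0.82  # combined visual + quantitative evaluation, males
for prior in (0.20, 0.50, 0.80):      # low / medium / high pre-test groups
    print(f"pre-test {prior:.2f}: positive scan -> "
          f"{post_test(prior, sens, spec, True):.2f}, "
          f"negative scan -> {post_test(prior, sens, spec, False):.2f}")
```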

  16. Registration of indoor TLS data: in favor of a probabilistic approach initialized by geo-location

    International Nuclear Information System (INIS)

    Hullo, Jean-Francois

    2013-01-01

    Many pre-maintenance operations of industrial facilities currently resort on to three dimensional CAD models. The acquisition of these models is performed from point clouds measured by Terrestrial Laser Scanning (TLS). When the scenes are complex, several view points for scanning, also known as stations, are necessary to ensure the completeness and the density of the survey data. The generation of a global point cloud, i.e. the expression of all the acquired data in a common reference frame, is a crucial step called registration. During this process, the pose parameters are estimated. If the GNSS Systems are now a solution for many outdoor scenes, the registration of indoor TLS data still remains a challenge. The goal of this thesis is to improve the acquisition process of TLS data in industrial environments. The aim is to guarantee the precision and accuracy of acquired data, while optimizing on-site acquisition time and protocols by, as often as possible, freeing the operator from the constraints inherent to conventional topography surveys. In a first part, we consider the state of the art of the means and methods used during the acquisition of dense point clouds of complex interior scenes (Part I). In a second part, we study and evaluate the data available for the registration: terrestrial laser data, primitive reconstruction algorithms in point clouds and indoor geo-location Systems (Part II). In the third part, we then formalize and experiment a registration algorithm based on the use of matched primitives, reconstructed from per station point clouds ( Part III). We finally propose a probabilistic approach for matching primitives, allowing the integration of a priori information and uncertainty in the constraints System used for calculating poses (Part IV). The contributions of our work are as follows: - to take a critical look at current methods of TLS data acquisition in industrial environments, - to evaluate, through experimentations, the information

  17. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition of the Schouten product and give a decomposition of any homogeneous Poisson structure in an n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  18. Analysis of time-correlated single photon counting data: a comparative evaluation of deterministic and probabilistic approaches

    Science.gov (United States)

    Smith, Darren A.; McKenzie, Grant; Jones, Anita C.; Smith, Trevor A.

    2017-12-01

    We review various methods for analysing time-resolved fluorescence data acquired using the time-correlated single photon counting method in an attempt to evaluate their benefits and limitations. We have applied these methods to both experimental and simulated data. The relative merits of using deterministic approaches, such as the commonly used iterative reconvolution method, and probabilistic approaches, such as the smoothed exponential series method, the maximum entropy method and recently proposed basis pursuit denoising (compressed sensing) method, are outlined. In particular, we show the value of using multiple methods to arrive at the most appropriate choice of model. We show that the use of probabilistic analysis methods can indicate whether a discrete component or distribution analysis provides the better representation of the data.
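
    As a concrete instance of the deterministic route, the sketch below fits a reconvolved bi-exponential decay to simulated photon counts by weighted nonlinear least squares. The Gaussian IRF, lifetimes and noise are simulated assumptions rather than real TCSPC data.

```python
# Iterative-reconvolution fit: model = IRF (*) multi-exponential decay,
# fitted by weighted nonlinear least squares on simulated Poisson counts.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(8)
t = np.arange(0.0, 50.0, 0.05)                  # ns, channel grid
irf = np.exp(-0.5 * ((t - 5.0) / 0.15) ** 2)    # Gaussian IRF (assumed)
irf /= irf.sum()

def model(params):
    a1, tau1, a2, tau2 = params
    decay = a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)
    return np.convolve(irf, decay)[: t.size]    # reconvolution

true = (4000.0, 1.2, 1500.0, 6.0)
counts = rng.poisson(model(true))               # Poisson photon noise

# Weighted residuals (Neyman weights, common in TCSPC fitting).
def resid(p):
    return (counts - model(p)) / np.sqrt(np.maximum(counts, 1))

fit = least_squares(resid, x0=(3000.0, 0.8, 1000.0, 4.0),
                    bounds=([0, 0.01, 0, 0.01], np.inf))
print("fitted lifetimes: tau1 = %.2f ns, tau2 = %.2f ns"
      % (fit.x[1], fit.x[3]))
```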

  19. A tractable DDN-POMDP Approach to Affective Dialogue Modeling for General Probabilistic Frame-based Dialogue Systems

    OpenAIRE

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob; Traum, D.; Alexandersson, J.; Jonsson, A.; Zukerman, I.

    2007-01-01

    We propose a new approach to developing a tractable affective dialogue model for general probabilistic frame-based dialogue systems. The dialogue model, based on the Partially Observable Markov Decision Process (POMDP) and the Dynamic Decision Network (DDN) techniques, is composed of two main parts, the slot level dialogue manager and the global dialogue manager. Our implemented dialogue manager prototype can handle hundreds of slots; each slot might have many values. A first evaluation of th...

  20. A tractable DDN-POMDP Approach to Affective Dialogue Modeling for General Probabilistic Frame-based Dialogue Systems

    OpenAIRE

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob

    2006-01-01

    We propose a new approach to developing a tractable affective dialogue model for general probabilistic frame-based dialogue systems. The dialogue model, based on the Partially Observable Markov Decision Process (POMDP) and the Dynamic Decision Network (DDN) techniques, is composed of two main parts, the slot level dialogue manager and the global dialogue manager. It has two new features: (1) being able to deal with a large number of slots and (2) being able to take into account some aspects o...

  1. Need to use probabilistic risk approach in performance assessment of waste disposal facilities

    International Nuclear Information System (INIS)

    Bonano, E.J.; Gallegos, D.P.

    1991-01-01

    Regulations governing the disposal of radioactive, hazardous, and/or mixed wastes will likely require, either directly or indirectly, that the performance of disposal facilities be assessed quantitatively. Such analyses, commonly called "performance assessments," rely on the use of predictive models to arrive at a quantitative estimate of the potential impact of disposal on the environment and the safety and health of the public. It has been recognized that a suite of uncertainties affect the results of a performance assessment. These uncertainties are conventionally categorized as (1) uncertainty in the future state of the disposal system (facility and surrounding medium), (2) uncertainty in models (including conceptual models, mathematical models, and computer codes), and (3) uncertainty in data and parameters. Decisions regarding the suitability of a waste disposal facility must be made in light of these uncertainties. Hence, an approach is needed that would allow the explicit consideration of these uncertainties so that their impact on the estimated consequences of disposal can be evaluated. While most regulations for waste disposal do not prescribe the consideration of uncertainties, it is proposed that, even in such cases, a meaningful decision regarding the suitability of a waste disposal facility cannot be made without considering the impact of the attendant uncertainties. A probabilistic risk assessment (PRA) approach provides the formalism for considering the uncertainties and the technical basis that the decision makers can use in discharging their duties. A PRA methodology developed and demonstrated for the disposal of high-level radioactive waste provides a general framework for assessing the disposal of all types of wastes (radioactive, hazardous, and mixed). 15 refs., 1 fig., 1 tab

  2. A probabilistic multidimensional approach to quantify large wood recruitment from hillslopes in mountainous-forested catchments

    Science.gov (United States)

    Cislaghi, Alessio; Rigon, Emanuel; Lenzi, Mario Aristide; Bischetti, Gian Battista

    2018-04-01

    Large wood (LW) plays a key role in physical, chemical, environmental, and biological processes in most natural and seminatural streams. However, it is also a source of hydraulic hazard in anthropised territories. Recruitment from fluvial processes has been the subject of many studies, whereas less attention has been given to hillslope recruitment, which is linked to episodic and spatially distributed events and requires a reliable and accurate slope stability model and a hillslope-channel transfer model. The purpose of this study is to develop an innovative LW hillslope-recruitment estimation approach that combines forest stand characteristics in a spatially distributed form, a probabilistic multidimensional slope stability model able to include the reinforcement exerted by roots, and a hillslope-channel transfer procedure. The approach was tested on a small mountain headwater catchment in the eastern Italian Alps that is prone to shallow landslide and debris flow phenomena. The slope stability model (which had not been calibrated) performed accurately, both in identifying unstable areas according to the landslide inventory (AUC = 0.832) and in estimating LW volume in comparison with the LW volume produced by inventoried landslides (7702 m3, corresponding to a recurrence time of about 30 years on the susceptibility curve). The results showed that most LW potentially mobilised by landslides does not reach the channel network (only about 16% does), in agreement with the few data reported by other studies, as well as with the data normalized per unit length of channel and per unit length of channel per year (0-116 m3/km and 0-4 m3/km per year). This study represents an important contribution to LW research. A rigorous and site-specific estimation of LW hillslope recruitment should, in fact, be an integral part of more general studies on LW dynamics, of forest planning and management, and of the positioning of in-channel wood retention structures.

  3. Projections of Temperature-Attributable Premature Deaths in 209 U.S. Cities Using a Cluster-Based Poisson Approach

    Science.gov (United States)

    Schwartz, Joel D.; Lee, Mihye; Kinney, Patrick L.; Yang, Suijia; Mills, David; Sarofim, Marcus C.; Jones, Russell; Streeter, Richard; St. Juliana, Alexis; Peers, Jennifer

    2015-01-01

    Background: A warming climate will affect future temperature-attributable premature deaths. This analysis is the first to project these deaths at a near national scale for the United States using city and month-specific temperature-mortality relationships. Methods: We used Poisson regressions to model temperature-attributable premature mortality as a function of daily average temperature in 209 U.S. cities by month. We used climate data to group cities into clusters and applied an Empirical Bayes adjustment to improve model stability and calculate cluster-based month-specific temperature-mortality functions. Using data from two climate models, we calculated future daily average temperatures in each city under Representative Concentration Pathway 6.0. Holding population constant at 2010 levels, we combined the temperature data and cluster-based temperature-mortality functions to project city-specific temperature-attributable premature deaths for multiple future years which correspond to a single reporting year. Results within the reporting periods are then averaged to account for potential climate variability and reported as a change from a 1990 baseline in the future reporting years of 2030, 2050 and 2100. Results: We found temperature-mortality relationships that vary by location and time of year. In general, the largest mortality response during hotter months (April - September) was in July in cities with cooler average conditions. The largest mortality response during colder months (October-March) was at the beginning (October) and end (March) of the period. Using data from two global climate models, we projected a net increase in premature deaths, aggregated across all 209 cities, in all future periods compared to 1990. However, the magnitude and sign of the change varied by cluster and city. Conclusions: We found increasing future premature deaths across the 209 modeled U.S. cities using two climate model projections, based on constant temperature
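
    The core statistical step can be sketched with statsmodels on simulated data, as below: a Poisson regression of daily deaths on centred daily temperature for one city-month, then a crude projection under a warming scenario. The association strength and the +2 °C shift are assumptions; the real analysis adds city clusters and an Empirical Bayes adjustment.

```python
# Poisson regression of simulated daily deaths on daily mean temperature for
# one city-month, plus a crude +2 deg C scenario with population held fixed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)

temp = rng.normal(28.0, 3.0, 600)        # simulated July daily means, deg C
deaths = rng.poisson(np.exp(2.0 + 0.04 * (temp - 28.0)))  # assumed truth

X = sm.add_constant(temp - 28.0)         # centre temperature at month mean
fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
rr = np.exp(fit.params[1])
print(f"estimated relative risk per +1 deg C: {rr:.3f} "
      f"(true {np.exp(0.04):.3f})")

# Scenario: +2 deg C warming; compare fitted vs projected expected deaths.
warmer = fit.predict(sm.add_constant(temp + 2.0 - 28.0))
print(f"projected change in deaths over the sample: "
      f"{warmer.sum() - fit.mu.sum():+.0f}")
```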

  4. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    International Nuclear Information System (INIS)

    Dillstroem, Peter; Weilin Zang

    2007-11-01

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiling Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as a function of leak flow rate. However, when evaluated as a function of the fraction of crack length around the circumference, the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product of the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as a function of the fraction of crack length around the circumference. This is mainly due to a small leak probability, which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also, to check how off-centre cracks influence crack opening areas, new form factor solutions for the COA were developed, taking plastic deformation into account. - The influence from an off-centre crack position on the conditional

  5. Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds

    Science.gov (United States)

    Martínez-Torres, David; Miranda, Eva

    2018-01-01

    We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.

  6. Xplicit, a novel approach in probabilistic spatiotemporally explicit exposure and risk assessment for plant protection products.

    Science.gov (United States)

    Schad, Thorsten; Schulz, Ralf

    2011-10-01

    The quantification of risk (the likelihood and extent of adverse effects) is a prerequisite in regulatory decision making for plant protection products and is the goal of the Xplicit project. In its present development stage, realism is increased in the exposure assessment (EA), first by using real-world data on, e.g., landscape factors affecting exposure, and second, by taking the variability of key factors into account. Spatial and temporal variability is explicitly addressed. Scale dependencies are taken into account, which allows for risk quantification at different scales, for example, at landscape scale, an overall picture of the potential exposure of nontarget organisms can be derived (e.g., for all off-crop habitats in a given landscape); at local scale, exposure might be relevant to assess recovery and recolonization potential; intermediate scales might best refer to population level and hence might be relevant for risk management decisions (e.g., individual off-crop habitats). The Xplicit approach is designed to comply with a central paradigm of probabilistic approaches, namely, that each individual case that is derived from the variability functions employed should represent a potential real-world case. This is mainly achieved by operating in a spatiotemporally explicit fashion. Landscape factors affecting the local exposure of habitats of nontarget species (i.e., receptors) are derived from geodatabases. Variability in time is resolved by operating at discrete time steps, with the probability of events (e.g., application) or conditions (e.g., wind conditions) defined in probability density functions (PDFs). The propagation of variability of parameters into variability of exposure and risk is done using a Monte Carlo approach. Among the outcomes are expectancy values on the realistic worst-case exposure (predicted environmental concentration [PEC]), the probability p that the PEC exceeds the ecologically acceptable concentration (EAC) for a given

  7. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-01-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10^-4), 283 for the intensity approach (p = 2 × 10^-6) and 282
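
    The PDF-fusion step can be sketched for a single voxel as below: an intensity-conditioned PDF and a geometry-conditioned PDF over candidate HU values are multiplied into a posterior whose mean is the minimum-mean-square-error estimate. The Gaussian forms and all numbers are illustrative assumptions.

```python
# Fuse an intensity-conditioned PDF and a geometry-conditioned PDF of
# electron density (here in HU) for one voxel; posterior mean = MMSE estimate.
import numpy as np

hu = np.linspace(-200.0, 1600.0, 2001)   # candidate HU values for one voxel

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# PDF given the voxel's T1-weighted intensity (ambiguous: bone vs air dark).
p_intensity = 0.5 * gaussian(hu, 900.0, 250.0) + 0.5 * gaussian(hu, -50.0, 80.0)
# PDF given the voxel's location after deformable registration to the atlas
# (this voxel maps near cortical bone in the reference anatomy).
p_geometry = gaussian(hu, 800.0, 150.0)

posterior = p_intensity * p_geometry
posterior /= np.trapz(posterior, hu)

mmse_hu = np.trapz(hu * posterior, hu)   # posterior mean = MMSE estimate
print(f"posterior-mean HU = {mmse_hu:.0f}")
```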

  8. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    Science.gov (United States)

    Sudhan Reddy Gudur, Madhu; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-11-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10^-4), 283 for the intensity approach (p = 2 × 10^-6) and 282 without density

  9. Modifications to POISSON

    International Nuclear Information System (INIS)

    Harwood, L.H.

    1981-01-01

    At MSU we have used the POISSON family of programs extensively for magnetic field calculations. In the presently super-saturated computer situation, reducing the run time for the program is imperative. Thus, a series of modifications have been made to POISSON to speed up convergence. Two of the modifications aim at having the first guess solution as close as possible to the final solution. The other two aim at increasing the convergence rate. In this discussion, a working knowledge of POISSON is assumed. The amount of new code and expected time saving for each modification is discussed

  10. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms

    Science.gov (United States)

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  11. A probabilistic approach to quantifying hydrologic thresholds regulating migration of adult Atlantic salmon into spawning streams

    Science.gov (United States)

    Lazzaro, G.; Soulsby, C.; Tetzlaff, D.; Botter, G.

    2017-03-01

    Atlantic salmon is an economically and ecologically important fish species, whose survival is dependent on successful spawning in headwater rivers. Streamflow dynamics often have a strong control on spawning because fish require sufficiently high discharges to move upriver and enter spawning streams. However, these streamflow effects are modulated by biological factors such as the number and the timing of returning fish in relation to the annual spawning window in the fall/winter. In this paper, we develop and apply a novel probabilistic approach to quantify these interactions using a parsimonious outflux-influx model linking the number of female salmon emigrating (i.e., outflux) and returning (i.e., influx) to a spawning stream in Scotland. The model explicitly accounts for the interannual variability of the hydrologic regime and the hydrological connectivity of spawning streams to main rivers. Model results are evaluated against a detailed long-term (40 years) hydroecological data set that includes annual fluxes of salmon, allowing us to explicitly assess the role of discharge variability. The satisfactory model results show quantitatively that hydrologic variability contributes to the observed dynamics of salmon returns, with a good correlation between the positive (negative) peaks in the immigration data set and the exceedance (nonexceedance) probability of a threshold flow (0.3 m3/s). Importantly, model performance deteriorates when the interannual variability of flow regime is disregarded. The analysis suggests that flow thresholds and hydrological connectivity for spawning return represent a quantifiable and predictable feature of salmon rivers, which may be helpful in decision making where flow regimes are altered by water abstractions.
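
    The threshold idea can be sketched directly: given a daily discharge series for the spawning window, the within-season exceedance probability of the reported 0.3 m3/s threshold is estimated per year, and its interannual spread is what the model links to salmon returns. The lognormal flow series below is a synthetic assumption.

```python
# Within-season exceedance probability of the migration threshold flow,
# estimated per year from synthetic daily discharge data.
import numpy as np

rng = np.random.default_rng(10)

years = 40
q_threshold = 0.3  # m3/s, reported migration threshold

exceedance = []
for _ in range(years):
    # Synthetic Oct-Dec daily flows for one spawning season (92 days).
    q = rng.lognormal(mean=np.log(0.25), sigma=0.7, size=92)
    exceedance.append((q > q_threshold).mean())

exceedance = np.array(exceedance)
print(f"mean within-season exceedance: {exceedance.mean():.2f}")
print(f"interannual spread (5-95%): "
      f"[{np.percentile(exceedance, 5):.2f}, "
      f"{np.percentile(exceedance, 95):.2f}]")
# Years in the lower tail correspond to poor hydrological connectivity and,
# in the model, to negative anomalies in salmon returns.
```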

  12. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
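
    The Monte Carlo step described above can be sketched generically: treat each driving parameter as a random variable, sample, evaluate a physics-based limit state, and count failures. The limit-state function and all distribution parameters below are hypothetical stand-ins, not the NASA PDA models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000   # Monte Carlo samples

    # Hypothetical physics-based limit state: failure when load exceeds capacity.
    # Each driving parameter is a random variable with its own distribution.
    capacity = rng.normal(loc=100.0, scale=10.0, size=n)
    load = rng.lognormal(mean=np.log(70.0), sigma=0.15, size=n)

    p_fail = np.mean(load > capacity)
    se = np.sqrt(p_fail * (1.0 - p_fail) / n)   # Monte Carlo standard error
    print(f"estimated failure probability: {p_fail:.2e} +/- {se:.1e}")

    # A crude sensitivity check: rerun with one parameter's spread reduced
    capacity_tight = rng.normal(loc=100.0, scale=5.0, size=n)
    print(f"with tighter capacity spread:  {np.mean(load > capacity_tight):.2e}")
    ```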

  13. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
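
    The abstract does not give ProbDet's exact formulation, but a network-level Bayesian combination of the kind it describes can be sketched as a naive-Bayes sum of per-station log-likelihood ratios, in which quiet (non-detecting) stations still contribute evidence. All probabilities below are hypothetical.

    ```python
    import numpy as np

    def combine_station_probabilities(p_obs_event, p_obs_noise, prior=1e-4):
        """Naive-Bayes network combination for one event hypothesis.

        p_obs_event / p_obs_noise: per-station likelihoods of the observed
        waveform window under the event and noise hypotheses. Quiet stations
        still contribute: their likelihood ratio is simply below one.
        """
        log_lr = np.log(p_obs_event) - np.log(p_obs_noise)
        log_odds = np.log(prior / (1.0 - prior)) + log_lr.sum()
        return 1.0 / (1.0 + np.exp(-log_odds))   # posterior event probability

    # Three stations with elevated likelihoods, two quiet ones (all hypothetical)
    p = combine_station_probabilities(
        p_obs_event=np.array([0.9, 0.8, 0.9, 0.3, 0.3]),
        p_obs_noise=np.array([0.05, 0.10, 0.05, 0.50, 0.50]),
    )
    print(f"posterior probability of an event: {p:.4f}")
    ```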

  14. Scaling the Poisson Distribution

    Science.gov (United States)

    Farnsworth, David L.

    2014-01-01

    We derive the additive property of Poisson random variables directly from the probability mass function. An important application of the additive property to quality testing of computer chips is presented.
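
    For reference, the additive property follows from the probability mass function by a short convolution argument; the derivation below is a standard reconstruction (only the abstract is available here), not a quotation of the paper.

    ```latex
    % Independent X ~ Pois(\lambda) and Y ~ Pois(\mu):
    \begin{aligned}
    P(X+Y=n) &= \sum_{k=0}^{n} \frac{e^{-\lambda}\lambda^{k}}{k!}
                \cdot \frac{e^{-\mu}\mu^{\,n-k}}{(n-k)!}
              = \frac{e^{-(\lambda+\mu)}}{n!} \sum_{k=0}^{n}
                \binom{n}{k}\lambda^{k}\mu^{\,n-k} \\
             &= \frac{e^{-(\lambda+\mu)}\,(\lambda+\mu)^{n}}{n!},
    \end{aligned}
    ```

    so X + Y ~ Pois(lambda + mu). In the quality-testing application, defect counts pooled across independent chips therefore remain Poisson with the summed rate.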

  15. On Poisson Nonlinear Transformations

    Directory of Open Access Journals (Sweden)

    Nasir Ganikhodjaev

    2014-01-01

    We construct the family of Poisson nonlinear transformations defined on the countable sample space of nonnegative integers and investigate their trajectory behavior. We have proved that these nonlinear transformations are regular.

  16. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    A new mixture of the Modified Exponential (ME) and Poisson distributions is introduced in this paper. Taking the maximum of ME random variables, where the sample size follows a zero-truncated Poisson distribution, we derive the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses both increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We also investigate some mathematical properties of the distribution, along with its information entropies and order statistics. Parameters are estimated using the maximum likelihood procedure. Finally, we illustrate an application of our distribution to real data.
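
    The construction can be checked by simulation. The sketch below uses a plain Exponential in place of the paper's Modified Exponential (whose exact form the abstract omits) and verifies the generic identity for the maximum of a zero-truncated-Poisson number of i.i.d. variables, F_X(x) = (exp(lam*F(x)) - 1) / (exp(lam) - 1).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    lam, theta, n_sim = 2.5, 1.0, 100_000   # ZTP rate; Exponential rate (stand-in)

    # Zero-truncated Poisson sample sizes by simple rejection of zeros
    N = rng.poisson(lam, size=3 * n_sim)
    N = N[N > 0][:n_sim]

    # X = max of N variables; plain Exponential stands in for the paper's
    # Modified Exponential, whose exact form the abstract does not give
    X = np.array([rng.exponential(1.0 / theta, size=n).max() for n in N])

    # Generic identity: F_X(x) = (exp(lam * F(x)) - 1) / (exp(lam) - 1)
    x0 = 1.5
    F = 1.0 - np.exp(-theta * x0)
    cdf_theory = (np.exp(lam * F) - 1.0) / (np.exp(lam) - 1.0)
    print(f"empirical   P(X <= {x0}): {(X <= x0).mean():.4f}")
    print(f"theoretical P(X <= {x0}): {cdf_theory:.4f}")
    ```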

  17. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers

  18. An lp-norm approach to robust probabilistic inspection of plate-like structure defects with guided waves

    Science.gov (United States)

    Li, Dan; Yu, Hao; Xu, Feng; Liu, Da Peng; Qiu Zhang, Jian; Ta, De an

    2017-10-01

    In this paper, an lp-norm approach to robust probabilistic inspection of plate-like structure defects with guided waves is proposed. Analytical results show that the more outliers there are in the measurements, the smaller the preferred p. Moreover, the relationship between our coefficient and the conventional signal difference coefficient (SDC) is also provided. The experimental results verify the analytical ones and also show that our approach tolerates impulsive noise interference well: the unexpected artifacts in the reconstructed tomographic image caused by the impulsive interference are eliminated.

  19. Probabilistic Data Integration

    NARCIS (Netherlands)

    van Keulen, Maurice

    2017-01-01

    Probabilistic data integration is a specific kind of data integration where integration problems such as inconsistency and uncertainty are handled by means of a probabilistic data representation. The approach is based on the view that data quality problems (as they occur in an integration process)

  20. A probabilistic approach on residual strength and damage buildup of high-performance fibers

    NARCIS (Netherlands)

    Knoester, Henk; Hulshof, Joost; Meester, Ronald

    2017-01-01

    An elementary probabilistic model for fiber failure, developed by Coleman in the 1950s, predicts a Weibull-distributed time-to-failure for fibers subject to a constant load. This has been experimentally confirmed, not only for fibers but for load-bearing products in general.

  1. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    Science.gov (United States)

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune's (1970) source model, and direct P- and S-waves propagating in a layered velocity model, characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum then depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized by adopting a probabilistic approach to parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and then we explore the joint a-posteriori probability density function associated with the cost function around that minimum to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost function. The numerical integration of the pdf finally provides the mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
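
    The basin-hopping step can be sketched with standard tools. Below, a generalized Brune-type log-spectrum (spectral level, corner frequency, high-frequency decay, attenuation t*) is fit to synthetic noisy data with scipy's basinhopping; all parameter values are hypothetical, and the misfit is the L2 norm in log-amplitude, as in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import basinhopping

    rng = np.random.default_rng(3)

    # Generalized Brune-type displacement spectrum (all values hypothetical):
    # log S(f) = log(omega0) - log(1 + (f/fc)**gamma) - pi * f * t_star
    def log_spectrum(f, omega0, fc, gamma, t_star):
        return np.log(omega0) - np.log(1.0 + (f / fc) ** gamma) - np.pi * f * t_star

    freqs = np.logspace(-1, 1.5, 120)                 # 0.1 to ~30 Hz
    true = dict(omega0=2e-6, fc=3.0, gamma=2.0, t_star=0.02)
    obs = log_spectrum(freqs, **true) + rng.normal(0.0, 0.1, freqs.size)

    # L2-norm misfit over p = (log omega0, log fc, gamma, t_star)
    def misfit(p):
        pred = log_spectrum(freqs, np.exp(p[0]), np.exp(p[1]), p[2], p[3])
        return np.sum((obs - pred) ** 2)

    # Basin-hopping = random jumps + deterministic local minimization
    x0 = np.array([np.log(1e-6), 0.0, 2.0, 0.01])
    res = basinhopping(misfit, x0, niter=50,
                       minimizer_kwargs={"method": "Nelder-Mead"})
    print("omega0, fc, gamma, t*:",
          np.exp(res.x[0]), np.exp(res.x[1]), res.x[2], res.x[3])
    ```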

  2. Probabilistic approach to cloud and snow detection on Advanced Very High Resolution Radiometer (AVHRR) imagery

    Science.gov (United States)

    Musial, J. P.; Hüsler, F.; Sütterlin, M.; Neuhaus, C.; Wunderle, S.

    2014-03-01

    Derivation of probability estimates complementary to geophysical data sets has gained special attention over the last years. Information about the confidence level of provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input consists of a cloud mask which allows discrimination between surface and cloud signals. Furthermore, the surface information is divided between snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited to the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed, which provides three types of probability estimates: cloudy/clear-sky, cloudy/snow and clear-sky/snow conditions. As opposed to the majority of available techniques, which are usually based on a decision-tree approach, in the PCM algorithm all spectral, angular and ancillary information is used in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test was overcome by the concept of a multidimensional information space, which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms. The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging

  3. An integrated approach to the probabilistic assessments of aircraft strikes and structural mode of damages to nuclear power plants

    International Nuclear Information System (INIS)

    Godbout, P.; Brais, A.

    1975-01-01

    The possibilities of an aircraft striking a Canadian nuclear power plant in the vicinity of an airport and of inducing structural failure modes have been evaluated. This evaluation, together with other studies, may enhance decisions in the development of general criteria for the siting of reactors near airports. The study made use, for assessment, of the probabilistic approach and made judicious applications of the finite Canadian, French, German, American and English resources that were available. The tools, techniques and methods used for achieving the above, form what may be called an integrated approach. This method of approach requires that the study be made in six consecutive steps as follows: the qualitative evaluation of having an aircraft strike on a site situated near an airport with the use of the logic model technique; the statistical data gathering on aircraft movements and accidents; evaluating the probability distribution and calculating the basic event probabilities; evaluating the probability of an aircraft strike and the application of the sensitivity approach; generating the probability density distribution versus strike impact energy, that is, the evaluation of the energy envelope; and the probabilistic evaluation of structural failure mode inducements

  4. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    Science.gov (United States)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a

  5. Improvement of methods for estimating the reliability of multifunctional corporate telecommunications based on a logical-probabilistic approach

    Directory of Open Access Journals (Sweden)

    Lozbinev F.Yu.

    2015-12-01

    The organizational and information-technology particularities of the object of research are considered. The principles of construction and the process of creation and evolution of the network's topological scheme are analyzed. The characteristics of the radio-electronic facilities used are described. Based on a logical-probabilistic approach, algorithms are designed for calculating the availability factor of network equipment upon failure (malfunction). The South, North and West pathways are evaluated for different variants of equipment and topological schemes.

  6. Combination of the deterministic and probabilistic approaches for risk-informed decision-making in US NRC regulatory guides

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2001-06-01

    The report responds to the trend where probabilistic safety analyses are attached, on a voluntary basis (as yet), to the mandatory deterministic assessment of modifications of NPP systems or operating procedures, resulting in risk-informed type documents. It contains a nearly complete Czech translation of US NRC Regulatory Guide 1.177 and presents some suggestions for improving a) PSA study applications; b) the development of NPP documents for the regulatory body; and c) the interconnection between PSA and traditional deterministic analyses as contained in the risk-informed approach. (P.A.)

  7. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  8. Wireless capsule endoscopy video segmentation using an unsupervised learning approach based on probabilistic latent semantic analysis with scale invariant features.

    Science.gov (United States)

    Shen, Yao; Guturu, Parthasarathy Partha; Buckles, Bill P

    2012-01-01

    Since wireless capsule endoscopy (WCE) is a novel technology for recording videos of the digestive tract of a patient, the problem of segmenting the WCE video of the digestive tract into subvideos corresponding to the entrance, stomach, small intestine, and large intestine regions is not well addressed in the literature. The few papers addressing this problem follow supervised learning approaches that presume the availability of a large database of correctly labeled training samples. Considering the difficulties in procuring sizable WCE training data sets needed for achieving high classification accuracy, we introduce in this paper an unsupervised learning approach that employs the Scale Invariant Feature Transform (SIFT) for extraction of local image features and the probabilistic latent semantic analysis (pLSA) model used in linguistic content analysis for data clustering. Results of experimentation indicate that this method compares well in classification accuracy with the state-of-the-art supervised classification approaches to WCE video segmentation.

  9. An integral approach to the use of probabilistic risk assessment methods

    International Nuclear Information System (INIS)

    Schwarzblat, M.; Arellano, J.

    1987-01-01

    In this chapter some of the work developed at the Instituto de Investigaciones Electricas in the area of probabilistic risk analysis is presented. Work in this area has focused on two directions: development and implementation of methods, and applications to real systems. The first part of this paper describes methods development and implementation, presenting an integrated package of computer programs for fault tree analysis. In the second part some of the most important applications developed for real systems are presented. (author)

  10. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Directory of Open Access Journals (Sweden)

    Tvrdá Katarína

    2017-01-01

    Interaction between a foundation plate and the subgrade can be described by different mathematical-physical models. An elastic foundation can be modelled by different types of models, e.g., a one-parametric model, a two-parametric model, or a comprehensive model; here the Boussinesq model (elastic half-space) has been used. The article deals with deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-to-node contact elements. Finally, the obtained results are presented.

  11. Probabilistic Approach to Optimizing Active and Reactive Power Flow in Wind Farms Considering Wake Effects

    Directory of Open Access Journals (Sweden)

    Yong-Cheol Kang

    2013-10-01

    This paper presents a novel probabilistic optimization algorithm for simultaneous active and reactive power dispatch in power systems with significant wind power integration. Two types of load and wind-speed uncertainties have been assumed that follow normal and Weibull distributions, respectively. A PV bus model for wind turbines and the wake effect for correlated wind speed are used to achieve accurate AC power flow analysis. The power dispatch algorithm for a wind-power integrated system is modeled as a probabilistic optimal power flow (P-OPF) problem, which is operated through fixed power factor control to supply reactive power. The proposed P-OPF framework also considers emission information, which clearly reflects the impact of the energy source on the environment. The P-OPF was tested on a modified IEEE 118-bus system with two wind farms. The results show that the proposed technique provides better system operation performance evaluation, which is helpful in making decisions about power system optimal dispatch under conditions of uncertainty.
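
    The probabilistic ingredients (normal loads, Weibull wind speeds) can be sketched by plain Monte Carlo. The power curve, farm size and load figures below are hypothetical, and the wake-induced correlation between turbines, which the paper models, is deliberately omitted to keep the sketch short.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Uncertainties per the paper: Weibull wind speed, normally distributed load
    k, c = 2.0, 8.0                                  # hypothetical shape/scale (m/s)
    wind = c * rng.weibull(k, size=n)
    load = rng.normal(loc=50.0, scale=5.0, size=n)   # MW, hypothetical bus load

    # Piecewise power curve of a hypothetical 2-MW turbine (wake effects omitted)
    def turbine_power(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2.0):
        ramp = p_rated * (v - v_in) / (v_rated - v_in)
        power = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
        return np.where((v >= v_rated) & (v <= v_out), p_rated, power)

    farm_mw = 30 * turbine_power(wind)               # 30 identical turbines
    net = load - farm_mw                             # demand the rest of the grid covers
    p5, p50, p95 = np.percentile(net, [5, 50, 95])
    print(f"net demand (MW): P5 = {p5:.1f}, P50 = {p50:.1f}, P95 = {p95:.1f}")
    ```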

  12. Paretian Poisson Processes

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin 'Paretian Poisson processes'. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.

  13. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    This study evaluates the performance of passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for the time and frequency domains evaluation to analyze the behavior of controlling systems. In addition, the autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and ARMA neural network model are suitable to evaluate and predict the response of coupled building-controller systems.

  14. Linear and Nonlinear Guided Wave Imaging of Impact Damage in CFRP Using a Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Jan Hettler

    2016-11-01

    The amount and variety of composite structures that need to be inspected for the presence of impact damage has grown significantly in the last few decades. In this paper, an application of a probabilistic ultrasonic guided wave imaging technique for impact damage detection in carbon fiber-reinforced polymers (CFRP) is presented. On the one hand, a linear, baseline-dependent technique utilizing the well-known correlation-based RAPID method and an array of piezoelectric transducers is applied to detect impact-induced damage in plate-like composite structures. On the other hand, a baseline-independent nonlinear extension of the standard RAPID method is proposed, and its performance is demonstrated both numerically and experimentally. Compared to the conventional RAPID, the baseline-free version suffers from somewhat lower imaging quality. However, this drawback is compensated by the fact that no damage-free (intact) baseline is necessary for successful imaging of damage.

  15. Substation design improvement with a probabilistic reliability approach using the TOPASE program

    Energy Technology Data Exchange (ETDEWEB)

    Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)

    1997-12-31

    TOPASE (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, was described. TOPASE serves the dual objective of assisting in the automation of HV substation studies and of enabling electrical systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.

  16. A Probabilistic Approach to Control of Complex Systems and Its Application to Real-Time Pricing

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2014-01-01

    Control of complex systems is one of the fundamental problems in control theory. In this paper, a control method for complex systems modeled by a probabilistic Boolean network (PBN) is studied. A PBN is widely used as a model of complex systems such as gene regulatory networks. For a PBN, the structural control problem is newly formulated. In this problem, a discrete probability distribution appearing in a PBN is controlled by a continuous-valued input. For this problem, an approximate solution method using a matrix-based representation of a PBN is proposed: the problem is approximated by a linear programming problem. Furthermore, the proposed method is applied to the design of real-time pricing systems for electricity. Electricity conservation is achieved by appropriately determining the electricity price over time. The effectiveness of the proposed method is demonstrated by a numerical example on real-time pricing systems.

  17. Context Prediction of Mobile Users Based on Time-Inferred Pattern Networks: A Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Yong-Hyuk Kim

    2013-01-01

    We present a probabilistic method for predicting the context of mobile users based on their historic context data. The presented method predicts general context based on probability theory through a novel graphical data structure, a kind of weighted directed multigraph. User context data are transformed into the new graphical structure, in which each node represents a context or a combined context and each directed edge indicates a context transfer, with a time weight inferred from the corresponding time data. We also consider the periodic property of context data, and we devise a solution for context data with this property. Through testing, we show the merits of the presented method.

  18. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  19. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose": a risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but one that also explores the "full range of defensible and reasonable parameter distributions"; system performance evaluations should not be unduly influenced by "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where used such values need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and in selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety

  20. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    Science.gov (United States)

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    The definition of plausible worst-case tsunamigenic scenarios plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about critical parameters involved in near-field tsunami inundation processes, i.e. the spatial distribution of slip, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17°S to 24°S. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model and published interseismic coupling (ISC) distributions. As a result, we find four high slip deficit zones, interpreted as major seismic asperities of the gap, which are used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes ranging from Mw 8.4 to Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All the scenarios are simulated with the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five major coastal cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high-resolution data on inundation depth, runup, coastal currents and sea level elevation. The

  1. Fractional Poisson Fields and Martingales

    Science.gov (United States)

    Aletti, Giacomo; Leonenko, Nikolai; Merzbach, Ely

    2018-01-01

    We present new properties for the Fractional Poisson process (FPP) and the Fractional Poisson field on the plane. A martingale characterization for FPPs is given. We extend this result to Fractional Poisson fields, obtaining some other characterizations. The fractional differential equations are studied. We consider a more general Mixed-Fractional Poisson process and show that this process is the stochastic solution of a system of fractional differential-difference equations. Finally, we give some simulations of the Fractional Poisson field on the plane.

  2. Estimation of Poisson noise in spatial domain

    Science.gov (United States)

    Švihlík, Jan; Fliegel, Karel; Vítek, Stanislav; Kukal, Jaromír.; Krbcová, Zuzana

    2017-09-01

    This paper deals with modeling of astronomical images in the spatial domain. We consider astronomical light images contaminated by dark current, which is modeled by a Poisson random process. The dark frame image maps the thermally generated charge of the CCD sensor. In this paper, we solve the problem of the addition of two Poisson random variables. First, a noise analysis of images obtained from the astronomical camera is performed, which allows estimating the parameters of the Poisson probability mass function in every pixel of the acquired dark frame. Then the resulting distributions of the light image can be found. If the distributions of the light image pixels are identified, then a denoising algorithm can be applied. The performance of the Bayesian approach in the spatial domain is compared with the direct approach based on the method of moments and dark frame subtraction.
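
    A minimal version of the per-pixel estimation step: for a Poisson variable the mean equals the rate, so averaging a stack of dark frames estimates the dark-current parameter in every pixel (method of moments). The frame sizes and rates below are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Hypothetical per-pixel dark-current rates and a stack of 100 dark frames;
    # each pixel count is Poisson with its own rate
    true_lambda = rng.uniform(1.0, 6.0, size=(64, 64))
    stack = rng.poisson(true_lambda, size=(100, 64, 64))

    # Method of moments: Poisson mean = lambda, so the stack average estimates
    # the dark-current parameter in every pixel
    lam_hat = stack.mean(axis=0)
    print(f"mean absolute error of per-pixel estimates: "
          f"{np.abs(lam_hat - true_lambda).mean():.3f}")

    # Additivity: light-frame counts = Poisson(signal) + Poisson(dark)
    # = Poisson(signal + dark), so the dark rates feed directly into the
    # light-image distributions discussed in the abstract
    ```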

  3. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    Science.gov (United States)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within a dynamic engineering system. The autonomy property of intelligent agents is defined as the agents' ability to self-activate, deactivate, or completely redefine their role in the analysis. This property, together with the ability to model interacting failure mechanisms of system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  4. Probabilistic approach to diffusion in shear flows of generalized viscoelastic second-grade fluids

    International Nuclear Information System (INIS)

    Wafo Soh, C

    2010-01-01

    We study diffusion in point-source-driven shear flows of generalized second-grade fluids. We start by obtaining exact solutions of shear flows triggered by point sources under various boundary conditions. For unrestricted flows, we demonstrate that the velocity distribution is the probability density function of a coupled or uncoupled continuous-time random walk. In the first instance, the motion is described by a compound Poisson process with an explicit probability density function corresponding to the velocity distribution. The average waiting time in this situation is finite and is identified with the structural relaxation time. In the second case, we obtain an explicit formula for the probability density function in terms of special functions. In both cases, the probability density functions of the associated stochastic processes are leptokurtic at all finite times with variances linear in time. By using the method of images, we infer velocity fields for restricted flows from those of unrestricted flows. Equipped with some exact expressions of the velocity field, we analyze advection–diffusion via the Feynman–Kac formula, which lends itself naturally to Monte Carlo simulation
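
    The compound-Poisson claims are easy to verify numerically: with jump rate lam and i.i.d. jumps J, the cumulants are kappa_n = lam*t*E[J^n], so the variance is linear in t and the excess kurtosis is positive (leptokurtic) at all finite t. A sketch with standard-normal jumps, all values hypothetical:

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(5)
    lam, t, n_paths = 3.0, 2.0, 100_000    # jump rate, elapsed time, sample paths

    # Compound Poisson value at time t: X(t) = sum of N iid jumps, N ~ Poisson(lam*t)
    n_jumps = rng.poisson(lam * t, size=n_paths)
    X = np.array([rng.normal(0.0, 1.0, size=n).sum() for n in n_jumps])

    # Variance linear in time: Var[X(t)] = lam * t * E[J^2] = lam * t here
    print(f"empirical variance: {X.var():.3f}   (theory: {lam * t:.3f})")

    # Leptokurtic at finite t: excess kurtosis = E[J^4] / (lam * t * E[J^2]^2) > 0
    print(f"excess kurtosis:    {kurtosis(X):.3f}   (theory: {3.0 / (lam * t):.3f})")
    ```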

  5. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  6. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  7. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  8. Stable oscillations of a predator-prey probabilistic cellular automaton: a mean-field approach

    Energy Technology Data Exchange (ETDEWEB)

    Tome, Tania; Carvalho, Kelly C de [Instituto de Física, Universidade de Sao Paulo, Caixa Postal 66318, 05315-970 Sao Paulo (Brazil)]

    2007-10-26

    We analyze a probabilistic cellular automaton describing the dynamics of coexistence of a predator-prey system. The individuals of each species are localized over the sites of a lattice and the local stochastic updating rules are inspired by the processes of the Lotka-Volterra model. Two levels of mean-field approximations are set up. The simple approximation is equivalent to an extended patch model, a simple metapopulation model with patches colonized by prey, patches colonized by predators and empty patches. This approximation is capable of describing the limited available space for species occupancy. The pair approximation is moreover able to describe two types of coexistence of prey and predators: one where population densities are constant in time and another displaying self-sustained time oscillations of the population densities. The oscillations are associated with limit cycles and arise through a Hopf bifurcation. They are stable against changes in the initial conditions and, in this sense, they differ from the Lotka-Volterra cycles which depend on initial conditions. In this respect, the present model is biologically more realistic than the Lotka-Volterra model.
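
    The simple (one-site) mean-field level can be sketched as a Levins-style patch ODE in which the empty-patch fraction caps growth; consistent with the abstract, this level settles to constant densities, while self-sustained oscillations require the pair approximation. The rates below are hypothetical stand-ins for the automaton's local rules.

    ```python
    import numpy as np

    # Simple mean-field level as a Levins-style patch model: x = prey-occupied
    # fraction, y = predator-occupied fraction, e = 1 - x - y empty patches.
    c, p, d = 1.0, 2.0, 0.5      # prey colonization, predation, predator death
    x, y = 0.4, 0.1
    dt = 0.01

    for _ in range(5000):
        e = 1.0 - x - y          # limited space, absent from classic Lotka-Volterra
        dx = c * x * e - p * x * y
        dy = p * x * y - d * y
        x, y = x + dt * dx, y + dt * dy

    # The one-site approximation converges to constant coexistence densities
    print(f"stationary densities: prey = {x:.3f}, predator = {y:.3f}")
    ```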

  9. Reliability calculation of cracked components using probabilistic fracture mechanics and a Markovian approach

    International Nuclear Information System (INIS)

    Schmidt, T.

    1988-01-01

    The numerical reliability calculation of cracked construction components under cyclic fatigue stress can be done with the help of probabilistic fracture mechanics models. An alternative to the Monte Carlo simulation method is examined; the alternative method describes the failure process by means of a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, with the effects of inspections and repairs also being considered. The probability of failure and the expected failure frequency can be determined as functions of time from the transition and conditional probabilities of the original or derived Markov process. For concrete calculations, an approximating Markov chain is designed which, under certain conditions, gives a sufficient approximation of the original Markov process and of the reliability characteristics determined by it. Application of the MARKOV program code developed from this algorithm shows sufficient agreement with Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (PWR)'), specifically, the reliability of the main coolant line. (orig./HP)
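
    A toy version of the Markov alternative: crack states form a chain with an absorbing "failed" state, load cycles advance the state distribution, and periodic inspections repair detected large cracks. The transition probabilities, detection probability and inspection interval below are all hypothetical.

    ```python
    import numpy as np

    # Discrete-time Markov chain over crack states per load cycle (hypothetical):
    # 0 = small crack, 1 = large crack, 2 = failed (absorbing)
    P = np.array([[0.995, 0.005, 0.000],
                  [0.000, 0.990, 0.010],
                  [0.000, 0.000, 1.000]])

    # Inspection and repair: a detected large crack is restored to "small"
    def inspect(dist, detection_prob=0.9):
        out = dist.copy()
        repaired = detection_prob * out[1]
        out[1] -= repaired
        out[0] += repaired
        return out

    dist = np.array([1.0, 0.0, 0.0])          # start with a small crack
    for cycle in range(1, 20_001):
        dist = dist @ P
        if cycle % 5_000 == 0:                # hypothetical inspection interval
            dist = inspect(dist)
    print(f"failure probability after 20,000 cycles: {dist[2]:.4f}")
    ```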

  10. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) it includes a variety of geologic models, (2) it uses an analytic methodology instead of Monte Carlo simulation, (3) it possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) it runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  11. Interval multistage joint-probabilistic integer programming approach for water resources allocation and management.

    Science.gov (United States)

    Gu, J J; Huang, G H; Guo, P; Shen, N

    2013-10-15

    In this study, an interval multistage joint-probabilistic integer programming method was developed to address certain problems in water resource regulation. This method effectively deals with data in the form of intervals and probability distribution. It can also process uncertain data in the form of joint probabilities. The proposed method can also reflect the linkage and dynamic variability between particular stages in multi-stage planning. Sensitivity analysis on moderate violations and security constraints showed that the degree of constraint violation was closely linked to the final benefits of the system. The developed method was applied in the case study of the joint-operation of the Tianzhuang and Bashan Reservoirs in Huaihe River, China. In this case study, the proposed method can deal with the water shortage problems downstream and the distribution problems caused by excess water in the reservoir. It can also guarantee the optimization of long-term water usage of both Reservoirs and the river downstream. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Identifying virus-cell fusion in two-channel fluorescence microscopy image sequences based on a layered probabilistic approach.

    Science.gov (United States)

    Godinez, William J; Lampe, Marko; Koch, Peter; Eils, Roland; Müller, Barbara; Rohr, Karl

    2012-09-01

    The entry process of virus particles into cells is decisive for infection. In this work, we investigate fusion of virus particles with the cell membrane via time-lapse fluorescence microscopy. To automatically identify fusion for single particles based on their intensity over time, we have developed a layered probabilistic approach. The approach decomposes the action of a single particle into three abstractions: the intensity over time, the underlying temporal intensity model, as well as a high level behavior. Each abstraction corresponds to a layer and these layers are represented via stochastic hybrid systems and hidden Markov models. We use a maxbelief strategy to efficiently combine both representations. To compute estimates for the abstractions we use a hybrid particle filter and the Viterbi algorithm. Based on synthetic image sequences, we characterize the performance of the approach as a function of the image noise. We also characterize the performance as a function of the tracking error. We have also successfully applied the approach to real image sequences displaying pseudotyped HIV-1 particles in contact with host cells and compared the experimental results with ground truth obtained by manual analysis.
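
    The hidden-Markov layer of such an approach can be sketched with a standard Viterbi decoder over a two-state ("intact" vs. "fused") intensity model; the emission and transition numbers below are hypothetical, not the paper's calibrated values.

    ```python
    import numpy as np

    def viterbi(log_emission, log_trans, log_init):
        """Most probable hidden-state path; log_emission rows are time steps."""
        T, S = log_emission.shape
        delta = log_init + log_emission[0]
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            scores = delta[:, None] + log_trans        # (from_state, to_state)
            back[t] = scores.argmax(axis=0)
            delta = scores.max(axis=0) + log_emission[t]
        path = [int(delta.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # Two behaviors for a particle's intensity trace: 0 = intact, 1 = fused
    # (fusion releases dye into the membrane, dropping the spot intensity)
    obs = np.array([5.1, 5.0, 4.9, 5.2, 2.1, 1.9, 2.0])    # intensities over time
    means, sd = np.array([5.0, 2.0]), 0.3
    log_em = -0.5 * ((obs[:, None] - means) / sd) ** 2     # Gaussian log-likelihoods
    log_tr = np.log(np.array([[0.95, 0.05],                # fusion ~ irreversible
                              [0.01, 0.99]]))
    print(viterbi(log_em, log_tr, np.log([0.99, 0.01])))   # -> [0, 0, 0, 0, 1, 1, 1]
    ```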

  13. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    International Nuclear Information System (INIS)

    Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan

    2017-01-01

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach holds significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional, comparatively static PSA model (the so-called static PSA, or S-PSA). We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective

  14. How do individuals reason in the Wason card selection task [Review of: M. Oaksford, N. Chater, Bayesian rationality: the probabilistic approach to human reasoning]

    NARCIS (Netherlands)

    Wagenmakers, E.-J.

    2009-01-01

    The probabilistic approach to human reasoning is exemplified by the information gain model for the Wason card selection task. Although the model is elegant and original, several key aspects of the model warrant further discussion, particularly those concerning the scope of the task and the choice

  15. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass-stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and semi-probabilistic approaches for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even though the latter is certainly very site-sensitive.

  16. The Poisson aggregation process

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2016-01-01

    In this paper we introduce and analyze the Poisson Aggregation Process (PAP): a stochastic model in which a random collection of random balls is stacked over a general metric space. The scattering of the balls’ centers follows a general Poisson process over the metric space, and the balls’ radii are independent and identically distributed random variables governed by a general distribution. For each point of the metric space, the PAP counts the number of balls that are stacked over it. The PAP model is a highly versatile spatial counterpart of the temporal M/G/∞ model in queueing theory. The surface of the moon, scarred by circular meteor-impact craters, exemplifies the PAP model in two dimensions: the PAP counts the number of meteor-impacts that any given moon-surface point sustained. A comprehensive analysis of the PAP is presented, and the closed-form results established include: general statistics, stationary statistics, short-range and long-range dependencies, a Central Limit Theorem, an Extreme Limit Theorem, and fractality.
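
    The PAP count over a fixed point is itself Poisson: by the marking (Campbell) theorem its mean in the plane is lam * E[pi R^2]. A simulation sketch in which the window size, center density and radius law are arbitrary illustrative choices:

    ```python
    import numpy as np

    rng = np.random.default_rng(17)
    lam, L, n_rep = 0.05, 100.0, 10_000   # center density, window size, replications
    point = np.array([L / 2, L / 2])      # fixed test point at the window center

    counts = np.empty(n_rep, dtype=int)
    for i in range(n_rep):
        n = rng.poisson(lam * L * L)                  # Poisson number of ball centers
        centers = rng.uniform(0.0, L, size=(n, 2))    # scattered uniformly (a HPP)
        radii = rng.exponential(2.0, size=n)          # iid radii with a general law
        dx, dy = (centers - point).T
        counts[i] = int(np.sum(np.hypot(dx, dy) <= radii))

    # Marking theorem: the count over a fixed point is Poisson with mean
    # lam * E[pi R^2]; for Exponential(scale=2) radii, E[R^2] = 2 * 2**2 = 8
    mean_theory = lam * np.pi * 8.0
    print(f"empirical mean {counts.mean():.3f} vs theory {mean_theory:.3f}")
    print(f"empirical var  {counts.var():.3f}  (Poisson: variance = mean)")
    ```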

  17. Selective Contrast Adjustment by Poisson Equation

    Directory of Open Access Journals (Sweden)

    Ana-Belen Petro

    2013-09-01

    Poisson Image Editing is a new technique that permits modifying the gradient vector field of an image and then recovering an image whose gradient approaches this modified gradient field. This amounts to solving a Poisson equation, an operation which can be performed efficiently by the Fast Fourier Transform (FFT). This paper describes an algorithm applying this technique, with two different variants. The first variant enhances the contrast by increasing the gradient in the dark regions of the image. This method is well adapted to images with back light or strong shadows, and reveals details in the shadows. The second variant of the same Poisson technique enhances all small gradients in the image, thus also sometimes revealing details and texture.
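
    A minimal sketch of the FFT route: build the divergence of the (edited) gradient field and invert the discrete Laplacian in Fourier space, assuming periodic boundaries. The gradient-boost law below is a hypothetical stand-in for the paper's enhancement formulas.

    ```python
    import numpy as np

    def solve_poisson_fft(div):
        """Solve the discrete Poisson equation laplacian(u) = div, periodic BCs."""
        H, W = div.shape
        ky = 2 * np.pi * np.fft.fftfreq(H)[:, None]
        kx = 2 * np.pi * np.fft.fftfreq(W)[None, :]
        denom = 2 * np.cos(kx) + 2 * np.cos(ky) - 4   # symbol of the 5-point Laplacian
        denom[0, 0] = 1.0                             # avoid 0/0 at the DC term
        u_hat = np.fft.fft2(div) / denom
        u_hat[0, 0] = 0.0                             # fix the free additive constant
        return np.real(np.fft.ifft2(u_hat))

    rng = np.random.default_rng(23)
    img = rng.uniform(0.0, 1.0, (64, 64))

    # Forward-difference gradient field (periodic wrap via roll)
    gx = np.roll(img, -1, axis=1) - img
    gy = np.roll(img, -1, axis=0) - img

    # Sanity check: the divergence of the untouched gradient recovers the image
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
    rec = solve_poisson_fft(div)
    print("reconstruction error:", float(np.abs(rec - (img - img.mean())).max()))

    # Roughly the paper's second variant: amplify small gradients everywhere
    # (this boost law is a hypothetical choice, not the paper's formula)
    boost = 1.0 + np.exp(-(gx**2 + gy**2) / 0.01)
    ex, ey = boost * gx, boost * gy
    div_e = (ex - np.roll(ex, 1, axis=1)) + (ey - np.roll(ey, 1, axis=0))
    enhanced = solve_poisson_fft(div_e) + img.mean()
    print("enhanced range:", float(enhanced.min()), float(enhanced.max()))
    ```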

  18. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  20. Probabilistic Assessment of Cancer Risk for Astronauts on Lunar Missions

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2009-01-01

    During future lunar missions, exposure to solar particle events (SPEs) is a major safety concern for crew members during extra-vehicular activities (EVAs) on the lunar surface or Earth-to-moon transit. NASA's new lunar program anticipates that up to 15% of crew time may be spent on EVA, with minimal radiation shielding. For the operational challenge of responding to events of unknown size and duration, a probabilistic risk assessment approach is essential for mission planning and design. Using the historical database of proton measurements during the past 5 solar cycles, a typical hazard function for SPE occurrence was defined using a non-homogeneous Poisson model as a function of time within a non-specific future solar cycle of 4000 days duration. Distributions ranging from the 5th to the 95th percentile of particle fluences for a specified mission period were simulated. Organ doses corresponding to particle fluences at the median and at the 95th percentile for a specified mission period were assessed using NASA's baryon transport model, BRYNTRN. The cancer fatality risks for astronauts as functions of age, gender, and solar cycle activity were then analyzed. The probability of exceeding the NASA 30-day limit of blood forming organ (BFO) dose inside a typical spacecraft was calculated. Future work will involve using this probabilistic risk assessment approach to SPE forecasting, combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
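
    A hedged sketch of simulating SPE occurrence times from a non-homogeneous Poisson model by Lewis-Shedler thinning; the hazard function below is a made-up placeholder, whereas the study derives its hazard from five solar cycles of proton measurements.

```python
# Non-homogeneous Poisson process simulation by thinning: draw
# candidates at the bounding rate h_max, accept each with probability
# h(t) / h_max.
import numpy as np

rng = np.random.default_rng(42)
T = 4000.0                                   # days in the future solar cycle

def hazard(t):
    # Hypothetical rate: higher near an assumed solar maximum (t ~ 1500 d).
    return 0.01 + 0.04 * np.exp(-((t - 1500.0) / 600.0) ** 2)

h_max = 0.05                                 # upper bound on the hazard
t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / h_max)        # candidate from rate h_max
    if t > T:
        break
    if rng.uniform() < hazard(t) / h_max:    # accept with prob h(t)/h_max
        events.append(t)
print(len(events), "simulated SPEs in the cycle")
```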

  1. Unified approach for estimating the probabilistic design S-N curves of three commonly used fatigue stress-life models

    International Nuclear Information System (INIS)

    Zhao Yongxiang; Wang Jinnuo; Gao Qing

    2001-01-01

    A unified approach, referred to as the general maximum likelihood method, is presented for estimating the probabilistic design S-N curves, and their confidence bounds, of three commonly used fatigue stress-life models: three-parameter, Langer, and Basquin. The curves are described by a general form of mean and standard-deviation S-N curves of the logarithm of fatigue life. Unlike existing methods, i.e., the conventional method and the classical maximum likelihood method, the present approach considers the statistical characteristics of the whole set of test data. The parameters of the mean curve are first estimated by the least squares method; the parameters of the standard-deviation curve are then evaluated by a mathematical programming method so as to agree with the maximum likelihood principle. The fit of the curves is assessed by the fitted relation coefficient, the total fitted standard error, and the confidence bounds. Application to the virtual stress amplitude-crack initiation life data of a nuclear engineering material, Chinese 1Cr18Ni9Ti stainless steel pipe-weld metal, has indicated the validity of the approach for S-N data in which both S and N behave as random variables. Application to the two sets of S-N data for Chinese 45 carbon steel notched specimens (k_t = 2.0) has indicated the validity of the present approach for test results obtained from group fatigue tests and from maximum likelihood fatigue tests, respectively. In these applications, the fit was in general best for the three-parameter model, slightly inferior for the Langer relation, and poor for the Basquin equation. Relative to the existing methods, the present approach gives a better fit. In addition, it overcomes the possible non-conservative predictions of the existing methods, which result from the influence of local statistical characteristics of the data
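
    The two-step idea (least squares for the mean curve, then maximum likelihood for the standard-deviation curve) can be sketched as follows for a Basquin-type model; the data are synthetic and the linear-in-log-stress standard-deviation curve is an illustrative assumption.

```python
# Step 1: fit the mean of log-life by least squares.
# Step 2: choose the parameters of a standard-deviation curve by
# maximizing the normal log-likelihood of the residuals.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
logS = rng.uniform(2.0, 3.0, size=40)                 # log stress amplitude
logN = 12.0 - 3.0 * logS + rng.normal(0, 0.2, 40)     # synthetic log life

# Step 1: mean curve mu(S) = a + b*logS.
A = np.column_stack([np.ones_like(logS), logS])
a, b = np.linalg.lstsq(A, logN, rcond=None)[0]
resid = logN - (a + b * logS)

# Step 2: std curve sigma(S) = c + d*logS by maximum likelihood.
def neg_loglik(p):
    c, d = p
    sig = c + d * logS
    if np.any(sig <= 0):
        return np.inf
    return np.sum(np.log(sig) + 0.5 * (resid / sig) ** 2)

c, d = minimize(neg_loglik, x0=[0.2, 0.0], method="Nelder-Mead").x
# A probabilistic design curve, e.g. mean minus three standard deviations:
design = lambda ls: (a + b * ls) - 3.0 * (c + d * ls)
print(f"design log-life at logS = 2.5: {design(2.5):.2f}")
```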

  2. A probabilistic approach for estimating the spatial extent of pesticide agricultural use sites and potential co-occurrence with listed species for use in ecological risk assessments.

    Science.gov (United States)

    Budreski, Katherine; Winchell, Michael; Padilla, Lauren; Bang, JiSu; Brain, Richard A

    2016-04-01

    A crop footprint refers to the estimated spatial extent of growing areas for a specific crop, and is commonly used to represent the potential "use site" footprint for a pesticide labeled for use on that crop. A methodology for developing probabilistic crop footprints to estimate the likelihood of pesticide use and the potential co-occurrence of pesticide use and listed species locations was tested at the national scale and compared to alternative methods. The probabilistic aspect of the approach accounts for annual crop rotations and the uncertainty in remotely sensed crop and land cover data sets. The crop footprints used historically are derived exclusively from the National Land Cover Database (NLCD) Cultivated Crops and/or Pasture/Hay classes. This approach broadly aggregates agriculture into 2 classes, which grossly overestimates the spatial extent of individual crops that are labeled for pesticide use. The approach also does not use all the available crop data, represents a single point in time, and does not account for the uncertainty in land cover data set classifications. The probabilistic crop footprint approach described herein incorporates the best information available at the time of analysis, from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) for 5 y (2008-2012), the 2006 NLCD, the 2007 NASS Census of Agriculture, and 5 y of NASS Quick Stats (2008-2012). The approach accounts for misclassification of crop classes in the CDL by incorporating accuracy assessment information by state, year, and crop. The NLCD provides additional information to improve the CDL crop probability through an adjustment based on the NLCD accuracy assessment data, using the principles of Bayes' Theorem. Finally, crop probabilities are scaled at the state level by comparing against NASS surveys (Census of Agriculture and Quick Stats) of reported planted acres by crop. In an example application of the new method, the probabilistic
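
    The Bayes' Theorem adjustment can be illustrated with a minimal sketch; the function below is hypothetical, and the accuracy rates are invented rather than taken from the CDL/NLCD assessments.

```python
# Bayes' theorem update: probability that a pixel truly grows a crop,
# given that the classifier labeled it as that crop.
def crop_posterior(prior, p_class_given_crop, p_class_given_other):
    """P(crop | classified as crop).

    prior               ~ P(crop) before seeing the classification
    p_class_given_crop  ~ P(classified as crop | truly crop)
    p_class_given_other ~ P(classified as crop | not that crop)
    """
    evidence = (p_class_given_crop * prior
                + p_class_given_other * (1.0 - prior))
    return p_class_given_crop * prior / evidence

# e.g. a prior of 0.30 with an 85%-sensitive, 10%-false-positive classifier
print(round(crop_posterior(0.30, 0.85, 0.10), 3))
```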

  3. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India); Gupta, Shikha; Rai, Premanjali [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India)

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens, and of predicting the carcinogenic potency of chemicals in rodents, were developed. The dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB), containing 466 positive and 368 non-positive carcinogens, was used. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for the classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data, and classification accuracies of 91.77%, 80.70%, and 92.08% in the mouse, hamster, and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency, with a mean squared error (MSE) of 0.44, in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSEs of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model. Figure (b) shows generalization and predictive

  4. Probabilistic Route Selection Algorithm for IP Traceback

    Science.gov (United States)

    Yim, Hong-Bin; Jung, Jae-Il

    DoS (Denial of Service) and DDoS (Distributed DoS) attacks are a major threat and among the most difficult problems to solve. Moreover, it is very difficult to find the real origin of attackers, because DoS/DDoS attackers use spoofed IP addresses. To solve this problem, we propose a probabilistic route selection traceback algorithm, namely PRST, to trace the attacker's real origin. The algorithm uses two types of packets: an agent packet and a reply agent packet. The agent packet is used to find the attacker's real origin, and the reply agent packet is used to notify the victim that the agent packet has reached the edge router of the attacker. After an attack occurs, the victim generates the agent packet and sends it to the victim's edge router. The attacker's edge router, on receiving the agent packet, generates the reply agent packet and sends it to the victim. The agent packet and the reply agent packet are forwarded by routers according to a probabilistic packet forwarding table (PPFT). The PRST algorithm runs on the distributed routers, and the PPFT is stored and managed by the routers. We validate the PRST algorithm using a mathematical approach based on the Poisson distribution.

  5. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J. [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories, according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for the optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  6. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    International Nuclear Information System (INIS)

    Holmberg, J.

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories, according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for the optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant

  7. Poisson hierarchy of discrete strings

    Energy Technology Data Exchange (ETDEWEB)

    Ioannidou, Theodora, E-mail: ti3@auth.gr [Faculty of Civil Engineering, School of Engineering, Aristotle University of Thessaloniki, 54249, Thessaloniki (Greece); Niemi, Antti J., E-mail: Antti.Niemi@physics.uu.se [Department of Physics and Astronomy, Uppsala University, P.O. Box 803, S-75108, Uppsala (Sweden); Laboratoire de Mathematiques et Physique Theorique CNRS UMR 6083, Fédération Denis Poisson, Université de Tours, Parc de Grandmont, F37200, Tours (France); Department of Physics, Beijing Institute of Technology, Haidian District, Beijing 100081 (China)

    2016-01-28

    The Poisson geometry of a discrete string in three-dimensional Euclidean space is investigated. For this, the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra, which determines reparametrisation diffeomorphisms along a continuous string, is identified as a particular sub-algebra in the hierarchy of the discrete string Poisson algebra. - Highlights: • The Witt (classical Virasoro) algebra is derived in the case of a discrete string. • An infinite-dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • A spinor representation of the discrete Frenet equations is developed.

  8. Poisson hierarchy of discrete strings

    International Nuclear Information System (INIS)

    Ioannidou, Theodora; Niemi, Antti J.

    2016-01-01

    The Poisson geometry of a discrete string in three-dimensional Euclidean space is investigated. For this, the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra, which determines reparametrisation diffeomorphisms along a continuous string, is identified as a particular sub-algebra in the hierarchy of the discrete string Poisson algebra. - Highlights: • The Witt (classical Virasoro) algebra is derived in the case of a discrete string. • An infinite-dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • A spinor representation of the discrete Frenet equations is developed.

  9. Different approaches for identifying important concepts in probabilistic biomedical text summarization.

    Science.gov (United States)

    Moradi, Milad; Ghadiri, Nasser

    2018-01-01

    Automatic text summarization tools help users in the biomedical domain to acquire their intended information from various textual resources more efficiently. Some biomedical text summarization systems base their sentence selection on the frequency of concepts extracted from the input text. However, exploring measures other than the raw frequency for identifying valuable content within an input document, or considering the correlations existing between concepts, may be more useful for this type of summarization. In this paper, we describe a Bayesian summarization method for biomedical text documents. The Bayesian summarizer initially maps the input text to Unified Medical Language System (UMLS) concepts; it then selects the important ones to be used as classification features. We introduce six different feature selection approaches to identify the most important concepts of the text and to select the most informative content according to the distribution of these concepts. We show that, with the use of an appropriate feature selection approach, the Bayesian summarizer can improve the performance of biomedical summarization. Using the Recall-Oriented Understudy for Gisting Evaluation (ROUGE) toolkit, we perform extensive evaluations on a corpus of scientific papers in the biomedical domain. The results show that when the Bayesian summarizer utilizes feature selection methods that do not use the raw frequency, it can outperform biomedical summarizers that rely on the frequency of concepts, as well as domain-independent and baseline methods. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. A probabilistic approach for automated discovery of perturbed genes using expression data from microarray or RNA-Seq.

    Science.gov (United States)

    Sundaramurthy, Gopinath; Eghbalnia, Hamid R

    2015-12-01

    In complex diseases, alterations of multiple molecular and cellular components in response to perturbations are indicative of disease physiology. While the expression levels of genes from high-throughput analysis can vary among patients, the common path of disease progression suggests that the underlying cellular sub-processes involving the associated genes follow similar fates. Motivated by the interconnected nature of sub-processes, we have developed an automated methodology that combines ideas from biological networks, statistical models, and game theory to probe connected cellular processes. The core concept in our approach uses the probability of change (POC) to indicate the probability that a gene's expression level has changed between two conditions. POC facilitates the definition of change at the neighborhood, pathway, and network levels and enables evaluation of the influence of diseases on expression. The 'connected' disease-related genes (DRG) identified display coherent and concomitant differential expression levels along paths. RNA-Seq and microarray breast cancer subtyping expression data sets were used to identify DRG between subtypes. A machine-learning algorithm was trained for subtype discrimination using the DRG, and the training yielded a set of biomarkers. The discriminative power of the biomarkers was tested using an unseen data set. The identified biomarkers overlap with disease-specific genes, and we were able to classify disease subtypes with 100% and 80% agreement with PAM50 for the microarray and RNA-Seq data sets, respectively. We present an automated probabilistic approach that offers unbiased and reproducible results, thus complementing existing methods in DRG and biomarker discovery for complex diseases. Copyright © 2015. Published by Elsevier Ltd.

  11. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations, and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  12. Experiments with ROPAR, an approach for probabilistic analysis of the optimal solutions' robustness

    Science.gov (United States)

    Marquez, Oscar; Solomatine, Dimitri

    2016-04-01

    Robust optimization is defined as the search for solutions and performance results which remain reasonably unchanged when exposed to uncertain conditions such as natural variability in input variables, parameter drifts during operation time, model sensitivities and others [1]. In the present study we follow the approach named ROPAR (multi-objective robust optimization allowing for explicit analysis of robustness; see the online publication [2]). Its main idea is: (a) sampling the vectors of uncertain factors; (b) solving the MOO problem for each of them, obtaining multiple Pareto sets; (c) analysing the statistical properties (distributions) of the subsets of these Pareto sets corresponding to different conditions (e.g. based on constraints formulated for the objective function values or other system variables); (d) selecting the robust solutions. The paper presents the results of experiments with two case studies: (1) the benchmark function ZDT1 (with an uncertain factor), often used in algorithm comparisons, and (2) a problem of drainage network rehabilitation that uses the SWMM hydrodynamic model (the rainfall is assumed to be an uncertain factor). This study is partly supported by the FP7 European Project WeSenseIt Citizen Water Observatory (http://wesenseit.eu/) and by CONACYT (Mexico's National Council of Science and Technology), which supports the PhD study of the first author. References [1] H.G. Beyer and B. Sendhoff. "Robust optimization - A comprehensive survey." Comput. Methods Appl. Mech. Engrg., 2007: 3190-3218. [2] D.P. Solomatine (2012). An approach to multi-objective robust optimization allowing for explicit analysis of robustness (ROPAR). UNESCO-IHE. Online publication. Web: https://www.unesco-ihe.org/sites/default/files/solomatine-ropar.pdf

  13. A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2010-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked during the Apollo program in the sixties. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and are rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to interpolate solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
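
    The surrogate idea can be sketched with a quadratic response surface fitted by least squares to a handful of synthetic stand-in results (the paper's actual inputs, outputs, and surface forms are not reproduced here):

```python
# Fit a quadratic response surface to a small set of expensive runs,
# then interpolate the response at new conditions cheaply.
import numpy as np

rng = np.random.default_rng(12)
n = 60
X = rng.uniform([-5.0, 7.0], [5.0, 12.0], size=(n, 2))  # attitude [deg], speed [m/s]
y = 3.0 + 0.5 * X[:, 0]**2 + 1.2 * X[:, 1] + rng.normal(0, 0.2, n)

def basis(X):
    """Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)
x_new = np.array([[2.0, 10.0]])
print("predicted response:", basis(x_new) @ coef)    # no new expensive run needed
```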

  14. Mixing Carrots and Sticks to Conserve Forests in the Brazilian Amazon: A Spatial Probabilistic Modeling Approach

    Science.gov (United States)

    Börner, Jan; Marinho, Eduardo; Wunder, Sven

    2015-01-01

    Annual forest loss in the Brazilian Amazon had in 2012 declined to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix. PMID:25650966

  15. Mixing carrots and sticks to conserve forests in the Brazilian Amazon: a spatial probabilistic modeling approach.

    Science.gov (United States)

    Börner, Jan; Marinho, Eduardo; Wunder, Sven

    2015-01-01

    Annual forest loss in the Brazilian Amazon had in 2012 declined to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix.

  16. Mixing carrots and sticks to conserve forests in the Brazilian Amazon: a spatial probabilistic modeling approach.

    Directory of Open Access Journals (Sweden)

    Jan Börner

    Full Text Available Annual forest loss in the Brazilian Amazon had in 2012 declined to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix.

  17. Probabilistic modeling and global sensitivity analysis for CO 2 storage in geological formations: a spectral approach

    KAUST Repository

    Saad, Bilal Mohammed

    2017-09-18

    This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.
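
    A hedged sketch of estimating first-order Sobol' indices by the pick-freeze Monte Carlo estimator, applied to a toy three-input stand-in model (the study instead propagates uncertainty through a polynomial chaos surrogate of the full flow simulator):

```python
# First-order Sobol' indices via the Saltelli pick-freeze estimator:
# S_i = E[f(B) * (f(A with column i from B) - f(A))] / Var(f).
import numpy as np

rng = np.random.default_rng(3)

def model(x):                       # toy "leakage" response, 3 inputs
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = yA.var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]             # freeze all coordinates except i
    Si = np.mean(yB * (model(ABi) - yA)) / var_y
    print(f"S_{i} ~ {Si:.3f}")
```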

  18. Probabilistic approach: back pressure turbine for geothermal vapor-dominated system

    Science.gov (United States)

    Alfandi Ahmad, Angga; Xaverius Guwowijoyo, Fransiscus; Pratama, Heru Berian

    2017-12-01

    The geothermal business nowadays needs to be accelerated so that profit can be obtained as soon as reasonably possible. One of the many ways to do this is by using a geothermal wellhead generating unit (GWGU) known as a backpressure turbine. A backpressure turbine can be used to produce electricity as soon as a productive, or even a rather small-scale productive, well exists after drilling is finished. In a vapor-dominated system, the steam fraction at the wellhead can produce electricity immediately, based on each well's productivity. A further advantage of a vapor-dominated system is reduced brine disposal at the wellhead, which is a cost benefit in operation. The design and calculation for the backpressure turbine use a probabilistic approach with Monte Carlo simulation. The parameters evaluated in the sensitivity analysis are the steam flow rate, the turbine inlet pressure, and the turbine exhaust (atmospheric) pressure. The results are the P10, P50, and P90 values of gross power output, which are 1.78 MWe, 2.22 MWe, and 2.66 MWe, respectively, while the P10, P50, and P90 values of SSC are 4.67 kg/s/MWe, 5.19 kg/s/MWe, and 5.78 kg/s/MWe, respectively.
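
    A minimal sketch of the Monte Carlo sizing step; the input distributions and the specific-work correlation below are illustrative assumptions, not the paper's model, and only the P10/P50/P90 reporting mirrors the abstract.

```python
# Monte Carlo propagation of uncertain steam flow and pressures to
# gross power percentiles for a backpressure turbine.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
m_dot = rng.triangular(8.0, 11.0, 14.0, n)    # steam flow [kg/s] (assumed)
p_in = rng.triangular(6.0, 8.0, 10.0, n)      # inlet pressure [bar a] (assumed)
p_out = rng.normal(1.013, 0.01, n)            # exhaust ~ atmospheric [bar a]

# Crude placeholder for specific work: w ~ k * ln(p_in / p_out) [kJ/kg].
w = 100.0 * np.log(p_in / p_out)
power_mwe = m_dot * w / 1000.0                # gross power [MWe]
ssc = m_dot / power_mwe                       # specific steam consumption

p10, p50, p90 = np.percentile(power_mwe, [10, 50, 90])
print(f"P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f} MWe")
print(f"median SSC = {np.median(ssc):.2f} kg/s/MWe")
```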

  19. A long journey from minimum inhibitory concentration testing to clinically predictive breakpoints: deterministic and probabilistic approaches in deriving breakpoints.

    Science.gov (United States)

    Dalhoff, A; Ambrose, P G; Mouton, J W

    2009-08-01

    Since the origin of the "International Collaborative Study on Antibiotic Sensitivity Testing" in 1971, considerable advances have been made in standardizing clinical susceptibility testing procedures for antimicrobial agents. However, a consensus on the methods to be used and on interpretive criteria was not reached, so the results of susceptibility testing were discrepant. Recently, the European Committee on Antimicrobial Susceptibility Testing achieved a harmonization of existing methods for susceptibility testing and now co-ordinates the process for setting breakpoints. Previously, breakpoints were set by adjusting the mean pharmacokinetic parameters derived from healthy volunteers to the susceptibilities of a population of potential pathogens, expressed as the mean minimum inhibitory concentration (MIC) or the MIC90. Breakpoints derived by the deterministic approach tend to be too high, since this procedure does not take the variabilities of drug exposure and of susceptibility patterns into account. Therefore, first-step mutants or borderline-susceptible bacteria may be considered fully susceptible. As the drug exposure of such sub-populations is inadequate, resistance development will increase and eradication rates will decrease, resulting in clinical failure. The science of pharmacokinetics/pharmacodynamics integrates all possible drug exposures for standard dosage regimens, and all MIC values likely to be found for the clinical isolates, into the breakpoint definitions. Ideally, the data sets used originate from patients suffering from the disease to be treated. Probability density functions for both the pharmacokinetic and microbiological variables are determined, and a large number of MIC/drug exposure scenarios are calculated. Therefore, this method is defined as the probabilistic approach. The breakpoints thus derived are lower than the ones defined deterministically, as the entire range of probable drug exposures from low to high is modeled. Therefore, the
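
    The probabilistic approach can be sketched as a Monte Carlo probability-of-target-attainment (PTA) calculation; the exposure distribution, pharmacodynamic target, and MIC grid below are illustrative assumptions.

```python
# Monte Carlo PTA: sample drug exposure, then compute the fraction of
# simulated patients attaining an fAUC/MIC target at each MIC.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
auc = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=n)   # fAUC24 [mg*h/L]
target = 30.0                                               # required fAUC/MIC

for mic in [0.25, 0.5, 1.0, 2.0, 4.0]:
    pta = np.mean(auc / mic >= target)
    print(f"MIC {mic:>4}: PTA = {pta:.2%}")
# A probabilistic breakpoint is then the highest MIC whose PTA stays
# above a chosen threshold (e.g. 90%).
```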

  20. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    Science.gov (United States)

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

    Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
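
    A small sketch of the gamma-mixed case: mixing Poisson counts over a gamma-distributed rate yields negative binomial counts, whose overdispersion can be checked by the method of moments (synthetic data; the paper's point is that other mixing laws, such as the inverse Gaussian, can fit some data sets better).

```python
# Gamma-mixed Poisson = negative binomial: simulate, then recover the
# NB size parameter from the mean-variance relation v = m + m^2 / r.
import numpy as np

rng = np.random.default_rng(9)
shape, scale = 2.0, 1.5
rates = rng.gamma(shape, scale, size=50_000)   # per-subject random mean
counts = rng.poisson(rates)                    # mixed Poisson counts

m, v = counts.mean(), counts.var()
print(f"mean={m:.2f} variance={v:.2f}")        # variance exceeds the mean
r_hat = m**2 / (v - m)                         # method-of-moments NB size
print(f"estimated NB size r = {r_hat:.2f} (true {shape})")
```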

  1. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  2. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in the design of vertical wall breakwaters, and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors

  3. PROBABILIST ANTIREALISM

    NARCIS (Netherlands)

    Douven, Igor; Horsten, Leon; Romeijn, Jan-Willem

    Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.

  4. Probabilistic approach to rock fall hazard assessment: potential of historical data analysis

    Directory of Open Access Journals (Sweden)

    C. Dussauge-Peisser

    2002-01-01

    Full Text Available We study the rock fall volume distribution for three rock fall inventories, and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates in the studied areas. This is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first inventory concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set was gathered during the 1912–1992 time window in Yosemite Valley, USA, in granite cliffs. The third covers the 1954–1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3, or 20 m3 for the Arly gorges. We obtain similar values of the b exponent, close to 0.45, for the 3 data sets. In agreement with previous studies, this suggests that the b value does not depend on the geological setting. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can
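
    A short sketch of estimating the power-law exponent b above a completeness cutoff by maximum likelihood (a Pareto-tail estimator applied to synthetic volumes; the inventories' actual data are not reproduced):

```python
# MLE for a Pareto tail P(V > v) = (v / v_min)^(-b) above the cutoff:
# b_hat = 1 / mean(log(V / v_min)).
import numpy as np

rng = np.random.default_rng(2)
b_true, v_min = 0.45, 50.0                      # exponent; cutoff [m3]
u = rng.uniform(size=2_000)
volumes = v_min * u ** (-1.0 / b_true)          # Pareto samples above v_min

b_hat = 1.0 / np.mean(np.log(volumes / v_min))  # maximum likelihood estimate
print(f"b estimate: {b_hat:.3f} (true {b_true})")
```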

  5. Probabilistic approach to rock fall hazard assessment: potential of historical data analysis

    Science.gov (United States)

    Dussauge-Peisser, C.; Helmstetter, A.; Grasso, J.-R.; Hantz, D.; Desvarreux, P.; Jeannin, M.; Giraud, A.

    We study the rock fall volume distribution for three rock fall inventories, and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates in the studied areas. This is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first inventory concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set was gathered during the 1912-1992 time window in Yosemite Valley, USA, in granite cliffs. The third covers the 1954-1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3, or 20 m3 for the Arly gorges. We obtain similar values of the b exponent, close to 0.45, for the 3 data sets. In agreement with previous studies, this suggests that the b value does not depend on the geological setting. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can extrapolate these

  6. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, such failure data are difficult to obtain, due to insufficient data, changing environments, or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees for which precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events, and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment
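
    A hedged sketch of the fuzzy step: a linguistic failure possibility represented by a triangular fuzzy number is defuzzified by its centroid and mapped to a failure probability with the Onisawa-style transformation commonly used in fuzzy fault tree work; the linguistic scale below is an illustrative assumption, not the paper's.

```python
# Triangular fuzzy numbers for qualitative failure possibilities,
# centroid defuzzification, and an Onisawa-type possibility-to-
# probability transformation: P = 10^(-K), K = ((1-FPs)/FPs)^(1/3) * 2.301.
import numpy as np

scale = {                        # (left, mode, right) membership supports
    "very low":  (0.0, 0.1, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "moderate":  (0.3, 0.5, 0.7),
    "high":      (0.6, 0.75, 0.9),
}

def failure_probability(word):
    l, m, r = scale[word]
    fps = (l + m + r) / 3.0                  # centroid of the triangle
    if fps <= 0.0:
        return 0.0
    k = ((1.0 - fps) / fps) ** (1.0 / 3.0) * 2.301
    return 10.0 ** (-k)

for w in scale:
    print(f"{w:>9}: P_f ~ {failure_probability(w):.2e}")
```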

  7. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
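
    The permanent-random-number mechanism behind the coordination can be sketched for (unconditional) Poisson sampling; the population size and inclusion probabilities are synthetic, and the fixed-size conditioning of CP sampling is not implemented here.

```python
# Poisson sampling with permanent random numbers (PRNs): unit k is
# selected when its PRN falls below its inclusion probability, so two
# occasions reusing the same PRNs maximize the sample overlap.
import numpy as np

rng = np.random.default_rng(4)
N = 1000
u = rng.uniform(size=N)                    # permanent random numbers
pi1 = rng.uniform(0.05, 0.30, size=N)      # inclusion probs, occasion 1
pi2 = np.clip(pi1 + rng.normal(0, 0.02, N), 0.01, 0.5)   # occasion 2

s1 = u < pi1                               # Poisson sample 1 (random size)
s2 = u < pi2                               # Poisson sample 2, coordinated
print("sizes:", s1.sum(), s2.sum(), "overlap:", (s1 & s2).sum())
```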

  8. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
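
    For contrast with the conditional trick described above, the unconditional version (one indicator per background stratum plus the dose term) can be sketched on synthetic person-year data; all parameter values are invented.

```python
# Unconditional background stratified Poisson regression: stratum
# dummies plus a log-linear dose term, with log person-years as offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_strata, per = 50, 20
stratum = np.repeat(np.arange(n_strata), per)
dose = rng.gamma(2.0, 0.5, size=n_strata * per)          # e.g. dose in Sv
pyr = rng.uniform(100, 1000, size=n_strata * per)        # person-years
base = rng.normal(-6.0, 0.3, size=n_strata)[stratum]     # stratum log-rates
beta = 0.4                                               # dose effect
cases = rng.poisson(pyr * np.exp(base + beta * dose))

X = np.column_stack([np.eye(n_strata)[stratum], dose])   # dummies + dose
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(pyr)).fit()
print(f"beta-hat = {fit.params[-1]:.3f} (true {beta})")
```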

  9. Background stratified Poisson regression analysis of cohort data

    International Nuclear Information System (INIS)

    Richardson, David B.; Langholz, Bryan

    2012-01-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)

  10. Background stratified Poisson regression analysis of cohort data

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, David B. [University of North Carolina at Chapel Hill, Department of Epidemiology, School of Public Health, Chapel Hill, NC (United States); Langholz, Bryan [Keck School of Medicine, University of Southern California, Division of Biostatistics, Department of Preventive Medicine, Los Angeles, CA (United States)

    2012-03-15

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)

  11. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    Science.gov (United States)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  12. Hazardous waste transportation risk assessment: Benefits of a combined deterministic and probabilistic Monte Carlo approach in expressing risk uncertainty

    International Nuclear Information System (INIS)

    Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.

    1995-01-01

    This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on, and to quantify uncertainties in, deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992 that contained poison-by-inhalation chemicals, which represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probability distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions

  13. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for a high area saving and design flexibility in trading performance against power saving. To reach a performance level similar to approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders, with a trade-off between performance and energy.

  14. Poisson Plus Quantification for Digital PCR Systems.

    Science.gov (United States)

    Majumdar, Nivedita; Banerjee, Swapnonil; Pallas, Michael; Wessel, Thomas; Hegerich, Patricia

    2017-08-29

    Digital PCR, a state-of-the-art nucleic acid quantification technique, works by spreading the target material across a large number of partitions. The average number of molecules per partition is estimated using Poisson statistics and then converted into a concentration by dividing by the partition volume. In this standard approach, identical partition sizing is assumed. Violations of this assumption result in underestimation of the target quantity when using Poisson modeling, especially at higher concentrations. The Poisson-Plus Model compensates for this underestimation if the statistics of the volume variation are well characterized. The volume variation was measured on the chip-array-based QuantStudio 3D Digital PCR System using the ROX fluorescence level as a proxy for the effective load volume per through-hole. Monte Carlo simulations demonstrate the efficacy of the proposed correction. Empirical measurement of the model parameters characterizing the effective load volume on QuantStudio 3D Digital PCR chips is presented. The model was used to analyze digital PCR experiments and showed improved accuracy in quantification. At higher concentrations, the modeling must take effective fill volume variation into account to produce accurate estimates. The extent of the difference between the standard and the new modeling is positively correlated with the extent of fill volume variation in the effective load of the reactions.
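
    The standard Poisson step can be sketched in a few lines; the partition counts and nominal volume below are illustrative, and the Poisson-Plus volume-variation correction itself is not modeled.

```python
# Standard dPCR Poisson quantification: from the fraction of negative
# partitions, P(0 copies) = exp(-lambda), so lambda = -ln(neg / total);
# concentration follows by dividing by the partition volume.
import numpy as np

total = 20_000                       # partitions on the chip (illustrative)
negatives = 7_400                    # partitions with no amplification
v_partition_nl = 0.8                 # assumed nominal partition volume [nL]

lam = -np.log(negatives / total)     # mean copies per partition
conc = lam / v_partition_nl          # copies per nL
print(f"lambda = {lam:.3f} copies/partition, {conc:.2f} copies/nL")
```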

  15. Nuclear safety and probabilistic methods

    International Nuclear Information System (INIS)

    Tanguy, Pierre

    1976-01-01

    Having first recalled the principles of the conventional methodology concerning nuclear safety, the paper defines the probabilistic approach as elaborated by Dr Farmer. The basic rules that governed the elaboration of the Rasmussen report, as well as the main conclusions of that report, are commented upon. Finally, the possible and advisable prospects for the evolution of the probabilistic method in nuclear safety are outlined [fr]

  16. Graded geometry and Poisson reduction

    OpenAIRE

    Cattaneo, A S; Zambon, M

    2009-01-01

    The main result of [2] extends the Marsden-Ratiu reduction theorem [4] in Poisson geometry, and is proven by means of graded geometry. In this note we provide the background material about graded geometry necessary for the proof in [2]. Further, we provide an alternative algebraic proof for the main result. ©2009 American Institute of Physics

  17. Applying probabilistic temporal and multisite data quality control methods to a public health mortality registry in Spain: a systematic approach to quality control of repositories.

    Science.gov (United States)

    Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M

    2016-11-01

    To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository, as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes over time. The methods are suited to big data and to multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Isolated temporal anomalies were noticed, due to one-off increases in missing data, along with outlying and clustered health departments, due to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, and other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data-sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be part of systematic DQ procedures. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
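
    One information-theoretic multisite metric of this kind can be sketched with pairwise Jensen-Shannon distances between sites' categorical distributions (synthetic shares; the registry's actual methods and data are not reproduced):

```python
# Jensen-Shannon distances between each site's categorical distribution
# and the pooled distribution flag outlying sites.
import numpy as np

def js_distance(p, q):
    """Jensen-Shannon distance (square root of the JS divergence, base 2)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

sites = {
    "dept_A": [0.50, 0.30, 0.20],    # e.g. cause-of-death category shares
    "dept_B": [0.48, 0.32, 0.20],
    "dept_C": [0.20, 0.30, 0.50],    # a divergent site
}
ref = np.mean(list(sites.values()), axis=0)
for name, dist in sites.items():
    print(f"{name}: JS distance to pooled = {js_distance(dist, ref):.3f}")
```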

  18. A tiered approach for probabilistic ecological risk assessment of contaminated sites; Un approccio multilivello per l'analisi probabilistica di rischio ecologico di siti contaminati

    Energy Technology Data Exchange (ETDEWEB)

    Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia

    2005-09-15

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through a comparison of the probabilistic distributions that describe exposure values and the toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). To illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for a site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic. [Italian] The aim of this study is to provide a procedure for the ecological risk analysis of contaminated sites based on successive levels of refinement. Starting from the simple deterministic ratio between an exposure level and an effect value that safeguards the largest possible number of species of the ecosystem considered, the proposed approach proceeds through the comparison of the statistical distributions of exposure values and species sensitivities, to finally determine the probabilistic distribution of the risk quotient. To illustrate the proposed methodology, the 1,2,4-trichlorobenzene concentrations measured in the soil of a contaminated industrial site were used to carry out the risk analysis for terrestrial species. The use of probabilistic distributions made it possible to associate the risk, initially
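
    A minimal sketch of a distribution-based quotient computed by Monte Carlo; the lognormal parameters below are invented for illustration and are not the site-specific 1,2,4-TCB values:

```python
import math
import random

random.seed(0)

# Illustrative lognormal parameters (log-space mean and sd), not the
# site-specific 1,2,4-TCB exposure or species-sensitivity values.
exposure_mu, exposure_sigma = math.log(5.0), 1.2    # soil concentration, mg/kg
effect_mu, effect_sigma = math.log(50.0), 0.8       # species sensitivity, mg/kg

n = 100_000
quotients = [
    random.lognormvariate(exposure_mu, exposure_sigma)
    / random.lognormvariate(effect_mu, effect_sigma)
    for _ in range(n)
]
prob_exceed = sum(q > 1.0 for q in quotients) / n
print(f"P(distribution-based quotient > 1) = {prob_exceed:.3f}")
```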

  19. Metabolic network segmentation: A probabilistic graphical modeling approach to identify the sites and sequential order of metabolic regulation from non-targeted metabolomics data.

    Directory of Open Access Journals (Sweden)

    Andreas Kuehne

    2017-06-01

    Full Text Available In recent years, the number of large-scale metabolomics studies on various cellular processes in different organisms has increased drastically. However, it remains a major challenge to perform a systematic identification of mechanistic regulatory events that mediate the observed changes in metabolite levels, due to complex interdependencies within metabolic networks. We present the metabolic network segmentation (MNS) algorithm, a probabilistic graphical modeling approach that enables genome-scale, automated prediction of regulated metabolic reactions from differential or serial metabolomics data. The algorithm sections the metabolic network into modules of metabolites with consistent changes. Metabolic reactions that connect different modules are the most likely sites of metabolic regulation. In contrast to most state-of-the-art methods, the MNS algorithm is independent of arbitrary pathway definitions, and its probabilistic nature facilitates assessments of noisy and incomplete measurements. With serial (i.e., time-resolved) data, the MNS algorithm also indicates the sequential order of metabolic regulation. We demonstrated the power and flexibility of the MNS algorithm with three realistic case studies with bacterial and human cells. Thus, this approach enables the identification of mechanistic regulatory events from large-scale metabolomics data, and contributes to the understanding of metabolic processes and their interplay with cellular signaling and regulation processes.

  20. Periodic Poisson Solver for Particle Tracking

    International Nuclear Information System (INIS)

    Dohlus, M.; Henning, C.

    2015-05-01

    A method is described to solve the Poisson problem for a three-dimensional source distribution that is periodic in one direction. Perpendicular to the direction of periodicity, a free-space (or open) boundary is realized. In beam physics, this approach allows the space charge field of a continualized charged particle distribution with a periodic pattern to be calculated. The method is based on a particle-mesh approach with an equidistant grid and fast convolution with a Green's function. The periodic approach uses only one period of the source distribution, but a periodic extension of the Green's function. The approach is numerically efficient and allows the investigation of periodic and pseudo-periodic structures with period lengths that are small compared to the source dimensions, for instance laser-modulated beams or the evolution of micro-bunch structures. Applications to laser-modulated beams are given.
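
    A much simpler relative of this solver, sketched below under the assumption of full periodicity in one dimension (the paper's mixed periodic/open boundary treatment is more involved): solve phi'' = -rho spectrally by dividing Fourier coefficients by k^2.

```python
import numpy as np

def poisson_periodic_1d(rho, L):
    """Solve phi'' = -rho on a periodic domain of length L by FFT.
    Requires a zero-mean source (periodic solvability condition)."""
    n = rho.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wave numbers
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:]**2               # phi_k = rho_k / k^2 for k != 0
    return np.fft.ifft(phi_k).real

# Check against an analytic case: rho = cos(2*pi*x/L) -> phi = (L/2pi)^2 * cos(2*pi*x/L)
L, n = 2.0, 256
x = np.linspace(0.0, L, n, endpoint=False)
rho = np.cos(2 * np.pi * x / L)
phi = poisson_periodic_1d(rho, L)
exact = (L / (2 * np.pi))**2 * np.cos(2 * np.pi * x / L)
print(np.max(np.abs(phi - exact)))  # ~1e-15
```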

  1. A Robust Optimisation Approach using CVaR for Unit Commitment in a Market with Probabilistic Offers

    DEFF Research Database (Denmark)

    Bukhsh, W. A.; Papakonstantinou, Athanasios; Pinson, Pierre

    2016-01-01

    The large scale integration of renewable energy sources (RES) challenges power system planners and operators alike as it can potentially introduce the need for costly investments in infrastructure. Furthermore, traditional market clearing mechanisms are no longer optimal due to the stochastic nature of RES. This paper presents a risk-aware market clearing strategy for a network with significant shares of RES. We propose an electricity market that embeds the uncertainty brought by wind power and other stochastic renewable sources by accepting probabilistic offers and use a risk measure defined...
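
    The risk measure named in the title is CVaR; a minimal empirical CVaR estimator over hypothetical re-dispatch cost scenarios (illustrative numbers only, not the paper's market model) looks like this:

```python
import random

def cvar(samples, alpha=0.95):
    """Empirical conditional value-at-risk: the mean of the worst
    (1 - alpha) fraction of the cost samples."""
    s = sorted(samples)
    tail = s[int(alpha * len(s)):]
    return sum(tail) / len(tail)

random.seed(42)
# Hypothetical re-dispatch costs over wind-production scenarios (arbitrary units)
costs = [random.lognormvariate(3.0, 0.6) for _ in range(10_000)]
print(f"mean cost = {sum(costs) / len(costs):.1f}")
print(f"CVaR(95%) = {cvar(costs, 0.95):.1f}")
```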

  2. A comparative study of the probabilistic fracture mechanics and the stochastic Markovian process approaches for structural reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1990-01-01

    The two computer codes COVASTOL and RELIEF, developed for modeling cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based, respectively, on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied to fatigue crack growth predictions using two sets of crack propagation curves measured on specimens. The results are critically analyzed, followed by an extensive discussion of the merits and limitations of each code. Their transferability to the reliability assessment of real structures is investigated. (author).

  3. Efficient maximal Poisson-disk sampling and remeshing on surfaces

    KAUST Repository

    Guo, Jianwei

    2015-02-01

    Poisson-disk sampling is one of the fundamental research problems in computer graphics and has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves on the state-of-the-art approach in efficiency, quality, and memory consumption.
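
    For intuition, here is a hedged, minimal 2D dart-throwing sketch of Poisson-disk sampling with a background grid for conflict checks; it is neither maximal nor the authors' surface algorithm:

```python
import math
import random

def poisson_disk_2d(width, height, r, n_trials=30_000, seed=7):
    """Naive dart throwing: accept a candidate only if it lies at least r
    from every accepted sample. A background grid with cell size r/sqrt(2)
    (at most one sample per cell) limits conflict checks to nearby cells."""
    cell = r / math.sqrt(2.0)
    cols, rows = int(width / cell) + 1, int(height / cell) + 1
    grid = [[None] * cols for _ in range(rows)]
    samples = []
    rng = random.Random(seed)
    for _ in range(n_trials):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        gx, gy = int(x / cell), int(y / cell)
        ok = True
        for j in range(max(gy - 2, 0), min(gy + 3, rows)):
            for i in range(max(gx - 2, 0), min(gx + 3, cols)):
                s = grid[j][i]
                if s and (s[0] - x)**2 + (s[1] - y)**2 < r * r:
                    ok = False
        if ok:
            grid[gy][gx] = (x, y)
            samples.append((x, y))
    return samples

print(len(poisson_disk_2d(1.0, 1.0, 0.05)))
```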

  4. A probabilistic modeling approach to assess human inhalation exposure risks to airborne aflatoxin B1 (AFB1)

    Science.gov (United States)

    Liao, Chung-Min; Chen, Szu-Chieh

    To assess human lung exposure to airborne aflatoxin B1 (AFB1) during on-farm activities including swine feeding, storage bin cleaning, corn harvest, and grain elevator loading/unloading, we present a probabilistic risk model appraised with empirical data. The model integrates probabilistic exposure profiles from a compartmental lung model with reconstructed dose-response relationships based on an empirical three-parameter Hill equation model, describing AFB1 cytotoxicity for the inhibition response in human bronchial epithelial cells, to quantitatively estimate the inhalation exposure risks. The risk assessment results indicate that exposure to airborne AFB1 may pose no significant risk during corn harvest and grain elevator loading/unloading activities, yet a relatively high risk during swine feeding and storage bin cleaning. Applying a joint probability function method based on exceedence profiles, we estimate that a potentially high risk for the bronchial region (inhibition = 56.69% with 95% confidence interval (CI): 35.05-72.87%) and the bronchiolar region (inhibition = 44.93% with 95% CI: 21.61-66.78%) is alarming during the swine feeding activity. The parameterized predictive model should encourage a risk-management framework for the discussion of carcinogenic risk in occupational settings where inhalation of AFB1-contaminated dust occurs.
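
    A three-parameter Hill dose-response has the form response = top * c^n / (EC50^n + c^n); a tiny sketch with invented parameter values (not the fitted AFB1 values from the study):

```python
def hill_inhibition(c, top, ec50, n):
    """Three-parameter Hill model: inhibition response (%) at dose c."""
    return top * c**n / (ec50**n + c**n)

# Hypothetical parameters, for illustration only: % max inhibition,
# half-maximal dose (ng/mL), Hill slope.
top, ec50, n = 100.0, 20.0, 1.5
for dose in (5.0, 20.0, 80.0):
    print(dose, round(hill_inhibition(dose, top, ec50, n), 1))
```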

  5. Independent production and Poisson distribution

    International Nuclear Information System (INIS)

    Golokhvastov, A.I.

    1994-01-01

    The well-known factorization of inclusive cross sections in the case of independent production of particles (or clusters, jets, etc.), and the Poisson multiplicity distribution concluded from it, do not follow from probability theory in any way. Applying the theorem on the product of independent probabilities accurately, quite different equations are obtained, and no consequences concerning multiplicity distributions follow. 11 refs

  6. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto

    2017-05-16

    In this article, the first hitting times of generalized Poisson processes N_f(t), related to Bernstein functions f, are studied. For the space-fractional Poisson processes N_α(t), t > 0 (corresponding to f = x^α), the hitting probabilities P{T_k^α < ∞} are explicitly obtained and analyzed. The processes N_f(t) are time-changed Poisson processes N(H_f(t)) with subordinators H_f(t), and here we study N(Σ_{j=1}^n H_{f_j}(t)) and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form N(G_{H,ν}(t)), where G_{H,ν}(t) are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.

  7. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  8. Development of system based code for integrity of FBR. Fundamental probabilistic approach, Part 1: Model calculation of creep-fatigue damage (Research report)

    International Nuclear Information System (INIS)

    Kawasaki, Nobuchika; Asayama, Tai

    2001-09-01

    Both reliability and safety have to be further improved for the successful commercialization of FBRs. At the same time, construction and operation costs need to be reduced to the same level as those of future LWRs. To achieve compatibility among reliability, safety, and cost, the Structural Mechanics Research Group in JNC started the development of a System Based Code for Integrity of FBR. This code extends the present structural design standard to include the areas of fabrication, installation, plant system design, safety design, operation and maintenance, and so on. A quantitative index is necessary to connect the different partial standards in this code, and failure probability is considered as a candidate index. We therefore decided to make a model calculation using failure probability and judge its applicability. We first investigated other probabilistic standards, such as ASME Code Case N-578. A probabilistic approach to structural integrity evaluation was created based on these results, and an evaluation flow was also proposed. Following this flow, a model calculation of creep-fatigue damage was performed for a vessel in a sodium-cooled FBR. As a result of this model calculation, the crack initiation probability and the crack penetration probability were found to be effective indices. Lastly, we discuss the merits of this System Based Code, which are presented in this report, together with future development tasks. (author)

  9. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    Science.gov (United States)

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  10. Efficient triangulation of Poisson-disk sampled point sets

    KAUST Repository

    Guo, Jianwei

    2014-05-06

    In this paper, we present a simple yet efficient algorithm for triangulating a 2D input domain containing a Poisson-disk sampled point set. The proposed algorithm combines a regular grid and a discrete clustering approach to speed up the triangulation. Moreover, our triangulation algorithm is flexible and performs well on more general point sets, such as adaptive, non-maximal Poisson-disk sets. The experimental results demonstrate that our algorithm is robust for a wide range of input domains and achieves significant performance improvements compared to the current state-of-the-art approaches. © 2014 Springer-Verlag Berlin Heidelberg.

  11. A probabilistic and individualized approach for predicting treatment gains: an extension and application to anxiety disordered youth.

    Science.gov (United States)

    Beidas, Rinad S; Lindhiem, Oliver; Brodman, Douglas M; Swan, Anna; Carper, Matthew; Cummings, Colleen; Kendall, Philip C; Albano, Anne Marie; Rynn, Moira; Piacentini, John; McCracken, James; Compton, Scott N; March, John; Walkup, John; Ginsburg, Golda; Keeton, Courtney P; Birmaher, Boris; Sakolsky, Dara; Sherrill, Joel

    2014-01-01

    The objective of this study was to extend the probability of treatment benefit method by adding treatment condition as a stratifying variable, and illustrate this extension of the methodology using the Child and Adolescent Anxiety Multimodal Study data. The probability of treatment benefit method produces a simple and practical way to predict individualized treatment benefit based on pretreatment patient characteristics. Two pretreatment patient characteristics were selected in the production of the probability of treatment benefit charts: baseline anxiety severity, measured by the Pediatric Anxiety Rating Scale, and treatment condition (cognitive-behavioral therapy, sertraline, their combination, and placebo). We produced two charts as exemplars which provide individualized and probabilistic information for treatment response and outcome to treatments for child anxiety. We discuss the implications of the use of the probability of treatment benefit method, particularly with regard to patient-centered outcomes and individualized decision-making in psychology and psychiatry. Copyright © 2013. Published by Elsevier Ltd.

  12. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.

    2015-12-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  13. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  14. A formal statistical approach to representing uncertainty in rainfall-runoff modelling with focus on residual analysis and probabilistic output evaluation - Distinguishing simulation and prediction

    DEFF Research Database (Denmark)

    Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik

    2012-01-01

    evaluation of the modelled output, and we attach particular importance to inspecting the residuals of the model outputs and improving the model uncertainty description. We also introduce the probabilistic performance measures sharpness, reliability and interval skill score for model comparison and for checking the reliability of the confidence bounds. Using point rainfall and evaporation data as input and flow measurements from a sewer system for model conditioning, a state space model is formulated that accounts for three different flow contributions: wastewater from households, fast rainfall-runoff from paved areas, and slow rainfall-dependent infiltration-inflow from unknown sources. We consider two different approaches to evaluate the model output uncertainty: the output error method, which lumps all uncertainty into the observation noise term, and a method based on Stochastic Differential...

  15. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach

    Science.gov (United States)

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A.; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W.; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare the propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences between laboratory and field studies. The organisms evaluated in the simulations, including mayfly larvae, deposit-feeding polychaetes, yellow perch, and the little owl, represented a range of ecological conditions and biotransformation capacities. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption efficiency limitations due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition, and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and the chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation

  16. Probabilistic molecular dynamics evaluation of the stress-strain behavior of polyethylene

    International Nuclear Information System (INIS)

    Stowe, J.Q.; Predecki, P.K.; Laz, P.J.; Burks, B.M.; Kumosa, M.

    2009-01-01

    The primary goal of this study was to utilize molecular dynamics to predict the mechanical behavior of polyethylene. In particular, stress-strain relationships, the Young's modulus and the Poisson ratio were predicted for low-density polyethylene at several molecular weights and polymer configurations, with the number of united CH2 atoms ranging between 500 and 5000. Probabilistic Monte Carlo methods were also used to identify the extent of uncertainty in the mechanical property predictions. In general, asymptotic behavior was observed for the stress and the Young's modulus as the molecular weight of the models increased. At the same time, significant variability, of the order of 1000% of the mean, in the stress-strain relationships and the Young's modulus predictions was observed, especially for the low molecular weight models. The variability in the Young's modulus predictions ranged from 17.9 to 3.2 GPa for the models ranging from 100 to 5000 CH2 atoms. However, it was also found that the mean value of the Young's modulus approached a physically possible value of 194 MPa for the 5000 atom model. Poisson ratio predictions also resulted in significant variability, from 200% to 425% of the mean, and ranged from 0.75 to 1.30. The mean value of the Poisson ratios calculated in this study ranged from 0.32 to 0.44 for the 100 to 5000 atom models, respectively.

  17. Is it possible to predict the presence of colorectal cancer in a blood test? A probabilistic approach method.

    Science.gov (United States)

    Navarro Rodríguez, José Manuel; Gallego Plazas, Javier; Borrás Rocher, Fernando; Calpena Rico, Rafael; Ruiz Macia, José Antonio; Morcillo Ródenas, Miguel Ángel

    2017-10-01

    The assessment of the state of immunosurveillance (the ability of the organism to prevent the development of neoplasias) in the blood has prognostic implications of interest in colorectal cancer. We evaluated and quantified a possible predictive character of the disease in a blood test using a mathematical interaction index of several blood parameters. The predictive capacity of the index to detect colorectal cancer was also assessed. We performed a retrospective case-control study with a comparative analysis of the distribution of blood parameters in 266 patients with colorectal cancer and 266 healthy patients during the period from 2009 to 2013. Statistically significant differences (p < 0.05) were observed between patients with colorectal cancer and the control group in terms of platelet counts, fibrinogen, total leukocytes, neutrophils, systemic immunovigilance indexes (neutrophil to lymphocyte ratio and platelet to lymphocyte ratio), hemoglobin, hematocrit and eosinophil levels. These differences allowed the design of a blood analytical profile that calculates the risk of colorectal cancer. This risk profile can be quantified via a mathematical formula with a probabilistic capacity to identify patients with the highest risk of the presence of colorectal cancer (area under the ROC curve = 0.85). We showed that a colorectal cancer predictive character exists in blood which can be quantified by an interaction index of several blood parameters. The design and development of interaction indexes of blood parameters constitutes an interesting research line for the development and improvement of programs for the screening of colorectal cancer.

  18. A probabilistic approach to combining smart meter and electric vehicle charging data to investigate distribution network impacts

    International Nuclear Information System (INIS)

    Neaimeh, Myriam; Wardle, Robin; Jenkins, Andrew M.; Yi, Jialiang; Hill, Graeme; Lyons, Padraig F.; Hübner, Yvonne; Blythe, Phil T.; Taylor, Phil C.

    2015-01-01

    Highlights:
    • Working with unique datasets of EV charging and smart meter load demand.
    • Distribution networks are not a homogeneous group; they have more capability to accommodate EVs than previously suggested.
    • Spatial and temporal diversity of EV charging demand alleviates the impacts on networks.
    • An extensive recharging infrastructure could enable the connection of additional EVs on constrained distribution networks.
    • Electric utilities could increase network capability to accommodate EVs by investing in recharging infrastructure.
    Abstract: This work uses a probabilistic method to combine two unique datasets of real-world electric vehicle charging profiles and residential smart meter load demand. The data were used to study the impact of the uptake of Electric Vehicles (EVs) on electricity distribution networks. Two real networks, representing an urban and a rural area, and a generic network representative of a heavily loaded UK distribution network were used. The findings show that distribution networks are not a homogeneous group: their capability to accommodate EVs varies, and it is greater than previous studies have suggested. Consideration of the spatial and temporal diversity of EV charging demand has been demonstrated to reduce the estimated impacts on the distribution networks. It is suggested that distribution network operators could collaborate with new market players, such as charging infrastructure operators, to support the roll-out of an extensive charging infrastructure in a way that makes the network more robust, creates more opportunities for demand side management, and reduces the planning uncertainties associated with the stochastic nature of EV charging demand.

  19. Probabilistic approach for assessing infants' health risks due to ingestion of nanoscale silver released from consumer products.

    Science.gov (United States)

    Pang, Chengfang; Hristozov, Danail; Zabeo, Alex; Pizzol, Lisa; Tsang, Michael P; Sayre, Phil; Marcomini, Antonio

    2017-02-01

    Silver nanoparticles (n-Ag) are widely used in consumer products and many medical applications because of their unique antibacterial properties. Their use is raising concern about potential human exposures and health effects. It is therefore informative to assess the potential human health risks of n-Ag in order to ensure that nanotechnology-based consumer products are deployed in a safe and sustainable way. Even though toxicity studies clearly show the potential hazard of n-Ag, there have been few attempts to integrate hazard and exposure assessments to evaluate risks. The underlying reason for this is the difficulty of characterizing exposure and the lack of the toxicity studies essential for human health risk assessment (HHRA). Such data gaps introduce significant uncertainty into the risk assessment process. This study uses probabilistic methods to assess the relative uncertainty and potential risks of n-Ag exposure to infants. In this paper, we estimate the risks for infants potentially exposed to n-Ag through drinking juice or milk from sippy cups or licking baby blankets containing n-Ag. We explicitly evaluate the uncertainty and variability contained in the available dose-response and exposure data in order to make the risk characterization process transparent. Our results showed that the individual margins of exposure for oral exposure to sippy cups and baby blankets containing n-Ag indicated minimal risk. Copyright © 2016. Published by Elsevier Ltd.
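
    The margin of exposure (MoE) is the ratio of a reference hazard level to an estimated dose; a hedged Monte Carlo sketch with entirely hypothetical numbers (not the study's dose-response or exposure data):

```python
import random

random.seed(3)
noael = 0.1    # hypothetical reference dose, mg/kg bw/day (not from the study)
n = 50_000
# Hypothetical lognormal daily oral n-Ag dose for an infant, mg/kg bw/day
doses = [random.lognormvariate(-11.0, 1.5) for _ in range(n)]
moes = [noael / d for d in doses]
p_low = sum(m < 100 for m in moes) / n   # 100: a conventional adequacy threshold
print(f"P(margin of exposure < 100) = {p_low:.4f}")
```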

  20. Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach.

    Science.gov (United States)

    Badde, Stephanie; Heed, Tobias; Röder, Brigitte

    2016-04-01

    To act upon a tactile stimulus its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.

  1. Is it possible to predict the presence of colorectal cancer in a blood test?: a probabilistic approach method

    Directory of Open Access Journals (Sweden)

    José Manuel Navarro-Rodríguez

    Full Text Available Introduction: The assessment of the state of immunosurveillance (the ability of the organism to prevent the development of neoplasias) in the blood has prognostic implications of interest in colorectal cancer. We evaluated and quantified a possible predictive character of the disease in a blood test using a mathematical interaction index of several blood parameters. The predictive capacity of the index to detect colorectal cancer was also assessed. Methods: We performed a retrospective case-control study of a comparative analysis of the distribution of blood parameters in 266 patients with colorectal cancer and 266 healthy patients during the period from 2009 to 2013. Results: Statistically significant differences (p < 0.05) were observed between patients with colorectal cancer and the control group in terms of platelet counts, fibrinogen, total leukocytes, neutrophils, systemic immunovigilance indexes (neutrophil to lymphocyte ratio and platelet to lymphocyte ratio), hemoglobin, hematocrit and eosinophil levels. These differences allowed the design of a blood analytical profile that calculates the risk of colorectal cancer. This risk profile can be quantified via a mathematical formula with a probabilistic capacity to identify patients with the highest risk of the presence of colorectal cancer (area under the ROC curve = 0.85). Conclusions: We showed that a colorectal cancer predictive character exists in blood which can be quantified by an interaction index of several blood parameters. The design and development of interaction indexes of blood parameters constitutes an interesting research line for the development and improvement of programs for the screening of colorectal cancer.

  2. A probabilistic approach to the assessment of some life history pattern parameters in a Middle Pleistocene human population.

    Science.gov (United States)

    Durand, A I; Ipina, S L; Bermúdez de Castro, J M

    2000-06-01

    Parameters of a Middle Pleistocene human population such as the expected length of the female reproductive period (E(Y)), the expected interbirth interval (E(X)), the survival rate (τ) for females after the expected reproductive period, the rate (φ₂) of women who, given that they reach first birth, do not survive to the end of the expected reproductive period, and the female infant plus juvenile mortality rate (φ₁) have been assessed from a probabilistic standpoint, provided that such a population were stationary. The hominid sample studied, from the Sima de los Huesos (SH) cave site, Sierra de Atapuerca (Spain), is the most exhaustive human fossil sample currently available. Results suggest that the Atapuerca (SH) sample can derive from a stationary population. Further, in the case that the expected reproductive period ends between 37 and 40 yr of age, then 24 ≲ E(Y) ≲ 27 yr, E(X) = 3 yr, 0.224

  3. Poisson sigma model with branes and hyperelliptic Riemann surfaces

    International Nuclear Information System (INIS)

    Ferrario, Andrea

    2008-01-01

    We derive the explicit form of the superpropagators in the presence of general boundary conditions (coisotropic branes) for the Poisson sigma model. This generalizes the results presented by Cattaneo and Felder ["A path integral approach to the Kontsevich quantization formula," Commun. Math. Phys. 212, 591 (2000)] and Cattaneo and Felder ["Coisotropic submanifolds in Poisson geometry and branes in the Poisson sigma model," Lett. Math. Phys. 69, 157 (2004)] for Kontsevich's angle function [Kontsevich, M., "Deformation quantization of Poisson manifolds I," e-print arXiv:hep-th/0101170] used in the deformation quantization program of Poisson manifolds. The relevant superpropagators for n branes are defined as gauge-fixed homotopy operators of a complex of differential forms on n-sided polygons P_n with particular "alternating" boundary conditions. In the presence of more than three branes we use first-order Riemann theta functions with odd singular characteristics on the Jacobian variety of a hyperelliptic Riemann surface (canonical setting). In genus g the superpropagators present g zero-mode contributions.

  4. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  5. Risk-based probabilistic approach to assess the impact of false mussel invasions on farmed hard clams.

    Science.gov (United States)

    Liao, Chung-Min; Ju, Yun-Ru; Chio, Chia-Pin; Chen, Wei-Yu

    2010-02-01

    The purpose of this article is to provide a risk-based predictive model to assess the impact of false mussel Mytilopsis sallei invasions on hard clam Meretrix lusoria farms in the southwestern region of Taiwan. The actual spread of invasive false mussel was predicted by using analytical models based on advection-diffusion and gravity models. The proportion of hard clam colonized and infestation by false mussel were used to characterize risk estimates. A mortality model was parameterized to assess hard clam mortality risk characterized by false mussel density and infestation intensity. The published data were reanalyzed to parameterize a predictive threshold model described by a cumulative Weibull distribution function that can be used to estimate the exceeding thresholds of proportion of hard clam colonized and infestation. Results indicated that the infestation thresholds were 2-17 ind clam(-1) for adult hard clams, whereas 4 ind clam(-1) for nursery hard clams. The average colonization thresholds were estimated to be 81-89% for cultivated and nursery hard clam farms, respectively. Our results indicated that false mussel density and infestation, which caused 50% hard clam mortality, were estimated to be 2,812 ind m(-2) and 31 ind clam(-1), respectively. This study further indicated that hard clam farms that are close to the coastal area have at least 50% probability for 43% mortality caused by infestation. This study highlighted that a probabilistic risk-based framework characterized by probability distributions and risk curves is an effective representation of scientific assessments for farmed hard clam in response to the nonnative false mussel invasion.
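
    The predictive threshold model above is described as a cumulative Weibull distribution. A hedged sketch, with the scale chosen so that roughly 50% mortality falls near the reported 31 ind clam⁻¹; the shape parameter is invented:

```python
import math

def weibull_mortality(x, scale, shape):
    """Cumulative Weibull dose-response: expected mortality fraction at intensity x."""
    return 1.0 - math.exp(-((x / scale) ** shape))

# Hypothetical shape; scale chosen so the median (~50% mortality) falls near
# the reported 31 ind/clam infestation intensity.
shape = 2.0
scale = 31.0 / math.log(2.0) ** (1.0 / shape)   # ~37.2
for intensity in (5, 15, 31, 60):
    print(intensity, round(weibull_mortality(intensity, scale, shape), 2))
```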

  6. Parasites et parasitoses des poissons

    OpenAIRE

    De Kinkelin, Pierre; Morand, Marc; Hedrick, Ronald; Michel, Christian

    2014-01-01

    This richly illustrated book offers a representative panorama of the parasitic agents encountered in fish. Drawing on the new conceptions of phylogenetic classification, it emphasizes the biological properties, the epidemiology, and the clinical consequences of the groups of organisms involved, in the light of the cognitive advances made possible by the new tools of biology. It is intended for a broad audience, ranging from the world of aquaculture to those of health...

  7. Dualizing the Poisson summation formula.

    Science.gov (United States)

    Duffin, R J; Weinberger, H F

    1991-01-01

    If f(x) and g(x) are a Fourier cosine transform pair, then the Poisson summation formula can be written as \(2\sum_{n=1}^{\infty} g(n) + g(0) = 2\sum_{n=1}^{\infty} f(n) + f(0)\). The concepts of linear transformation theory lead to the following dual of this classical relation. Let \(\phi(x)\) and \(\gamma(x) = \phi(1/x)/x\) have absolutely convergent integrals over the positive real line. Let \(F(x) = \sum_{n=1}^{\infty} \phi(n/x)/x - \int_0^{\infty} \phi(t)\,dt\) and \(G(x) = \sum_{n=1}^{\infty} \gamma(n/x)/x - \int_0^{\infty} \gamma(t)\,dt\). Then F(x) and G(x) are a Fourier cosine transform pair. We term F(x) the "discrepancy" of \(\phi\) because it is the error in estimating the integral of \(\phi\) by its Riemann sum with the constant mesh spacing 1/x. PMID:11607208
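
    A quick numerical check of the classical Poisson summation identity quoted above, using a Gaussian pair; the convention g(y) = 2∫₀^∞ f(x) cos(2πxy) dx is assumed, and the dual "discrepancy" relation itself is not exercised here:

```python
import math

A = 2.0   # Gaussian width parameter (any a > 0 works)

def f(x):
    # f(x) = exp(-pi*A*x^2); under the cosine-transform convention
    # g(y) = 2 * Int_0^inf f(x) cos(2*pi*x*y) dx, its transform is
    # g(y) = A**-0.5 * exp(-pi*y^2/A).
    return math.exp(-math.pi * A * x * x)

def g(y):
    return math.exp(-math.pi * y * y / A) / math.sqrt(A)

lhs = f(0) + 2 * sum(f(n) for n in range(1, 50))
rhs = g(0) + 2 * sum(g(n) for n in range(1, 50))
print(lhs, rhs)   # both ~1.0037349, equal to machine precision
```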

  8. Singular reduction of Nambu-Poisson manifolds

    Science.gov (United States)

    Das, Apurba

    The version of the Marsden-Ratiu Poisson reduction theorem for Nambu-Poisson manifolds by a regular foliation has been studied by Ibáñez et al. In this paper, we show that this reduction procedure can be extended to the singular case. Under a suitable notion of Hamiltonian flow on the reduced space, we show that a set of Hamiltonians on a Nambu-Poisson manifold can also be reduced.

  9. Probabilistic model for sterilization of food

    International Nuclear Information System (INIS)

    Chepurko, V.V.; Malinovskij, O.V.

    1986-01-01

    The probabilistic model for radiation sterilization is proposed based on the following suppositions: (1) the initial contamination of a volume unit of the sterilized product, m, is described by the probability distribution q(m); (2) the inactivation of a population of m microorganisms is approximated by a Bernoulli trial scheme; and (3) the contamination of units of the sterilized product is independent. The possibility of approximating q(m) by a Poisson distribution is demonstrated. Diagrams are presented that permit evaluation of the dose which provides the defined reliability of sterilization of food for chicken-gnotobionts.
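
    Under these suppositions, Poisson thinning gives a closed form for the sterility probability; a hedged sketch with illustrative parameters (not the paper's dose-response data), including a Monte Carlo check:

```python
import math
import random

def sterility_probability(lam, p_survive):
    """Poisson(lam) initial contamination, thinned by independent survival
    with probability p_survive, leaves Poisson(lam * p_survive) survivors,
    so P(sterile unit) = exp(-lam * p_survive)."""
    return math.exp(-lam * p_survive)

def poisson_sample(lam, rng):
    """Knuth's product-of-uniforms method; adequate for small lam."""
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(5)
lam, p = 4.0, 0.01       # illustrative contamination level and survival probability
trials = 100_000
sterile = sum(
    all(rng.random() > p for _ in range(poisson_sample(lam, rng)))
    for _ in range(trials)
)
print(sterile / trials, sterility_probability(lam, p))   # both ~0.9608
```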

  10. A Bayesian Approach to Integrate Real-Time Data into Probabilistic Risk Analysis of Remediation Efforts in NAPL Sites

    Science.gov (United States)

    Fernandez-Garcia, D.; Sanchez-Vila, X.; Bolster, D.; Tartakovsky, D. M.

    2010-12-01

    The release of non-aqueous phase liquids (NAPLs) such as petroleum hydrocarbons and chlorinated solvents in the subsurface is a severe source of groundwater and vapor contamination. Because these liquids are essentially immiscible and of low solubility, the contaminants dissolve slowly in groundwater and/or volatilize in the vadose zone, threatening the environment and public health over a long period. Many remediation technologies and strategies have been developed in recent decades for restoring the water quality of these contaminated sites. The failure of an on-site treatment technology is often due to the unnoticed presence of dissolved NAPL entrapped in low-permeability areas (heterogeneity) and/or to substantial amounts of pure phase remaining after remediation efforts. Full understanding of the impact of remediation efforts is complicated by the role of many interlinked physical and biochemical processes taking place through several potential pathways of exposure to multiple receptors in a highly uncertain heterogeneous environment. Due to these difficulties, the design of remediation strategies and the definition of remediation endpoints have traditionally been determined without quantifying the risk associated with the failure of such efforts. We conduct a probabilistic risk analysis (PRA) of the likelihood of success of an on-site NAPL treatment technology that easily integrates all aspects of the problem (causes, pathways, and receptors) without extensive modeling. Importantly, the method is further capable of incorporating the inherent uncertainty that often exists in the exact location where the dissolved NAPL plume leaves the source zone. This is achieved by describing the failure of the system as a function of this source zone exit location, parameterized in terms of a vector of parameters. Using a Bayesian interpretation of the system and by means of the posterior multivariate distribution, the failure of the

  11. Probabilistic Connections for Bidirectional Path Tracing

    OpenAIRE

    Popov, Stefan; Ramamoorthi, Ravi; Durand, Fredo; Drettakis, George

    2015-01-01

    [Figure 1 caption] Our Probabilistic Connections for Bidirectional Path Tracing approach importance samples connections to an eye sub-path, and greatly reduces variance, by considering and reusing multiple light sub-paths at once. Our approach achieves much higher quality than bidirectional path tracing for the same computation time (~8.4 min). Abstract: Bidirectional path tr...

  12. A Seemingly Unrelated Poisson Regression Model

    OpenAIRE

    King, Gary

    1989-01-01

    This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.
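
    As context for the two-equation SUPREME estimator, here is a hedged sketch of the plain single-equation Poisson regression it builds on, fitted by Newton-Raphson on simulated data (illustrative only, not the SUPREME procedure):

```python
import numpy as np

def poisson_regression(X, y, iters=25):
    """Single-equation Poisson regression, y_i ~ Poisson(exp(x_i @ beta)),
    fitted by Newton-Raphson (equivalently, IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)              # score vector
        hess = X.T @ (X * mu[:, None])     # Fisher information
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_beta = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ true_beta))
print(poisson_regression(X, y))   # close to [0.5, 0.8]
```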

  13. Associative and Lie deformations of Poisson algebras

    OpenAIRE

    Remm, Elisabeth

    2011-01-01

    Considering a Poisson algebra as a nonassociative algebra satisfying the Markl-Remm identity, we study deformations of Poisson algebras as deformations of this nonassociative algebra. This gives a natural interpretation of the deformations that preserve the underlying associative structure, and we also study deformations that preserve the underlying Lie algebra.

  14. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.

  15. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of the transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating the random variables in the system using pseudo-random number generators. The second approach to probabilistic use-of-system costing is based on numerical convolution and a multi-area representation of the transmission network. (UK)

  16. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach, and some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with the development of probabilistic methods, of more realistic causal failure mode models, and of statistical data models. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  17. 77 FR 38856 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Science.gov (United States)

    2012-06-29

    ... philosophy is interpreted and implemented consistently. To the extent that other regulatory guidance refers... provides guidance on an approach the NRC finds acceptable for analyzing issues associated with proposed... DG-1285, (above) which provides guidance on evaluating proposed changes to a plant's licensing basis...

  18. Probabilistic Approaches to Examining Linguistic Features of Test Items and Their Effect on the Performance of English Language Learners

    Science.gov (United States)

    Solano-Flores, Guillermo

    2014-01-01

    This article addresses validity and fairness in the testing of English language learners (ELLs)--students in the United States who are developing English as a second language. It discusses limitations of current approaches to examining the linguistic features of items and their effect on the performance of ELL students. The article submits that…

  19. Probabilistic modelling in urban drainage – two approaches that explicitly account for temporal variation of model errors

    DEFF Research Database (Denmark)

    Löwe, Roland; Del Giudice, Dario; Mikkelsen, Peter Steen

    of input uncertainties observed in the models. The explicit inclusion of such variations in the modelling process will lead to a better fulfilment of the assumptions made in formal statistical frameworks, thus reducing the need to resort to informal methods. The two approaches presented here...

  20. Computation of a coastal protection, using the classical method, the PIANC method or a full probabilistic approach?

    NARCIS (Netherlands)

    Verhagen, H.J.

    2003-01-01

    In the classical design approach to breakwaters, a design wave height is determined and inserted into a design formula, and some undefined safety margin is added. In the method using partial safety coefficients (as developed by PIANC [1992] and recently also adopted by the Coastal Engineering Manual of the US

  1. Probabilistic approach to decision making under uncertainty during volcanic crises. Retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands

    Science.gov (United States)

    Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen

    2014-05-01

    Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely-populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists should communicate with decision makers. Here we present a new model, BADEMO (Bayesian Decision Model), that applies a general and flexible probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.
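
    BADEMO weighs eruption probabilities against vulnerabilities and the costs of false alarms and failed forecasts; the classic cost-loss rule below is a minimal, hypothetical illustration of that trade-off, not the model itself:

```python
def best_action(p_eruption, cost_mitigation, loss_if_unmitigated):
    """Minimal cost-loss rule: act when the expected avoided loss exceeds
    the cost of acting, i.e. when p_eruption > cost/loss."""
    expected = {
        "mitigate": cost_mitigation,
        "do nothing": p_eruption * loss_if_unmitigated,
    }
    return min(expected, key=expected.get), expected

# Hypothetical figures (millions): evacuating costs 5; an eruption
# without evacuation causes losses of 80.
for p in (0.02, 0.05, 0.20):
    action, costs = best_action(p, 5.0, 80.0)
    print(f"P(eruption)={p:.2f}: {action}  {costs}")
```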

  2. A Decision Support System Coupling Fuzzy Logic and Probabilistic Graphical Approaches for the Agri-Food Industry: Prediction of Grape Berry Maturity.

    Science.gov (United States)

    Perrot, Nathalie; Baudrit, Cédric; Brousset, Jean Marie; Abbal, Philippe; Guillemin, Hervé; Perret, Bruno; Goulet, Etienne; Guerin, Laurence; Barbeau, Gérard; Picque, Daniel

    2015-01-01

    Agri-food is one of the most important sectors of industry and a major contributor to the global warming potential in Europe. Sustainability issues pose a huge challenge for this sector. In this context, a major issue is being able to predict the multiscale dynamics of these systems using computing science. A robust predictive mathematical tool is implemented for this sector and applied to the wine industry, and it can easily be generalized to other applications. Grape berry maturation relies on complex and coupled physicochemical and biochemical reactions which are climate dependent. Moreover, one experiment represents one year, so climate variability cannot be covered by experiments alone. Consequently, harvest timing mostly relies on expert predictions. A major challenge for the wine industry is nevertheless to be able to anticipate these reactions for sustainability purposes. We propose a decision support system called FGRAPEDBN that is able to (1) capitalize on the available heterogeneous, fragmented knowledge, including data and expertise, and (2) predict the sugar (resp. acidity) concentrations with a relevant RMSE of 7 g/l (resp. 0.44 g/l and 0.11 g/kg). FGRAPEDBN is based on a coupling between a probabilistic graphical approach and a fuzzy expert system.

  3. Adequacy of the default values for skin surface area used for risk assessment and French anthropometric data by a probabilistic approach.

    Science.gov (United States)

    Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C

    2017-08-01

    The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) are a document dedicated to ensuring the safety of European consumers. They contain useful data for risk assessment, such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variations. The default SSA value was derived from a study on the Dutch population, which is known to be one of the tallest nations in the world. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data, together with analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to the SCCS, suggests that the default SSA value used in quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration.
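
    The probabilistic argument can be sketched as a short Monte Carlo comparison. The distribution parameters below are invented for illustration, not taken from the French survey.

    ```python
    # How much of a population falls below a fixed default skin surface area?
    # (hypothetical anthropometric distribution, made-up parameters)
    import numpy as np

    rng = np.random.default_rng(0)
    default_ssa_cm2 = 17500                                   # fixed default value
    ssa = rng.normal(loc=16000, scale=1500, size=100_000)     # simulated female SSA

    print(f"share below the default: {(ssa < default_ssa_cm2).mean():.1%}")
    ```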

  4. Validating physician-certified verbal autopsy and probabilistic modeling (InterVA) approaches to verbal autopsy interpretation using hospital causes of adult deaths

    Directory of Open Access Journals (Sweden)

    Tsofa Benjamin

    2011-08-01

    Full Text Available Abstract Background The most common method for determining cause of death is certification by physicians, based either on available medical records or, where such data are not available, on verbal autopsy (VA). The physician-certification approach is costly and inconvenient; however, recent work shows the potential of a computer-based probabilistic model (InterVA) to interpret verbal autopsy data in a more convenient, consistent, and rapid way. In this study we validate separately both physician-certified verbal autopsy (PCVA) and the InterVA probabilistic model against hospital cause of death (HCOD) in adults dying in a district hospital on the coast of Kenya. Methods Between March 2007 and June 2010, VA interviews were conducted for 145 adult deaths that occurred at Kilifi District Hospital. The VA data were reviewed by a physician and the cause of death established. A range of indicators (including age, gender, physical signs and symptoms, pregnancy status, medical history, and the circumstances of death) from the VA forms were included in the InterVA for interpretation. Cause-specific mortality fractions (CSMF), Cohen's kappa (κ) statistic, receiver operating characteristic (ROC) curves, sensitivity, specificity, and positive predictive values were applied to compare agreement between PCVA, InterVA, and HCOD. Results HCOD, InterVA, and PCVA yielded the same top five underlying causes of adult deaths. The InterVA overestimated tuberculosis as a cause of death compared to the HCOD. On the other hand, PCVA overestimated diabetes. Overall, CSMF for the five major cause groups by the InterVA, PCVA, and HCOD were 70%, 65%, and 60%, respectively. PCVA versus HCOD yielded a higher kappa value (κ = 0.52, 95% confidence interval [CI]: 0.48, 0.54) than the InterVA versus HCOD, which yielded a kappa value of 0.32 (95% CI: 0.30, 0.38). Overall, κ agreement across the three methods was 0.41 (95% CI: 0.37, 0.48). The areas under the ROC curves were 0

  5. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  6. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  7. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    International Nuclear Information System (INIS)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I.; Lee, J. S.; Lee, D. S.; Kwon, J. S.; Kim, J. J.

    2003-01-01

    Probabilistic anatomical maps are used to localize functional neuro-images and morphological variability. A quantitative indicator is very important for inquiring into the anatomical position of an activated region, because functional image data have a low-resolution nature and no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize such data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental brains. In this study, we develop a probabilistic anatomical map for the Korean normal brain. 75 normal brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GE SIGNA scanner. Then, a standard brain is selected in the group by a clinician searching for a brain with average properties in the Talairach coordinate system. With the standard brain, an anatomist delineates 89 regions of interest (ROI), parcellating cortical and subcortical areas. The parcellated ROIs of the standard brain are warped and overlapped onto each brain by maximizing intensity similarity, and every brain is automatically labeled with the registered ROIs. Each same-labeled region is linearly normalized to the standard brain, and the occurrence of each region is counted. Finally, 89 probabilistic ROI volumes are generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of the Korean normal brain. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.

  8. On covariant Poisson brackets in classical field theory

    International Nuclear Information System (INIS)

    Forger, Michael; Salles, Mário O.

    2015-01-01

    How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows us to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.

  9. On covariant Poisson brackets in classical field theory

    Energy Technology Data Exchange (ETDEWEB)

    Forger, Michael [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Salles, Mário O. [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Centro de Ciências Exatas e da Terra, Universidade Federal do Rio Grande do Norte, Campus Universitário – Lagoa Nova, BR–59078-970 Natal, RN (Brazil)

    2015-10-15

    How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows us to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.

  10. Constructions and classifications of projective Poisson varieties

    Science.gov (United States)

    Pym, Brent

    2018-03-01

    This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.

  11. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications, ranging from artificial intelligence, security and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  12. Towards a High Reliable Enforcement of Safety Regulations - A Workflow Meta Data Model and Probabilistic Failure Management Approach

    Directory of Open Access Journals (Sweden)

    Heiko Henning Thimm

    2016-10-01

    Full Text Available Today’s companies are able to automate the enforcement of Environmental, Health and Safety (EH&S) duties through the use of workflow management technology. This approach requires specifying activities that are combined into workflow models for EH&S enforcement duties. In order to meet given safety regulations, these activities are to be completed correctly and within given deadlines. Otherwise, activity failures emerge, which may lead to breaches of safety regulations. A novel domain-specific workflow meta data model is proposed. The model enables a system to detect and predict activity failures through the use of data about the company, failure statistics, and activity proxies. Since the detection and prediction methods are based on the evaluation of constraints specified on EH&S regulations, a system approach is proposed that builds on the integration of a Workflow Management System (WMS) with an EH&S Compliance Information System. Main principles of the failure detection and prediction are described. For EH&S managers the system shall provide insights into the current failure situation. This can help to prevent and mitigate critical situations such as safety enforcement measures that are behind their deadlines. As a result a more reliable enforcement of safety regulations can be achieved.

  13. Encoding Probabilistic Brain Atlases Using Bayesian Inference

    OpenAIRE

    Van Leemput, Koen

    2008-01-01

    This paper addresses the problem of creating probabilistic brain atlases from manually labeled training data. Probabilistic atlases are typically constructed by counting the relative frequency of occurrence of labels in corresponding locations across the training images. However, such an “averaging” approach generalizes poorly to unseen cases when the number of training images is limited, and provides no principled way of aligning the training datasets using deformable registration. In this p...

  14. Poisson's ratio and Young's modulus of lipid bilayers in different phases

    Directory of Open Access Journals (Sweden)

    Tayebeh eJadidi

    2014-04-01

    Full Text Available A general computational method is introduced to estimate the Poisson's ratio for membranes with small thickness. In this method, the Poisson's ratio is calculated by utilizing a rescaling of inter-particle distances in one lateral direction under periodic boundary conditions. As an example, for the coarse-grained lipid model introduced by Lenz and Schmid, we calculate the Poisson's ratio in the gel, fluid, and interdigitated phases. Having the Poisson's ratio enables us to obtain the Young's modulus for the membranes in different phases. The approach may be applied to other membranes such as graphene and tethered membranes in order to predict the temperature dependence of their Poisson's ratios and Young's moduli.

  15. Estimates of dietary exposure to bisphenol A (BPA) from light metal packaging using food consumption and packaging usage data: a refined deterministic approach and a fully probabilistic (FACET) approach.

    Science.gov (United States)

    Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J

    2014-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the
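
    The structure of such a probabilistic exposure run is straightforward to sketch, even though FACET itself combines far richer inputs. Every distribution below is invented for illustration; only the shape of the calculation (migration level x contact area x consumption / body weight, then percentiles) reflects the approach described above.

    ```python
    # Toy probabilistic dietary exposure run (all inputs are placeholders).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    portions = rng.poisson(1.2, size=n)                      # canned portions per day
    migration = rng.uniform(5e-5, 0.012, size=n)             # mg/dm^2, range as above
    contact_dm2 = rng.uniform(0.5, 4.0, size=n)              # can area per portion
    body_weight = rng.normal(70, 12, size=n).clip(40, 130)   # kg

    dose = portions * migration * contact_dm2 / body_weight  # mg/kg bw/day
    print(f"mean: {dose.mean():.2e}, P97.5: {np.percentile(dose, 97.5):.2e}")
    ```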

  16. Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.

    Science.gov (United States)

    Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah

    2012-01-01

    Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression.
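
    For readers unfamiliar with the COM-Poisson distribution, its probability mass function is easy to evaluate numerically; the only awkward piece is the normalizing constant Z(lambda, nu), an infinite series that the sketch below simply truncates. This is a generic illustration, not the estimation method assessed in the article.

    ```python
    # COM-Poisson pmf with a truncated, log-sum-exp normalizing constant.
    import math

    def com_poisson_pmf(y, lam, nu, terms=200):
        log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(terms)]
        m = max(log_terms)
        log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
        return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

    # nu < 1: over-dispersion; nu > 1: under-dispersion; nu = 1: plain Poisson
    total = sum(com_poisson_pmf(y, lam=3.0, nu=0.7) for y in range(100))
    print(round(total, 6))  # ~1.0, so the truncation is adequate here
    ```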

  17. The Poisson equation on Klein surfaces

    Directory of Open Access Journals (Sweden)

    Monica Rosiu

    2016-04-01

    Full Text Available We obtain a formula for the solution of the Poisson equation with Dirichlet boundary condition on a region of a Klein surface. This formula reveals the symmetric character of the solution.

  18. Poisson point processes imaging, tracking, and sensing

    CERN Document Server

    Streit, Roy L

    2010-01-01

    This overview of non-homogeneous and multidimensional Poisson point processes and their applications features mathematical tools and applications from emission- and transmission-computed tomography to multiple target tracking and distributed sensor detection.

  19. A probabilistic risk assessment approach used to prioritize chemical constituents in mainstream smoke of cigarettes sold in China.

    Science.gov (United States)

    Xie, Jianping; Marano, Kristin M; Wilson, Cody L; Liu, Huimin; Gan, Huamin; Xie, Fuwei; Naufal, Ziad S

    2012-03-01

    The chemical and physical complexity of cigarette mainstream smoke (MSS) presents a challenge in the understanding of risk for smoking-related diseases. Quantitative risk assessment is a useful tool for assessing the toxicological risks that may be presented by smoking currently available commercial cigarettes. In this study, yields of a selected group of chemical constituents were quantified in machine-generated MSS from 30 brands of cigarettes sold in China. Using constituent yields, exposure estimates specific to and representative of the Chinese population, and available dose-response data, a Monte Carlo method was applied to simulate probability distributions for incremental lifetime cancer risk (ILCR), hazard quotient (HQ), and margin of exposure (MOE) values for each constituent as appropriate. Measures of central tendency were extracted from the outcome distributions and constituents were ranked according to these three risk assessment indices (e.g., ILCR > 10(-4), HQ > 1, and low MOE values). By quantifying the risk contributed by each MSS constituent, this approach provides a plausible and objective framework for the prioritization of toxicants in cigarette smoke and is valuable in guiding tobacco risk management.

  20. Early detection of production deficit hot spots in semi-arid environment using FAPAR time series and a probabilistic approach

    Science.gov (United States)

    Meroni, M.; Fasbender, D.; Kayitakire, F.; Pini, G.; Rembold, F.; Urbano, F.; Verstraete, M. M.

    2013-12-01

    Timely information on vegetation development at regional scale is needed in arid and semiarid African regions where rainfall variability leads to high inter-annual fluctuations in crop and pasture productivity, as well as to high risk of food crisis in the presence of severe drought events. The present study aims at developing and testing an automatic procedure to estimate the probability of experiencing a seasonal biomass production deficit solely on the basis of historical and near real-time remote sensing observations. The method is based on the extraction of vegetation phenology from SPOT-VEGETATION time series of the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) and the subsequent computation of seasonally cumulated FAPAR as a proxy for vegetation gross primary production. Within-season forecasts of the overall seasonal performance, expressed in terms of the probability of experiencing a critical deficit, are based on a statistical approach taking into account two factors: i) the similarity between the current FAPAR profile and past profiles observable in the 15-year FAPAR time series; ii) the uncertainty of past predictions of the season outcome, as derived using a jack-knifing technique. The method is applicable at the regional to continental scale and can be updated regularly during the season (whenever a new satellite observation is made available) to provide a synoptic view of the hot spots of likely production deficit. The specific objective of the procedure described here is to deliver to the food security analyst, as early as possible within the season, only the relevant information (e.g., masking out areas without active vegetation at the time of analysis), expressed through a reliable and easily interpretable measure of impending risk. Evaluation of method performance and examples of application in the Sahel region are discussed.
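
    A drastically simplified analogue-year version of the idea can be sketched as follows; the FAPAR values are random placeholders, and the real method's phenology extraction and jack-knife calibration are omitted.

    ```python
    # Weight past seasons by similarity of the profile observed so far and
    # read P(deficit) off their end-of-season outcomes (toy data throughout).
    import numpy as np

    rng = np.random.default_rng(2)
    past = rng.uniform(0.2, 0.8, size=(15, 36))          # 15 seasons x 36 dekads
    current = past[3, :20] + rng.normal(0, 0.02, 20)     # season observed to dekad 20

    seasonal_total = past.sum(axis=1)                    # proxy for production
    critical = np.percentile(seasonal_total, 20)         # deficit = worst quintile

    dist = np.linalg.norm(past[:, :20] - current, axis=1)
    w = np.exp(-dist / dist.mean())                      # similarity weights
    p_deficit = np.sum(w * (seasonal_total < critical)) / w.sum()
    print(f"P(seasonal deficit) ~ {p_deficit:.2f}")
    ```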

  1. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Directory of Open Access Journals (Sweden)

    Gannouni Asmae

    2017-01-01

    closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP.

  2. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure mode and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of special interest to researchers in the field of Operational Research.

  3. The probabilistic approach to the analysis of the limiting behavior of an integro-differential equation depending on a small parameter, and its application to stochastic processes

    Directory of Open Access Journals (Sweden)

    O. V. Borisenko

    1994-01-01

    Full Text Available Using the connection between a stochastic differential equation with a Poisson measure term and its Kolmogorov equation, we investigate the limiting behavior of the solution of the Cauchy problem for an integro-differential equation with coefficients depending on a small parameter. We also study the dependence of the limiting equation on the order of the parameter.

  4. Potential changes in the extreme climate conditions at the regional scale: from observed data to modelling approaches and towards probabilistic climate change information

    International Nuclear Information System (INIS)

    Gachon, P.; Radojevic, M.; Harding, A.; Saad, C.; Nguyen, V.T.V.

    2008-01-01

    downscaled results in this latter case, in future runs. Finally, we conclude on the need to generate ensemble runs to produce probabilistic climate information, and to quantify the cascade of uncertainties arising from the various GCMs and the different downscaling approaches. (author)

  5. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; and recommendations for target reliability levels. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.

  6. Probabilistic seismic hazard assessment of southern part of Ghana

    Science.gov (United States)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2017-12-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, together with seismic hazard assessment results for six cities. The seismic hazard map was prepared for a 10% probability of exceedance of peak ground acceleration in 50 years. The input parameters used for the computation of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study: Allen (for south and eastern Australia), Silva et al. (for central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. The OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
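
    Under the Poisson occurrence model, the "10% in 50 years" design criterion maps directly to an annual exceedance rate and a return period, as the short calculation below shows.

    ```python
    # Return period implied by a 10% exceedance probability in 50 years.
    import math

    p, t = 0.10, 50.0
    annual_rate = -math.log(1.0 - p) / t   # exceedances per year, Poisson model
    print(f"return period: {1.0 / annual_rate:.0f} years")  # ~475
    ```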

  7. Probabilistic Approaches to Energy Systems

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning

    Energy generation from wind and sun is increasing rapidly in many parts of the world. This presents new challenges on how to integrate this uncertain, intermittent and non-dispatchable energy source. This thesis deals with forecasting and decision making in energy systems with a large proportion of renewable energy generation. Particularly, we focus on producing forecasting models that can predict renewable energy generation and single-user demand, and on providing advanced forecast products that are needed for an efficient integration of renewable energy into the power generation mix. Such forecasts can be useful on all levels of the energy system, ranging from the highest level, where the transmission system operator is concerned with minimizing system failures and is aided by wind power forecasts, to the end user of energy, where power price forecasts are useful for users with flexible power demand.

  8. Probabilistic approaches to robotic perception

    CERN Document Server

    Ferreira, João Filipe

    2014-01-01

    This book tries to address the following questions: How should the uncertainty and incompleteness inherent to sensing the environment be represented and modelled in a way that will increase the autonomy of a robot? How should a robotic system perceive, infer, decide and act efficiently? These are two of the challenging questions the robotics community and robotics researchers have been facing. The development of the robotics domain by the 1980s spurred the convergence of automation to autonomy, and the field of robotics has consequently converged towards the field of artificial intelligence (AI). Since the end of that decade, the general public’s imagination has been stimulated by high expectations on autonomy, where AI and robotics try to solve difficult cognitive problems through algorithms developed from either philosophical and anthropological conjectures or incomplete notions of cognitive reasoning. Many of these developments do not unveil even a few of the processes through which biological organisms solve thes...

  9. Dirichlet forms methods for Poisson point measures and Lévy processes with emphasis on the creation-annihilation techniques

    CERN Document Server

    Bouleau, Nicolas

    2015-01-01

    A simplified approach to Malliavin calculus adapted to Poisson random measures is developed and applied in this book. Called the “lent particle method” it is based on perturbation of the position of particles. Poisson random measures describe phenomena involving random jumps (for instance in mathematical finance) or the random distribution of particles (as in statistical physics). Thanks to the theory of Dirichlet forms, the authors develop a mathematical tool for a quite general class of random Poisson measures and significantly simplify computations of Malliavin matrices of Poisson functionals. The method gives rise to a new explicit calculus that they illustrate on various examples: it consists in adding a particle and then removing it after computing the gradient. Using this method, one can establish absolute continuity of Poisson functionals such as Lévy areas, solutions of SDEs driven by Poisson measure and, by iteration, obtain regularity of laws. The authors also give applications to error calcul...

  10. Balkanization and Unification of Probabilistic Inferences

    Science.gov (United States)

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  11. A Comparative Study of Probabilistic Roadmap Planners

    NARCIS (Netherlands)

    Geraerts, R.J.; Overmars, M.H.

    2004-01-01

    The probabilistic roadmap approach is one of the leading motion planning techniques. Over the past eight years the technique has been studied by many different researchers. This has led to a large number of variants of the approach, each with its own merits. It is difficult to compare the different

  12. Probabilistic tractography using Lasso bootstrap.

    Science.gov (United States)

    Ye, Chuyang; Prince, Jerry L

    2017-01-01

    Diffusion magnetic resonance imaging (dMRI) can be used for noninvasive imaging of white matter tracts. Using fiber tracking, which propagates fiber streamlines according to fiber orientations (FOs) computed from dMRI, white matter tracts can be reconstructed for investigation of brain diseases and the brain connectome. Because of image noise, probabilistic tractography has been proposed to characterize uncertainties in FO estimation. Bootstrap provides a nonparametric approach to the estimation of FO uncertainties and residual bootstrap has been used for developing probabilistic tractography. However, recently developed models have incorporated sparsity regularization to reduce the required number of gradient directions to resolve crossing FOs, and the residual bootstrap used in previous methods is not applicable to these models. In this work, we propose a probabilistic tractography algorithm named Lasso bootstrap tractography (LBT) for the models that incorporate sparsity. Using a fixed tensor basis and a sparsity assumption, diffusion signals are modeled using a Lasso formulation. With the residuals from the Lasso model, a distribution of diffusion signals is obtained according to a modified Lasso bootstrap strategy. FOs are then estimated from the synthesized diffusion signals by an algorithm that improves FO estimation by enforcing spatial consistency of FOs. Finally, streamline fiber tracking is performed with the computed FOs. The LBT algorithm was evaluated on simulated and real dMRI data both qualitatively and quantitatively. Results demonstrate that LBT outperforms state-of-the-art algorithms.
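
    The residual-resampling core of a Lasso bootstrap is compact enough to sketch on generic regression data; the snippet below is a schematic stand-in for the LBT pipeline (no dMRI specifics and no spatial consistency step), using an invented sparse signal.

    ```python
    # Lasso bootstrap on synthetic data: fit, resample residuals, refit,
    # and use the spread of the coefficients as an uncertainty estimate.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(3)
    X = rng.normal(size=(90, 60))                    # e.g. directions x basis tensors
    beta = np.zeros(60)
    beta[[5, 17, 42]] = [1.0, 0.7, 0.5]              # sparse ground truth
    y = X @ beta + rng.normal(0, 0.05, size=90)

    base = Lasso(alpha=0.01).fit(X, y)
    resid = y - base.predict(X)

    boot_coefs = []
    for _ in range(200):
        y_star = base.predict(X) + rng.choice(resid, size=len(resid), replace=True)
        boot_coefs.append(Lasso(alpha=0.01).fit(X, y_star).coef_)
    print(np.std(boot_coefs, axis=0)[[5, 17, 42]])   # spread on the true supports
    ```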

  13. High order Poisson Solver for unbounded flows

    DEFF Research Database (Denmark)

    Hejlesen, Mads Mølholm; Rasmussen, Johannes Tophøj; Chatelain, Philippe

    2015-01-01

    This paper presents a high order method for solving the unbounded Poisson equation on a regular mesh using a Green’s function solution. The high order convergence was achieved by formulating mollified integration kernels that were derived from a filter regularisation of the solution field. The method was implemented on a rectangular domain using fast Fourier transforms (FFT) to increase computational efficiency. The Poisson solver was extended to directly solve the derivatives of the solution. This is achieved either by including the differential operator in the integration kernel... We use the equations of fluid mechanics as an example, but the method can be used in many physical problems to solve the Poisson equation on a rectangular unbounded domain. For the two-dimensional case we propose an infinitely smooth test function which allows for arbitrary high order convergence using Gaussian smoothing.
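
    The paper's solver convolves with mollified free-space Green's functions to obtain an unbounded-domain solution; the snippet below shows only the underlying FFT mechanics, in the simpler periodic setting, as a point of reference.

    ```python
    # Spectral solve of laplacian(u) = f on a 2D periodic box (not the
    # unbounded-domain method of the paper, just the FFT plumbing).
    import numpy as np

    n, L = 128, 2 * np.pi
    x = np.linspace(0, L, n, endpoint=False)
    xx, yy = np.meshgrid(x, x)
    f = np.sin(xx) * np.cos(2 * yy)

    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
    kx, ky = np.meshgrid(k, k)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                               # dodge division by zero

    u_hat = -np.fft.fft2(f) / k2
    u_hat[0, 0] = 0.0                            # fix the free mean mode
    u = np.real(np.fft.ifft2(u_hat))
    print(np.max(np.abs(u + f / 5)))             # exact solution is -f/5
    ```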

  14. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N.

  15. Probabilistic risk assessment on maritime spent nuclear fuel transportation (Part II: Ship collision probability)

    International Nuclear Information System (INIS)

    Christian, Robby; Kang, Hyun Gook

    2017-01-01

    This paper proposes a methodology to assess and reduce the risks of maritime spent nuclear fuel transportation with a probabilistic approach. Event trees detailing the progression of collisions leading to transport cask damage were constructed. Parallel and crossing collision probabilities were formulated based on the Poisson distribution. Automatic Identification System (AIS) data were processed with the Hough Transform algorithm to estimate possible intersections between the shipment route and the marine traffic. Monte Carlo simulations were done to compute collision probabilities and impact energies at each intersection. Possible safety improvement measures through a proper selection of operational transport parameters were investigated. These parameters include shipment routes, the ship's cruise velocity, the number of transport casks carried in a shipment, and the casks' stowage configuration and loading order on board the ship. A shipment case study is presented. Waters with high collision probabilities were identified. An effective range of cruising velocity to reduce collision risks was discovered. The number of casks in a shipment and the stowage method that gave low cask damage frequencies were obtained. The proposed methodology was successful in quantifying ship collision and cask damage frequency. It was effective in assisting decision making processes to minimize risks in maritime spent nuclear fuel transportation. - Highlights: • Proposes a probabilistic framework for the safety of spent nuclear fuel transportation by sea. • Developed a marine traffic simulation model using the Generalized Hough Transform (GHT) algorithm. • A transportation case study on South Korean waters is presented. • A single-vessel risk reduction method is outlined by optimizing transport parameters.
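
    The Poisson formulation of the collision step reduces, per route segment, to an encounter rate and the complement of the no-event probability; the numbers below are placeholders, not values from the study.

    ```python
    # P(at least one collision candidate on a leg) under a Poisson model.
    import math

    encounters_per_km = 8e-4     # hypothetical AIS-derived traffic density
    leg_length_km = 120.0
    colliding_fraction = 0.05    # encounters on a colliding course (assumed)

    lam = encounters_per_km * leg_length_km * colliding_fraction
    print(f"P(>=1 collision candidate) = {1.0 - math.exp(-lam):.2e}")
    ```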

  16. Probabilistic Tsunami Hazard in the Northeast Atlantic from Near- and Far-Field Tectonic Sources

    Science.gov (United States)

    Omira, R.; Baptista, M. A.; Matias, L.

    2015-03-01

    In this article, we present the first study on probabilistic tsunami hazard assessment for the Northeast (NE) Atlantic region related to earthquake sources. The methodology combines the probabilistic seismic hazard assessment, tsunami numerical modeling, and statistical approaches. We consider three main tsunamigenic areas, namely the Southwest Iberian Margin, the Gloria, and the Caribbean. For each tsunamigenic zone, we derive the annual recurrence rate for each magnitude range, from Mw 8.0 up to Mw 9.0, with a regular interval, using the Bayesian method, which incorporates seismic information from historical and instrumental catalogs. A numerical code, solving the shallow water equations, is employed to simulate the tsunami propagation and compute near shore wave heights. The probability of exceeding a specific tsunami hazard level during a given time period is calculated using the Poisson distribution. The results are presented in terms of the probability of exceedance of a given tsunami amplitude for 100- and 500-year return periods. The hazard level varies along the NE Atlantic coast, being maximum along the northern segment of the Morocco Atlantic coast, the southern Portuguese coast, and the Spanish coast of the Gulf of Cadiz. We find that the probability that a maximum wave height exceeds 1 m somewhere in the NE Atlantic region reaches 60 and 100 % for 100- and 500-year return periods, respectively. These probability values decrease, respectively, to about 15 and 50 % when considering the exceedance threshold of 5 m for the same return periods of 100 and 500 years.
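
    The final step quoted above is plain Poisson arithmetic: with an annual rate lambda of tsunamis exceeding a height threshold, the probability of at least one exceedance in T years is 1 - exp(-lambda*T). A rate chosen here purely for illustration reproduces the order of magnitude of the quoted 1 m figures.

    ```python
    # Probability of exceeding a wave-height threshold in T years.
    import math

    lam = 0.009   # assumed annual exceedance rate for H > 1 m (illustrative)
    for t in (100, 500):
        print(t, round(1.0 - math.exp(-lam * t), 2))   # ~0.59 and ~0.99
    ```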

  17. Evaluating the double Poisson generalized linear model.

    Science.gov (United States)

    Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique

    2013-10-01

    The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant) which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similar to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that it is simpler to interpret coefficients, it offers a flexible and efficient alternative for researchers to model count data.
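
    Efron's double Poisson kernel itself is simple to evaluate; what the paper contributes is an accurate approximation of the normalizing constant. The sketch below sidesteps that contribution entirely and just normalizes by brute-force summation, which is adequate for illustration at moderate means.

    ```python
    # Double Poisson pmf, normalized numerically (not the paper's method).
    import math

    def dp_unnorm(y, mu, theta):
        log_k = 0.5 * math.log(theta) - theta * mu
        if y > 0:
            log_k += -y + y * math.log(y) - math.lgamma(y + 1)
            log_k += theta * y * (1.0 + math.log(mu) - math.log(y))
        return math.exp(log_k)

    mu, theta = 4.0, 0.6                       # theta < 1 gives over-dispersion
    w = [dp_unnorm(y, mu, theta) for y in range(400)]
    z = sum(w)                                 # numerical normalizing constant
    mean = sum(y * p / z for y, p in enumerate(w))
    print(round(z, 4), round(mean, 3))         # mean stays close to mu
    ```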

  18. Learning Probabilistic Decision Graphs

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Dalgaard, Jens; Silander, Tomi

    2004-01-01

    Probabilistic decision graphs (PDGs) are a representation language for probability distributions based on binary decision diagrams. PDGs can encode (context-specific) independence relations that cannot be captured in a Bayesian network structure, and can sometimes provide computationally more...

  19. Probabilistic transmission system planning

    CERN Document Server

    Li, Wenyuan

    2011-01-01

    "The book is composed of 12 chapters and three appendices, and can be divided into four parts. The first part includes Chapters 2 to 7, which discuss the concepts, models, methods and data in probabilistic transmission planning. The second part, Chapters 8 to 11, addresses four essential issues in probabilistic transmission planning applications using actual utility systems as examples. Chapter 12, as the third part, focuses on a special issue, i.e. how to deal with uncertainty of data in probabilistic transmission planning. The fourth part consists of three appendices, which provide the basic knowledge in mathematics for probabilistic planning. Please refer to the attached table of contents which is given in a very detailed manner"--

  20. THE PANCHROMATIC HUBBLE ANDROMEDA TREASURY. IV. A PROBABILISTIC APPROACH TO INFERRING THE HIGH-MASS STELLAR INITIAL MASS FUNCTION AND OTHER POWER-LAW FUNCTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.; Clifton Johnson, L.; Beerman, Lori C.; Williams, Benjamin F. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Hogg, David W.; Foreman-Mackey, Daniel T. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Rix, Hans-Walter; Gouliermis, Dimitrios [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Dolphin, Andrew E. [Raytheon Company, 1151 East Hermans Road, Tucson, AZ 85756 (United States); Lang, Dustin [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Bell, Eric F. [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48109 (United States); Gordon, Karl D.; Kalirai, Jason S. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Skillman, Evan D., E-mail: dweisz@astro.washington.edu [Minnesota Institute for Astrophysics, University of Minnesota, 116 Church Street SE, Minneapolis, MN 55455 (United States)

    2013-01-10

    We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M ≳ 1 M⊙). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ~3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield ⟨α⟩ = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the

  1. THE PANCHROMATIC HUBBLE ANDROMEDA TREASURY. IV. A PROBABILISTIC APPROACH TO INFERRING THE HIGH-MASS STELLAR INITIAL MASS FUNCTION AND OTHER POWER-LAW FUNCTIONS

    International Nuclear Information System (INIS)

    Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.; Clifton Johnson, L.; Beerman, Lori C.; Williams, Benjamin F.; Hogg, David W.; Foreman-Mackey, Daniel T.; Rix, Hans-Walter; Gouliermis, Dimitrios; Dolphin, Andrew E.; Lang, Dustin; Bell, Eric F.; Gordon, Karl D.; Kalirai, Jason S.; Skillman, Evan D.

    2013-01-01

    We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M ≳ 1 M☉). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ~3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield ⟨α⟩ = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the completeness for stars of a given mass. The precision on MF

  2. Probabilistic object and viewpoint models for active object recognition

    CSIR Research Space (South Africa)

    Govender, N

    2013-09-01

    Full Text Available across views to be integrated in a principled manner, and permitting a principled approach to data acquisition. Existing approaches however mostly rely on probabilistic models which make simplifying assumptions such as that features may be treated...

  3. Improving software size estimates by using probabilistic pairwise comparison matrices

    Science.gov (United States)

    Hihn, Jairus; Lum, Karen T.

    2004-01-01

    The Pairwise Comparison technique is a general purpose estimation approach for capturing expert judgment. This approach can be generalized to a probabilistic version using Monte Carlo methods to produce estimates of size distributions.

  4. Equilibrium stochastic dynamics of Poisson cluster ensembles

    Directory of Open Access Journals (Sweden)

    L.Bogachev

    2008-06-01

    Full Text Available The distribution μ of a Poisson cluster process in X = R^d (with n-point clusters) is studied via the projection of an auxiliary Poisson measure in the space of configurations in X^n, with the intensity measure being the convolution of the background intensity (of cluster centres) with the probability distribution of a generic cluster. We show that μ is quasi-invariant with respect to the group of compactly supported diffeomorphisms of X, and prove an integration by parts formula for μ. The corresponding equilibrium stochastic dynamics is then constructed using the method of Dirichlet forms.

  5. White Noise of Poisson Random Measures

    OpenAIRE

    Proske, Frank; Øksendal, Bernt

    2002-01-01

    We develop a white noise theory for Poisson random measures associated with a Lévy process. The starting point of this theory is a chaos expansion with kernels of polynomial type. We use this to construct the white noise of a Poisson random measure, which takes values in a certain distribution space. Then we show, how a Skorohod/Itô integral for point processes can be represented by a Bochner integral in terms of white noise of the random measure and a Wick product. Further, we apply these co...

  6. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle count data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes.
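
    A single-breakpoint toy version of the idea fits in a few lines: score every candidate break by the marginal likelihood of two conjugate Gamma-Poisson segments and normalize. This is a generic illustration, not the Mathematica code of the paper.

    ```python
    # Posterior over a single breakpoint in a Poisson count series.
    import numpy as np
    from scipy.special import gammaln

    rng = np.random.default_rng(4)
    counts = np.concatenate([rng.poisson(3.0, 60), rng.poisson(7.0, 40)])

    def log_marginal(seg, a=1.0, b=1.0):
        # Poisson likelihood with a Gamma(a, b) prior on the rate, integrated out
        n, s = len(seg), seg.sum()
        return (a * np.log(b) - gammaln(a) + gammaln(a + s)
                - (a + s) * np.log(b + n) - gammaln(seg + 1).sum())

    ts = range(5, len(counts) - 5)
    scores = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:]) for t in ts])
    post = np.exp(scores - scores.max())
    post /= post.sum()
    print("posterior mode at t =", list(ts)[int(np.argmax(post))])  # near 60
    ```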

  7. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    Science.gov (United States)

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying to some real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine
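
    The equivalence noted above (a Poisson model with stratum indicators reproduces the conditional logistic estimates) is easy to demonstrate on synthetic data. The sketch below fits the indicator version, since a dedicated conditional Poisson routine is not assumed to be available in Python; all data are invented.

    ```python
    # Time-stratified Poisson regression with stratum indicators (toy data).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    days = pd.date_range("2015-01-01", periods=730, freq="D")
    exposure = rng.normal(20, 5, size=len(days))                  # e.g. pollutant level
    stratum = [f"{d.year}-{d.month}-{d.dayofweek}" for d in days] # year:month:dow strata
    counts = rng.poisson(np.exp(1.5 + 0.01 * exposure))           # true log-RR = 0.01

    df = pd.DataFrame({"y": counts, "x": exposure, "stratum": stratum})
    fit = smf.glm("y ~ x + C(stratum)", data=df, family=sm.families.Poisson()).fit()
    print(fit.params["x"])   # exposure coefficient, close to 0.01
    ```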

  8. Region-specific deterministic and probabilistic seismic hazard ...

    Indian Academy of Sciences (India)

    Region-specific deterministic and probabilistic seismic hazard analysis of Kanpur city ... A seismic hazard map of Kanpur city has been developed considering the region-specific seismotectonic parameters within a 500-km radius by deterministic and probabilistic approaches. ...

  9. Learning probabilistic document template models via interaction

    Science.gov (United States)

    Ahmadullin, Ildus; Damera-Venkata, Niranjan

    2013-03-01

    Document aesthetics measures are key to automated document composition. Recently we presented a probabilistic document model (PDM) which is a micro-model for document aesthetics based on a probabilistic modeling of designer choice in document design. The PDM model comes with efficient layout synthesis algorithms once the aesthetic model is defined. A key element of this approach is an aesthetic prior on the parameters of a template encoding aesthetic preferences for template parameters. Parameters of the prior were required to be chosen empirically by designers. In this work we show how probabilistic template models (and hence the PDM cost function) can be learnt directly by observing a designer making design choices in composing sample documents. From such training data our learning approach can learn a quality measure that can mimic some of the design tradeoffs a designer makes in practice.

  10. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    Science.gov (United States)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases: there are many feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for judging the reliability of simulations of genetic regulatory networks. In this work the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated by using PBN and DE models are compared.

  11. Spatial Nonhomogeneous Poisson Process in Corrosion Management

    NARCIS (Netherlands)

    López De La Cruz, J.; Kuniewski, S.P.; Van Noortwijk, J.M.; Guriérrez, M.A.

    2008-01-01

    A method to test the assumption of nonhomogeneous Poisson point processes is implemented to analyze corrosion pit patterns. The method is calibrated with three artificially generated patterns and manages to accurately assess whether a pattern distribution is random, regular, or clustered. The
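
    Point patterns of the kind analysed above can be generated for calibration by thinning; a minimal sketch (assuming an arbitrary illustrative intensity function, not the one used in the record):

      # Lewis-Shedler thinning: simulate a nonhomogeneous Poisson point
      # pattern on the unit square with intensity lam(x, y) <= lam_max.
      import numpy as np

      rng = np.random.default_rng(1)
      lam = lambda x, y: 200.0 * np.exp(-3.0 * x)   # e.g. pits denser near x = 0
      lam_max = 200.0                               # intensity bound used for thinning

      n_cand = rng.poisson(lam_max)                 # homogeneous candidate count
      xy = rng.uniform(size=(n_cand, 2))            # candidate locations
      keep = rng.uniform(size=n_cand) < lam(xy[:, 0], xy[:, 1]) / lam_max
      pits = xy[keep]                               # retained points follow lam()
      print(len(pits), "simulated pit locations")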

  12. Efficient information transfer by Poisson neurons

    Czech Academy of Sciences Publication Activity Database

    Košťál, Lubomír; Shinomoto, S.

    2016-01-01

    Roč. 13, č. 3 (2016), s. 509-520 ISSN 1547-1063 R&D Projects: GA ČR(CZ) GA15-08066S Institutional support: RVO:67985823 Keywords : information capacity * Poisson neuron * metabolic cost * decoding error Subject RIV: BD - Theory of Information Impact factor: 1.035, year: 2016

  13. Natural Poisson structures of nonlinear plasma dynamics

    International Nuclear Information System (INIS)

    Kaufman, A.N.

    1982-06-01

    Hamiltonian field theories, for models of nonlinear plasma dynamics, require a Poisson bracket structure for functionals of the field variables. These are presented, applied, and derived for several sets of field variables: coherent waves, incoherent waves, particle distributions, and multifluid electrodynamics. Parametric coupling of waves and plasma yields concise expressions for ponderomotive effects (in kinetic and fluid models) and for induced scattering.

  15. Poisson brackets for fluids and plasmas

    International Nuclear Information System (INIS)

    Morrison, P.J.

    1982-01-01

    Noncanonical yet Hamiltonian descriptions are presented of many of the non-dissipative field equations that govern fluids and plasmas. The dynamical variables are the usually encountered physical variables. These descriptions have the advantage that gauge conditions are absent, but at the expense of introducing peculiar Poisson brackets. Clebsch-like potential descriptions that reverse this situation are also introduced.

  16. Almost Poisson integration of rigid body systems

    International Nuclear Information System (INIS)

    Austin, M.A.; Krishnaprasad, P.S.; Li-Sheng Wang

    1993-01-01

    In this paper we discuss the numerical integration of Lie-Poisson systems using the mid-point rule. Since such systems result from the reduction of Hamiltonian systems with symmetry by Lie group actions, we also present examples of reconstruction rules for the full dynamics. A primary motivation is to preserve, in the integration process, various conserved quantities of the original dynamics. A main result of this paper is an O(h³) error estimate for the Lie-Poisson structure, where h is the integration step-size. We note that Lie-Poisson systems appear naturally in many areas of physical science and engineering, including theoretical mechanics of fluids and plasmas, satellite dynamics, and polarization dynamics. In the present paper we consider a series of progressively complicated examples related to rigid body systems. We also consider a dissipative example associated with a Lie-Poisson system. The behavior of the mid-point rule and an associated reconstruction rule is numerically explored.
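
    A minimal sketch of the mid-point rule on the simplest Lie-Poisson example, the free rigid body (inertia values illustrative; the reconstruction rules and error analysis of the paper are not reproduced):

      # Implicit mid-point integration of the Euler equations dm/dt = m x (I^-1 m),
      # monitoring the Casimir |m|^2 of the Lie-Poisson structure.
      import numpy as np

      I = np.array([1.0, 2.0, 3.0])                 # principal moments of inertia
      f = lambda m: np.cross(m, m / I)

      def midpoint_step(m, h, iters=50):
          m_new = m.copy()
          for _ in range(iters):                    # fixed-point solve of the implicit rule
              m_new = m + h * f(0.5 * (m + m_new))
          return m_new

      m, h = np.array([1.0, 0.2, 0.1]), 0.01
      c0 = m @ m
      for _ in range(1000):
          m = midpoint_step(m, h)
      # the mid-point rule preserves quadratic invariants, so the drift in the
      # Casimir should sit at the level of the fixed-point solver tolerance
      print("Casimir drift:", m @ m - c0)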

  17. Dimensional reduction for generalized Poisson brackets

    Science.gov (United States)

    Acatrinei, Ciprian Sorin

    2008-02-01

    We discuss dimensional reduction for Hamiltonian systems which possess nonconstant Poisson brackets between pairs of coordinates and between pairs of momenta. The associated Jacobi identities imply that the dimensionally reduced brackets are always constant. Some examples are given alongside the general theory.

  18. Affine Poisson Groups and WZW Model

    Directory of Open Access Journals (Sweden)

    Ctirad Klimcík

    2008-01-01

    We give a detailed description of a dynamical system which enjoys a Poisson-Lie symmetry with two non-isomorphic dual groups. The system is obtained by taking the q → ∞ limit of the q-deformed WZW model, and the understanding of its symmetry structure results in uncovering an interesting duality of its exchange relations.

  19. Collision prediction models using multivariate Poisson-lognormal regression.

    Science.gov (United States)

    El-Basyouny, Karim; Sayed, Tarek

    2009-07-01

    This paper advocates the use of multivariate Poisson-lognormal (MVPLN) regression to develop models for collision count data. The MVPLN approach presents an opportunity to incorporate the correlations across collision severity levels and their influence on safety analyses. The paper introduces a new multivariate hazardous location identification technique, which generalizes the univariate posterior probability of excess that has been commonly proposed and applied in the literature. In addition, the paper presents an alternative approach for quantifying the effect of the multivariate structure on the precision of expected collision frequency. The MVPLN approach is compared with the independent (separate) univariate Poisson-lognormal (PLN) models with respect to model inference, goodness-of-fit, identification of hot spots and precision of expected collision frequency. The MVPLN is modeled using the WinBUGS platform, which facilitates computation of posterior distributions and provides a goodness-of-fit measure for model comparisons. The results indicate that the estimates of the extra Poisson variation parameters were considerably smaller under MVPLN, leading to higher precision. The improvement in precision is due mainly to the fact that MVPLN accounts for the correlation between the latent variables representing property damage only (PDO) and injuries plus fatalities (I+F). This correlation was estimated at 0.758, which is highly significant, suggesting that higher PDO rates are associated with higher I+F rates, as the collision likelihood for both types is likely to rise due to similar deficiencies in roadway design and/or other unobserved factors. In terms of goodness-of-fit, the MVPLN model provided a superior fit compared with the independent univariate models. The multivariate hazardous location identification results demonstrated that some hazardous locations could be overlooked if the analysis were restricted to the univariate models.

  20. Ruin probabilities for a regenerative Poisson gap generated risk process

    DEFF Research Database (Denmark)

    Asmussen, Søren; Biard, Romain

    A risk process with constant premium rate c and Poisson arrivals of claims is considered. A threshold r is defined for claim interarrival times, such that if k consecutive interarrival times are larger than r, then the next claim has distribution G; otherwise, the claim size distribution is F. Asymptotic expressions for the infinite horizon ruin probabilities are given both for the light- and the heavy-tailed case. A basic observation is that the process regenerates at each G-claim. An approach via Markov additive processes is also outlined, and heuristics are given for the distribution of the time to ruin.

  1. Improving EWMA Plans for Detecting Unusual Increases in Poisson Counts

    Directory of Open Access Journals (Sweden)

    R. S. Sparks

    2009-01-01

    An adaptive exponentially weighted moving average (EWMA) plan is developed for signalling unusually high incidence when monitoring a time series of nonhomogeneous daily disease counts. A Poisson transitional regression model is used to fit the background/expected trend in counts and provides “one-day-ahead” forecasts of the next day's count. Departures of counts from their forecasts are monitored. The paper outlines an approach for improving early outbreak data signals by dynamically adjusting the exponential weights to be efficient at signalling local persistent high side changes. We emphasise outbreak signals in steady-state situations; that is, changes that occur after the EWMA statistic has run through several in-control counts.
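
    The core recursion is easy to state; a minimal sketch with fixed (non-adaptive) weights and illustrative parameters, not the adaptive plan of the paper:

      # EWMA monitoring of daily counts against model-based forecasts mu[t].
      import numpy as np

      rng = np.random.default_rng(2)
      mu = 10 + 3 * np.sin(np.arange(200) / 7.0)    # "one-day-ahead" expected counts
      y = rng.poisson(mu)
      y[150:160] += 8                               # injected outbreak

      lam = 0.2                                     # EWMA smoothing weight
      z, signal_day = 0.0, None
      for t in range(len(y)):
          resid = (y[t] - mu[t]) / np.sqrt(mu[t])   # standardised forecast error
          z = lam * resid + (1 - lam) * z
          if signal_day is None and z > 3 * np.sqrt(lam / (2 - lam)):
              signal_day = t                        # steady-state 3-sigma control limit
      print("first signal on day", signal_day)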

  2. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods.
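
    A minimal sketch of the matched-weight and Bayes-theorem steps described above (the m- and u-probabilities and prior odds are illustrative):

      # Fellegi-Sunter style match weights for a candidate record pair.
      import math

      fields = {"surname": (0.95, 0.01), "birth_year": (0.98, 0.05)}  # (m, u) per field

      def match_weight(agreements):
          w = 0.0
          for field, agree in agreements.items():
              m, u = fields[field]
              w += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
          return w

      w = match_weight({"surname": True, "birth_year": True})
      prior_odds = 1 / 1000                      # chance a random candidate pair matches
      posterior_odds = prior_odds * 2 ** w       # Bayes theorem on the odds scale
      print(w, posterior_odds / (1 + posterior_odds))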

  3. Tetrahedral meshing via maximal Poisson-disk sampling

    KAUST Repository

    Guo, Jianwei

    2016-02-15

    In this paper, we propose a simple yet effective method to generate 3D-conforming tetrahedral meshes from closed 2-manifold surfaces. Our approach is inspired by recent work on maximal Poisson-disk sampling (MPS), which can generate well-distributed point sets in arbitrary domains. We first perform MPS on the boundary of the input domain, we then sample the interior of the domain, and we finally extract the tetrahedral mesh from the samples by using 3D Delaunay or regular triangulation for uniform or adaptive sampling, respectively. We also propose an efficient optimization strategy to protect the domain boundaries and to remove slivers to improve the meshing quality. We present various experimental results to illustrate the efficiency and the robustness of our proposed approach. We demonstrate that the performance and quality (e.g., minimal dihedral angle) of our approach are superior to current state-of-the-art optimization-based approaches.
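
    For orientation, plain dart throwing conveys the sampling idea, although true maximal Poisson-disk sampling requires the gap-tracking machinery the paper builds on; a sketch with an illustrative radius:

      # Naive 2D Poisson-disk sampling by rejection (O(n^2), not maximal).
      import numpy as np

      rng = np.random.default_rng(3)
      r, samples = 0.05, []
      for _ in range(5000):                         # candidate darts
          p = rng.uniform(size=2)
          if all(np.hypot(*(p - q)) >= r for q in samples):
              samples.append(p)
      print(len(samples), "samples accepted at minimum spacing", r)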

  4. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments

    Energy Technology Data Exchange (ETDEWEB)

    Fisicaro, G., E-mail: giuseppe.fisicaro@unibas.ch; Goedecker, S. [Department of Physics, University of Basel, Klingelbergstrasse 82, 4056 Basel (Switzerland); Genovese, L. [University of Grenoble Alpes, CEA, INAC-SP2M, L-Sim, F-38000 Grenoble (France); Andreussi, O. [Institute of Computational Science, Università della Svizzera Italiana, Via Giuseppe Buffi 13, CH-6904 Lugano (Switzerland); Theory and Simulations of Materials (THEOS) and National Centre for Computational Design and Discovery of Novel Materials (MARVEL), École Polytechnique Fédérale de Lausanne, Station 12, CH-1015 Lausanne (Switzerland); Marzari, N. [Theory and Simulations of Materials (THEOS) and National Centre for Computational Design and Discovery of Novel Materials (MARVEL), École Polytechnique Fédérale de Lausanne, Station 12, CH-1015 Lausanne (Switzerland)

    2016-01-07

    The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann, allowing to solve iteratively the minimization problem with some ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.
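
    The linear regime mentioned above reduces, in its simplest 1D form, to Debye-Hueckel screening; a minimal finite-difference sketch (illustrative parameters, far from the full 3D generalized solver of the record):

      # Solve phi'' = kappa^2 * phi with phi(0) = 1 (charged wall) and phi(L) = 0.
      import numpy as np

      L, n, kappa = 5.0, 200, 2.0                   # domain, interior points, 1/Debye length
      h = L / (n + 1)
      x = np.linspace(h, L - h, n)

      A = (np.diag((-2.0 - (kappa * h) ** 2) * np.ones(n))
           + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
      b = np.zeros(n); b[0] = -1.0                  # Dirichlet value moved to the RHS
      phi = np.linalg.solve(A, b)
      print(np.allclose(phi, np.exp(-kappa * x), atol=1e-2))  # analytic screening profile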

  6. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  7. Probabilistic Transcriptome Assembly and Variant Graph Genotyping

    DEFF Research Database (Denmark)

    Sibbesen, Jonas Andreas

    that this approach outperforms existing state-of-the-art methods measured using sensitivity and precision on both simulated and real data. The second is a novel probabilistic method that uses exact alignment of k-mers to a set of variant graphs to provide unbiased estimates of genotypes in a population...

  8. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  9. Estimating software development project size, using probabilistic ...

    African Journals Online (AJOL)

    A probabilistic approach was used to estimate the software project size, using the data collected when we developed a tool to estimate the time to complete a new software project, and from The University of Calabar Computer Centre. The expected size of the tool was estimated to be 1.463 KSLOC, but when the tool was ...

  10. Further Note on the Probabilistic Constraint Handling

    NARCIS (Netherlands)

    Ciftcioglu, O.; Bittermann, M.S.; Datta, R

    2016-01-01

    A robust probabilistic constraint handling approach in the framework of joint evolutionary-classical optimization has been presented earlier. In this work, the theoretical foundations of the method are presented in detail. The method is known as the bi-objective method, where the conventional

  11. Probabilistic System Summaries for Behavior Architecting

    NARCIS (Netherlands)

    Borth, M.

    2015-01-01

    Smart systems of systems adapt to their context, current situation, and configuration. To engineer such systems’ behavior, we need to design and evaluate system-level control strategies and the intelligent management of key scenarios. We propose a model-based approach called probabilistic system summaries

  12. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...

  13. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.

  14. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling

  15. Quantitative interpretation of myocardial Tl-201 single-photon emission computerized tomograms: A probabilistic approach to the assessment of coronary artery disease

    International Nuclear Information System (INIS)

    Maddahi, J.; Prigent, F.; Staniloff, H.; Garcia, E.; Becerra, A.; Van Train, K.; Swan, H.J.C.; Waxman, A.; Berman, D.

    1985-01-01

    Probabilistic criteria for abnormality would enhance application of stress-redistribution Tl-201 rotational tomography (tomo) for evaluation of coronary artery disease (CAD). Thus, 91 pts were studied, of whom 45 had angiographic CAD (≥ 50% coronary narrowing) and 46 were normal (nl). In this paper a logistic model is developed and validated that assigns a CAD likelihood to the quantified size of tomographic myocardial perfusion defects. The validity of this model was prospectively tested in the remaining 51 pts (26 nls and 25 with CAD) by comparing the predicted and observed likelihood of CAD in four subgroups (I-IV).

  16. Probabilistic Solar Forecasting Using Quantile Regression Models

    Directory of Open Access Journals (Sweden)

    Philippe Lauret

    2017-10-01

    In this work, we assess the performance of three probabilistic models for intra-day solar forecasting. More precisely, a linear quantile regression method is used to build three models for generating 1 h–6 h-ahead probabilistic forecasts. Our approach is applied to forecasting solar irradiance at a site experiencing highly variable sky conditions, using historical ground observations of solar irradiance as endogenous inputs and day-ahead forecasts as exogenous inputs. Day-ahead irradiance forecasts are obtained from the Integrated Forecast System (IFS), a Numerical Weather Prediction (NWP) model maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF). Several metrics, mainly originating from the weather forecasting community, are used to evaluate the performance of the probabilistic forecasts. The results demonstrate that the NWP exogenous inputs improve the quality of the intra-day probabilistic forecasts. The analysis considered two locations with very dissimilar solar variability. Comparison between the two locations highlighted that the statistical performance of the probabilistic models depends on the local sky conditions.
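
    A minimal sketch of the quantile-regression step (synthetic stand-ins for the irradiance and NWP inputs; not the paper's data or feature set):

      # 1 h-ahead probabilistic forecast via linear quantile regression.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n = 500
      df = pd.DataFrame({"ghi_lag": rng.uniform(0, 1000, n)})
      df["nwp"] = 0.9 * df["ghi_lag"] + rng.normal(0, 50, n)   # exogenous NWP input
      df["ghi_next"] = 0.6 * df["ghi_lag"] + 0.3 * df["nwp"] + rng.normal(0, 80, n)

      quantiles = {}
      for q in (0.1, 0.5, 0.9):
          fit = smf.quantreg("ghi_next ~ ghi_lag + nwp", df).fit(q=q)
          quantiles[q] = float(fit.predict(df.iloc[:1])[0])
      print(quantiles)   # a 10/50/90 predictive interval for the first sample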

  17. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof

  18. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy devices is a new and expanding technical area with no tradition for probabilistic design; in fact, very few full scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy

  19. Linear odd Poisson bracket on Grassmann variables

    International Nuclear Information System (INIS)

    Soroka, V.A.

    1999-01-01

    A linear odd Poisson bracket (antibracket) realized solely in terms of Grassmann variables is suggested. It is revealed that the bracket, which corresponds to a semi-simple Lie group, has at once three Grassmann-odd nilpotent Δ-like differential operators of the first, the second and the third orders with respect to Grassmann derivatives, in contrast with the canonical odd Poisson bracket having the only Grassmann-odd nilpotent differential Δ-operator of the second order. It is shown that these Δ-like operators together with a Grassmann-odd nilpotent Casimir function of this bracket form a finite-dimensional Lie superalgebra.

  20. Degenerate odd Poisson bracket on Grassmann variables

    International Nuclear Information System (INIS)

    Soroka, V.A.

    2000-01-01

    A linear degenerate odd Poisson bracket (antibracket) realized solely on Grassmann variables is proposed. It is revealed that this bracket has at once three Grassmann-odd nilpotent Δ-like differential operators of the first, second and third orders with respect to the Grassmann derivatives. It is shown that these Δ-like operators, together with the Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra

  1. Poisson/Superfish codes for personal computers

    International Nuclear Information System (INIS)

    Humphries, S.

    1992-01-01

    The Poisson/Superfish codes calculate static E or B fields in two dimensions and electromagnetic fields in resonant structures. New versions for 386/486 PCs and Macintosh computers have capabilities that exceed the mainframe versions. Notable improvements are interactive graphical post-processors, improved field calculation routines, and a new program for charged particle orbit tracking.

  2. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves: wave height, wave period and direction) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate if it is really worthwhile. Please note: The Netherlands

  3. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  4. Elementary derivation of Poisson structures for fluid dynamics and electrodynamics

    International Nuclear Information System (INIS)

    Kaufman, A.N.

    1982-01-01

    The canonical Poisson structure of the microscopic Lagrangian is used to deduce the noncanonical Poisson structure for the macroscopic Hamiltonian dynamics of a compressible neutral fluid and of fluid electrodynamics

  5. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publ...

  6. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  7. Reduction of Nambu-Poisson Manifolds by Regular Distributions

    Science.gov (United States)

    Das, Apurba

    2018-03-01

    The version of the Marsden-Ratiu reduction theorem for Nambu-Poisson manifolds by a regular distribution has been studied by Ibáñez et al. In this paper we show that the reduction is always ensured unless the distribution is zero. Next we extend the more general Falceto-Zambon Poisson reduction theorem to Nambu-Poisson manifolds. Finally, we define gauge transformations of Nambu-Poisson structures and show that these transformations commute with the reduction procedure.

  8. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from incomplete double stranded sequences, sticking operations are applied iteratively until a complete double stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  9. Probabilistic safety analyses (PSA)

    International Nuclear Information System (INIS)

    1997-01-01

    The guide shows how probabilistic safety analyses (PSA) are used in the design, construction and operation of light water reactor plants to help ensure that the safety of the plant is adequate in all plant operational states.

  10. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where...

  11. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  12. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The

  13. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  14. Probabilistic dynamic belief revision

    NARCIS (Netherlands)

    Baltag, A.; Smets, S.

    2008-01-01

    We investigate the discrete (finite) case of the Popper-Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe

  15. Probabilistic Model-Based Diagnosis for Electrical Power Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — We present in this article a case study of the probabilistic approach to model-based diagnosis. Here, the diagnosed system is a real-world electrical power system,...

  16. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, implementing upgrades on a system-by-system basis is not necessarily cost-effective.

  17. Probabilistic risk assessment for six vapour intrusion algorithms

    NARCIS (Netherlands)

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition a deterministic approach with default parameter sets is evaluated against

  18. Probabilistic data modelling with adaptive TAP mean-field theory

    DEFF Research Database (Denmark)

    Opper, M.; Winther, Ole

    2001-01-01

    We demonstrate for the case of single-layer neural networks how an extension of the TAP mean-field approach of disorder physics can be applied to the computation of approximate averages in probabilistic models for real data.

  19. Approche probabiliste des milieux poreux hétérogènes ou fracturés en relation avec les écoulements diphasiques / Probabilistic Approach to Heterogeneous or Fractured Porous Media in Relation to Two-Phase Flows

    Directory of Open Access Journals (Sweden)

    Jacquin C.

    2006-11-01

    The structural features of fractured or heterogeneous oil fields must be taken into consideration to improve production forecasting. The description of such fields is based on a probabilistic approach leading to an estimate of the characteristics of the reservoir rock, i.e. distribution of the block sizes of a fissured reservoir, scales of heterogeneity. These characteristics are fed into deterministic models that describe fluid flows. Special attention is paid to problems raised by the transposition of laboratory results obtained on small samples to a field. Such problems include the change in geometric scale, the estimating of ultimate recovery and how production will evolve in time.

  20. Stationary response of multi-degree-of-freedom vibro-impact systems to Poisson white noises

    International Nuclear Information System (INIS)

    Wu, Y.; Zhu, W.Q.

    2008-01-01

    The stationary response of multi-degree-of-freedom (MDOF) vibro-impact (VI) systems to random pulse trains is studied. The system is formulated as a stochastically excited and dissipated Hamiltonian system. The constraints are modeled as non-linear springs according to the Hertz contact law. The random pulse trains are modeled as Poisson white noises. The approximate stationary probability density function (PDF) for the response of MDOF dissipated Hamiltonian systems to Poisson white noises is obtained by solving the fourth-order generalized Fokker-Planck-Kolmogorov (FPK) equation using a perturbation approach. As examples, two-degree-of-freedom (2DOF) VI systems under external and parametric Poisson white noise excitations, respectively, are investigated. The validity of the proposed approach is confirmed by the results obtained from Monte Carlo simulation. It is shown that the non-Gaussian behaviour depends on the product of the mean arrival rate of the impulses and the relaxation time of the oscillator.
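
    A crude Monte Carlo sketch of such a system (explicit Euler, a single degree of freedom, one-sided Hertzian contact, illustrative parameters), of the kind used above to validate the analytical PDFs:

      # SDOF oscillator with Hertz contact spring, driven by a Poisson pulse train.
      import numpy as np

      rng = np.random.default_rng(5)
      dt, n = 1e-3, 200_000
      lam, k_c = 5.0, 1e4                  # pulse arrival rate, contact stiffness
      x, v, xs = 0.0, 0.0, []
      for _ in range(n):
          if rng.uniform() < lam * dt:     # Poisson arrival in this time step
              v += rng.normal(0.0, 0.3)    # random impulse magnitude
          f_c = -k_c * max(x - 0.5, 0.0) ** 1.5   # Hertz law beyond a gap of 0.5
          x, v = x + v * dt, v + (-0.2 * v - x + f_c) * dt
          xs.append(x)
      pdf, edges = np.histogram(xs, bins=60, density=True)  # stationary PDF estimate
      print("displacement PDF estimated from", len(xs), "samples")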

  1. On the FACR(l) algorithm for the discrete Poisson equation

    Science.gov (United States)

    Temperton, Clive

    1980-03-01

    Direct methods for the solution of the discrete Poisson equation over a rectangle are commonly based either on Fourier transforms or on block-cyclic reduction. The relationship between these two approaches is demonstrated explicitly, and used to derive the FACR(l) algorithm, in which the Fourier transform approach is combined with l preliminary steps of cyclic reduction. It is shown that the optimum choice of l leads to an algorithm for which the operation count per mesh point is almost independent of the mesh size. Numerical results concerning timing and round-off error are presented for the N × N Dirichlet problem for various values of N and l. Extensions to more general problems, and to implementation on parallel or vector computers are briefly discussed.
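
    The pure Fourier end of this family (l = 0) is compact enough to sketch; a minimal example on an N × N Dirichlet grid (grid size and right-hand side illustrative):

      # Direct Poisson solve -Laplace(u) = f via the type-I discrete sine transform.
      import numpy as np
      from scipy.fft import dstn, idstn

      N, h = 127, 1.0 / 128
      x = np.linspace(h, 1 - h, N)
      X, Y = np.meshgrid(x, x, indexing="ij")
      f = np.sin(np.pi * X) * np.sin(2 * np.pi * Y)

      k = np.arange(1, N + 1)
      lam = (2.0 - 2.0 * np.cos(np.pi * k * h)) / h**2   # FD Laplacian eigenvalues
      u = idstn(dstn(f, type=1) / (lam[:, None] + lam[None, :]), type=1)

      exact = f / (np.pi**2 + (2 * np.pi) ** 2)          # continuous-problem solution
      print(np.max(np.abs(u - exact)))                   # small O(h^2) discretization error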

  2. Algebraic properties of compatible Poisson brackets

    Science.gov (United States)

    Zhang, Pumei

    2014-05-01

    We discuss algebraic properties of a pencil generated by two compatible Poisson tensors A(x) and B(x). From the algebraic viewpoint this amounts to studying the properties of a pair of skew-symmetric bilinear forms A and B defined on a finite-dimensional vector space. We describe the Lie group G_P of linear automorphisms of the pencil P = {A + λB}. In particular, we obtain an explicit formula for the dimension of G_P and discuss some other algebraic properties such as solvability and Levi-Malcev decomposition.

  3. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components.

  4. A modified Poisson-Boltzmann equation applied to protein adsorption.

    Science.gov (United States)

    Gama, Marlon de Souza; Santos, Mirella Simões; Lima, Eduardo Rocha de Almeida; Tavares, Frederico Wanderley; Barreto, Amaro Gomes Barreto

    2018-01-05

    Ion-exchange chromatography has been widely used as a standard process in the purification and analysis of proteins, based on the electrostatic interaction between the protein and the stationary phase. Through the years, several approaches have been used to improve the thermodynamic description of colloidal particle-surface interaction systems; however, there are still many gaps, specifically in describing the behavior of protein adsorption. Here, we present an improved methodology for predicting the adsorption equilibrium constant by solving the modified Poisson-Boltzmann (PB) equation in bispherical coordinates. By including dispersion interactions between ions and protein, and between ions and surface, the modified PB equation used can describe the Hofmeister effects. We solve the modified Poisson-Boltzmann equation to calculate the protein-surface potential of mean force, treated as a spherical colloid-plate system, as a function of process variables. From the potential of mean force, the Henry constants of adsorption, for different proteins and surfaces, are calculated as a function of pH, salt concentration, salt type, and temperature. The obtained Henry constants are compared with experimental data for several isotherms, showing excellent agreement. We have also performed a sensitivity analysis to verify the behavior of different kinds of salts and the Hofmeister effects.
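
    The last step described above, from a potential of mean force W(h) to a Henry constant, can be sketched numerically (the model W(h) below is an arbitrary illustrative well, not the paper's modified-PB result):

      # Henry adsorption constant K_H = integral over h of (exp(-W/kT) - 1).
      import numpy as np
      from scipy.integrate import quad

      kT = 1.0
      W = lambda h: 4.0 * (np.exp(-2.0 * h) - np.exp(-h))   # attractive well, depth ~1 kT
      K_H, _ = quad(lambda h: np.exp(-W(h) / kT) - 1.0, 0.0, 50.0)
      print(K_H)   # positive K_H indicates net adsorption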

  5. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment, and a tool to provide a reference solar particle radiation environment that 1) will not be exceeded at a user-specified confidence level, and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  6. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined.

  7. On probabilistic Mandelbrot maps

    Energy Technology Data Exchange (ETDEWEB)

    Andreadis, Ioannis [International School of The Hague, Wijndaelerduin 1, 2554 BX The Hague (Netherlands)], E-mail: i.andreadis@ish-rijnlandslyceum.nl; Karakasidis, Theodoros E. [Department of Civil Engineering, University of Thessaly, GR-38334 Volos (Greece)], E-mail: thkarak@uth.gr

    2009-11-15

    In this work, we propose a definition for a probabilistic Mandelbrot map in order to extend and support the study initiated by Argyris et al. [Argyris J, Andreadis I, Karakasidis Th. On perturbations of the Mandelbrot map. Chaos, Solitons and Fractals 2000;11:1131-1136.] with regard to the numerical stability of the Mandelbrot and Julia set of the Mandelbrot map when subjected to noise.

  8. Invariants in probabilistic reasoning.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-02-01

    Recent research has identified three invariants or identities that appear to hold in people's probabilistic reasoning: the QQ identity, the addition law identity, and the Bayes rule identity (Costello and Watts, 2014, 2016a, Fisher and Wolfe, 2014, Wang and Busemeyer, 2013, Wang et al., 2014). Each of these identities represents specific agreement with the requirements of normative probability theory; strikingly, these identities seem to hold in people's judgements despite the presence of strong and systematic biases against the requirements of normative probability theory in those very same judgements. These results suggest that the systematic biases seen in people's probabilistic reasoning follow mathematical rules: for these particular identities, these rules cause an overall cancellation of biases and so produce agreement with normative requirements. We assess two competing mathematical models of probabilistic reasoning (the 'probability theory plus noise' model and the 'quantum probability' model) in terms of their ability to account for this pattern of systematic biases and invariant identities.

  9. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
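
    A minimal sketch of the bagging-decision-tree family of loss models referred to above (synthetic stand-ins for the predictors and loss ratios; not BT-FLEMO itself):

      # Bagged regression trees give a predictive distribution of loss, hence
      # the quantitative uncertainty information emphasised in the abstract.
      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(7)
      n = 400
      X = np.column_stack([rng.uniform(0, 3, n),       # water depth (m)
                           rng.uniform(0, 72, n)])     # inundation duration (h)
      loss = 0.2 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.05, n)

      model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100).fit(X, loss)
      preds = np.stack([t.predict(X[:1]) for t in model.estimators_])
      print(preds.mean(), np.quantile(preds, [0.05, 0.95]))  # mean loss and 90% band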

  10. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events.
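
    A quick numeric check of this point (parameter choices illustrative):

      # The Poisson law approximates the binomial only when N is large and p small;
      # compare the two distributions at fixed mean N*p = 5.
      from scipy.stats import binom, poisson

      for N, p in [(20, 0.25), (1000, 0.005), (10**6, 5e-6)]:
          mu = N * p
          tv = 0.5 * sum(abs(binom.pmf(i, N, p) - poisson.pmf(i, mu))
                         for i in range(16))          # truncated total variation distance
          print(f"N={N:>7}, p={p}: TV distance ~ {tv:.2e}")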

  11. Probabilistic analysis of modernization options

    International Nuclear Information System (INIS)

    Wunderlich, W.O.; Giles, J.E.

    1991-01-01

    This paper reports on benefit-cost analysis for hydropower operations, a standard procedure for reaching planning decisions; cost overruns and benefit shortfalls are nevertheless common occurrences. One reason for the difficulty of predicting future benefits and costs is that they usually cannot be represented with sufficient reliability by single accurate values, because of the many uncertainties that enter the analysis through assumptions on inputs and system parameters. Therefore, ranges of variables need to be analyzed instead of single values. As a consequence, the decision criteria, such as net benefit and benefit-cost ratio, also vary over some range. A probabilistic approach is demonstrated as a tool for assessing the reliability of the results.

  12. Probabilistic Fatigue Damage Program (FATIG)

    Science.gov (United States)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner s rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
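
    The two damage computations described above can be sketched as follows (S-N parameters and stress level illustrative, not values from the FATIG code):

      # (a) Miner's rule over Rayleigh-distributed amplitudes up to 3*sigma;
      # (b) closed form D = n * (sqrt(2)*sigma)^m * Gamma(1 + m/2) / K.
      import numpy as np
      from scipy.special import gamma
      from scipy.integrate import quad

      sigma, n_cycles = 50.0, 1e7        # stress rms (MPa), total number of cycles
      m, K = 3.0, 1e12                   # S-N curve: N(S) = K * S**(-m)

      rayleigh = lambda s: (s / sigma**2) * np.exp(-s**2 / (2 * sigma**2))
      D_a = n_cycles * quad(lambda s: rayleigh(s) * s**m / K, 0.0, 3.0 * sigma)[0]
      D_b = n_cycles * (np.sqrt(2.0) * sigma) ** m * gamma(1.0 + m / 2.0) / K
      print(D_a, D_b)                    # (a) is slightly smaller due to the 3*sigma cutoff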

  13. Machine learning a probabilistic perspective

    CERN Document Server

    Murphy, Kevin P

    2012-01-01

    Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...

  14. Fast immersed interface Poisson solver for 3D unbounded problems around arbitrary geometries

    Science.gov (United States)

    Gillis, T.; Winckelmans, G.; Chatelain, P.

    2018-02-01

    We present a fast and efficient Fourier-based solver for the Poisson problem around an arbitrary geometry in an unbounded 3D domain. This solver merges two rewarding approaches, the lattice Green's function method and the immersed interface method, using the Sherman-Morrison-Woodbury decomposition formula. The method is intended to be second order up to the boundary. This is verified on two potential flow benchmarks. We also further analyse the iterative process and the convergence behavior of the proposed algorithm. The method is applicable to a wide range of problems involving a Poisson equation around inner bodies, which goes well beyond the present validation on potential flows.
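
    The paper's combination of lattice Green's functions and immersed interfaces is well beyond a few lines, but the underlying idea of a Fourier-based Poisson solver can be sketched in its simplest setting, a periodic box, where FFTs diagonalize the Laplacian. Grid size and right-hand side below are illustrative assumptions, not the paper's unbounded-domain method:

```python
# Spectral solve of -lap(u) = f on a periodic box via FFT.
import numpy as np

n, L = 64, 2.0 * np.pi
x = np.linspace(0.0, L, n, endpoint=False)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
f = np.sin(X) * np.cos(Y) * np.sin(Z)         # zero-mean right-hand side

k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)  # discrete wavenumbers
KX, KY, KZ = np.meshgrid(k, k, k, indexing="ij")
k2 = KX**2 + KY**2 + KZ**2
k2[0, 0, 0] = 1.0                             # dodge division by zero

u_hat = np.fft.fftn(f) / k2                   # -lap is k^2 in spectral space
u_hat[0, 0, 0] = 0.0                          # pin the arbitrary constant
u = np.real(np.fft.ifftn(u_hat))

# -lap(sin x cos y sin z) = 3 sin x cos y sin z, so u should equal f/3
print("max error:", np.abs(u - f / 3.0).max())
```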

  15. Minimum Hellinger distance estimation for k-component poisson mixture with random effects.

    Science.gov (United States)

    Xiang, Liming; Yau, Kelvin K W; Van Hui, Yer; Lee, Andy H

    2008-06-01

    The k-component Poisson regression mixture with random effects is an effective model for describing the heterogeneity of clustered count data arising from several latent subpopulations. However, the residual maximum likelihood (REML) estimates of regression coefficients and variance component parameters tend to be unstable and may result in misleading inferences in the presence of outliers or extreme contamination. In the literature, minimum Hellinger distance (MHD) estimation has been investigated as a route to robust estimation for finite Poisson mixtures. This article aims to develop a robust MHD estimation approach for k-component Poisson mixtures with normally distributed random effects. By applying the Gaussian quadrature technique to approximate the integrals involved in the marginal distribution, the marginal probability function of the k-component Poisson mixture with random effects can be approximated by the summation of a set of finite Poisson mixtures. A simulation study shows that the MHD estimates perform satisfactorily for data without outlying observation(s), and outperform the REML estimates when data are contaminated. Application to a data set of recurrent urinary tract infections (UTI) with random institution effects demonstrates the practical use of the robust MHD estimation method.
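
    The Gaussian-quadrature step described above can be sketched for a single Poisson observation with a normal random effect: the marginal probability becomes a finite Poisson mixture evaluated at the Gauss-Hermite nodes. The linear predictor and random-effect standard deviation below are assumed values for illustration:

```python
# Marginal P(y) = ∫ Pois(y; exp(eta + sigma_u * u)) N(u; 0, 1) du,
# approximated by Gauss-Hermite quadrature (nodes/weights for e^{-t^2}).
import numpy as np
from scipy.stats import poisson

def marginal_pmf(y, eta, sigma_u, n_nodes=20):
    t, w = np.polynomial.hermite.hermgauss(n_nodes)
    # change of variables u = sqrt(2) * t turns the normal density into e^{-t^2}
    rates = np.exp(eta + np.sqrt(2.0) * sigma_u * t)
    return np.sum(w * poisson.pmf(y, rates)) / np.sqrt(np.pi)

# e.g. eta = 0.5 and random-effect SD 0.8 (assumed)
probs = [marginal_pmf(y, eta=0.5, sigma_u=0.8) for y in range(30)]
print("sums to ~1:", sum(probs))
```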

  16. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. A probabilistic model for these basic properties is presented, and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties, and the duration of load effects.

  17. A LATENT CLASS POISSON REGRESSION-MODEL FOR HETEROGENEOUS COUNT DATA

    NARCIS (Netherlands)

    WEDEL, M; DESARBO, WS; BULT; RAMASWAMY

    1993-01-01

    In this paper an approach is developed that accommodates heterogeneity in Poisson regression models for count data. The model developed assumes that heterogeneity arises from a distribution of both the intercept and the coefficients of the explanatory variables. We assume that the mixing

  18. On two-echelon inventory systems with Poisson demand and lost sales

    NARCIS (Netherlands)

    Alvarez, Elisa; van der Heijden, Matthijs C.

    2014-01-01

    We consider a two-echelon, continuous review inventory system under Poisson demand and a one-for-one replenishment policy. Demand is lost if no items are available at the local warehouse, the central depot, or in the pipeline in between. We give a simple, fast and accurate approach to approximate

  19. Identification of water quality management policy of watershed system with multiple uncertain interactions using a multi-level-factorial risk-inference-based possibilistic-probabilistic programming approach.

    Science.gov (United States)

    Liu, Jing; Li, Yongping; Huang, Guohe; Fu, Haiyan; Zhang, Junlong; Cheng, Guanhui

    2017-06-01

    In this study, a multi-level-factorial risk-inference-based possibilistic-probabilistic programming (MRPP) method is proposed for supporting water quality management under multiple uncertainties. The MRPP method can handle uncertainties expressed as fuzzy-random-boundary intervals, probability distributions, and interval numbers, and analyze the effects of uncertainties as well as their interactions on modeling outputs. It is applied to plan water quality management in the Xiangxihe watershed. Results reveal that a lower probability of satisfying the objective function (θ) as well as a higher probability of violating environmental constraints (q_i) would correspond to a higher system benefit with an increased risk of violating system feasibility. Chemical plants are the major contributors to biological oxygen demand (BOD) and total phosphorus (TP) discharges; total nitrogen (TN) would be mainly discharged by crop farming. It is also discovered that optimistic decision makers should pay more attention to the interactions between chemical plants and water supply, while decision makers with a risk-averse attitude would focus on the interactive effect of q_i and the benefit of water supply. The findings can help enhance the model's applicability and identify a suitable water quality management policy for environmental sustainability according to the practical situation.

  20. Poisson regression for modeling count and frequency outcomes in trauma research.

    Science.gov (United States)

    Gagnon, David R; Doron-LaMarca, Susan; Bell, Margret; O'Farrell, Timothy J; Taft, Casey T

    2008-10-01

    The authors describe how the Poisson regression method for analyzing count or frequency outcome variables can be applied in trauma studies. The outcome of interest in trauma research may represent a count of the number of incidents of behavior occurring in a given time interval, such as acts of physical aggression or substance abuse. Traditional linear regression approaches assume a normally distributed outcome variable with equal variances over the range of predictor variables, and may not be optimal for modeling count outcomes. An application of Poisson regression is presented using data from a study of intimate partner aggression among male patients in an alcohol treatment program and their female partners. Results of Poisson regression and linear regression models are compared.
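
    A minimal, hedged version of such an analysis in Python, using statsmodels on simulated data; the variable names, effect sizes and sample size are invented, not the study's:

```python
# Poisson regression for a count outcome (e.g. number of aggressive acts).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
treatment = rng.integers(0, 2, n)     # e.g. alcohol-treatment indicator
severity = rng.normal(size=n)         # e.g. baseline severity score
rate = np.exp(0.2 - 0.6 * treatment + 0.4 * severity)
counts = rng.poisson(rate)            # count outcome per observation window

X = sm.add_constant(np.column_stack([treatment, severity]))
model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(model.summary())                # coefficients are on the log scale
```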

  1. Infinitesimal deformations of Poisson bi-vectors using the Kontsevich graph calculus

    Science.gov (United States)

    Buring, Ricardo; Kiselev, Arthemy V.; Rutten, Nina

    2018-02-01

    Let $\mathscr{P}$ be a Poisson structure on a finite-dimensional affine real manifold. Can $\mathscr{P}$ be deformed in such a way that it stays Poisson? The language of Kontsevich graphs provides a universal approach, with respect to all affine Poisson manifolds, to finding a class of solutions to this deformation problem. This reasoning requires several types of graphs; in this paper we outline the algorithms to generate them. The graphs that encode deformations are classified by the number of internal vertices k; for k ≤ 4 we present all solutions of the deformation problem. For k ≥ 5, first reproducing the pentagon-wheel picture suggested at k = 6 by Kontsevich and Willwacher, we construct the heptagon-wheel cocycle that yields a new unique solution without 2-loops and tadpoles at k = 8.

  2. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the uncertainties existing in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although an analytical approach can reduce the amount of computation in probabilistic SSSA, the different degrees of approximation that are adopted lead to deceptive results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)

  3. Inference of Dim Gamma-Ray Point Sources Using Probabilistic Catalogues

    Science.gov (United States)

    Daylan, Tansu; Portillo, Stephen K. N.; Finkbeiner, Douglas P.

    2016-07-01

    Poisson regression of the Fermi-LAT data in the inner Milky Way reveals an extended gamma-ray excess. The anomalous emission falls steeply away from the galactic center and has an energy spectrum that peaks at 1-2 GeV. An important question is whether the signal is coming from a collection of unresolved point sources, possibly recycled pulsars, or constitutes a truly diffuse emission component. Previous analyses have relied on non-Poissonian template fits or wavelet decomposition of the Fermi-LAT data, which find evidence for a population of dim point sources just below the 3FGL flux limit. In order to draw conclusions about a potentially dim population, we propose to sample from the catalog space of point sources, where the model dimensionality, i.e., the number of sources, is unknown. Although this is a computationally expensive sampling problem, the approach allows us to infer the number, flux and radial distribution of the point sources consistent with the observed count data. Probabilistic cataloging is especially useful in the crowded-field limit, such as in the galactic disk, where the typical separation between point sources is comparable to the PSF. Using this approach, we recover the results of the deterministic Fermi-LAT 3FGL catalog, as well as sub-detection-threshold information, and fold the point-source parameter degeneracies into the model-choice problem of whether the emission is coming from unresolved MSPs or dark matter annihilation.

  4. An alternating minimization method for blind deconvolution from Poisson data

    International Nuclear Information System (INIS)

    Prato, Marco; La Camera, Andrea; Bonettini, Silvia

    2014-01-01

    Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system have to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters

  5. On a Poisson homogeneous space of bilinear forms with a Poisson-Lie action

    Science.gov (United States)

    Chekhov, L. O.; Mazzocco, M.

    2017-12-01

    Let $\mathscr{A}$ be the space of bilinear forms on $\mathbb{C}^N$ with defining matrices $A$, endowed with a quadratic Poisson structure of reflection equation type. The paper begins with a short description of previous studies of the structure, and then this structure is extended to systems of bilinear forms whose dynamics is governed by the natural action $A \mapsto B A B^T$ of the $GL_N$ Poisson-Lie group on $\mathscr{A}$. A classification is given of all possible quadratic brackets on $(B, A) \in GL_N \times \mathscr{A}$ preserving the Poisson property of the action, thus endowing $\mathscr{A}$ with the structure of a Poisson homogeneous space. Besides the product Poisson structure on $GL_N \times \mathscr{A}$, there are two other (mutually dual) structures, which (unlike the product Poisson structure) admit reductions by the Dirac procedure to a space of bilinear forms with block upper triangular defining matrices. Further generalisations of this construction are considered, to triples $(B, C, A) \in GL_N \times GL_N \times \mathscr{A}$ with the Poisson action $A \mapsto B A C^T$, and it is shown that $\mathscr{A}$ then acquires the structure of a Poisson symmetric space. Generalisations to chains of transformations and to the quantum and quantum affine algebras are investigated, as well as the relations between constructions of Poisson symmetric spaces and the Poisson groupoid. Bibliography: 30 titles.

  6. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  7. Application of Negative Binomial Regression to Handle Overdispersion in Poisson Regression (Penerapan Regresi Binomial Negatif untuk Mengatasi Overdispersi pada Regresi Poisson)

    Directory of Open Access Journals (Sweden)

    PUTU SUSAN PRADAWATI

    2013-09-01

    Poisson regression is used to analyze count data that are Poisson distributed. Poisson regression analysis requires equidispersion, i.e., that the mean of the response variable equals its variance. However, deviations occur in which the variance of the response variable is greater than the mean; this is called overdispersion. If overdispersion is present and Poisson regression is used anyway, the estimated standard errors will be too small. Negative binomial regression can handle overdispersion because it contains a dispersion parameter. For simulated data exhibiting overdispersion, the negative binomial regression model was found to perform better than the Poisson regression model.
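
    The comparison the article describes can be sketched on simulated overdispersed counts; the data-generating mechanism, the fixed dispersion parameter and all numbers below are assumptions for illustration (in practice the dispersion parameter would itself be estimated):

```python
# Poisson vs. negative binomial fits on overdispersed count data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)
mu = np.exp(1.0 + 0.5 * x)
# gamma-mixed Poisson counts are negative binomial, hence overdispersed
y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

X = sm.add_constant(x)
pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print("Poisson SEs:", pois.bse.round(4))  # understated under overdispersion
print("NegBin SEs: ", nb.bse.round(4))
```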

  8. Probabilistic assessment of seismic risk in urban areas

    OpenAIRE

    Aguilar, Armando; Pujades Beneit, Lluís; Barbat Barbat, Horia Alejandro; Ordaz Schroder, Mario Gustavo

    2008-01-01

    A probabilistic approach to estimate the expected seismic physical damage of existing buildings in urban areas is presented. The main steps of the procedure are seismic hazard, vulnerability and structural response. These three steps are undertaken with a fully probabilistic point of view. Seismic hazard is assessed by means of the annual rate of exceedance of a parameter quantifying the expected seismic action. Buildings may be characterized by a class in a building typology matrix and/or by...

  9. Probabilistic statistical modeling of air pollution from vehicles

    Science.gov (United States)

    Adikanova, Saltanat; Malgazhdarov, Yerzhan A.; Madiyarov, Muratkan N.; Temirbekov, Nurlan M.

    2017-09-01

    The aim of this work is to create a probabilistic-statistical mathematical model for the distribution of emissions from vehicles. We propose using a probabilistic and statistical approach to model the distribution of harmful impurities in the atmosphere from vehicles, with the city of Ust-Kamenogorsk as an example. Using a simplified stochastic modeling methodology, it is possible to construct effective numerical algorithms that significantly reduce the amount of computation without losing accuracy.

  10. Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs

    OpenAIRE

    Mansinghka, Vikash K.; Kulkarni, Tejas D.; Perov, Yura N.; Tenenbaum, Joshua B.

    2013-01-01

    The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to automatically invert them to interpret real-world images. Generative probabilistic graphics program...

  11. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  12. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and, thus, can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.
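
    The independent-thinning case described in both records above is easy to sketch: retaining each point of an inhomogeneous Poisson process with probability c/λ(x) leaves a homogeneous Poisson process of rate c. The intensity function and constants below are assumptions for illustration; the dependent thinning used for Markov point processes is substantially more involved:

```python
# Thin an inhomogeneous Poisson process on [0,1]^2 into a homogeneous one.
import numpy as np

rng = np.random.default_rng(42)
lam_max = 200.0
lam = lambda x, y: 50.0 + 150.0 * x   # assumed intensity (<= lam_max)

# simulate the inhomogeneous process by thinning a dominating homogeneous one
n = rng.poisson(lam_max)              # unit area, so mean count = lam_max
pts = rng.random((n, 2))
keep = rng.random(n) < lam(pts[:, 0], pts[:, 1]) / lam_max
pts = pts[keep]

# now thin into a homogeneous Poisson process of rate c = 50
c = 50.0
retain = rng.random(len(pts)) < c / lam(pts[:, 0], pts[:, 1])
thinned = pts[retain]
print(len(thinned), "points; expected about", c)  # ~ Poisson(50)
```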

  13. Compound Poisson Approximations for Sums of Random Variables

    OpenAIRE

    Serfozo, Richard F.

    1986-01-01

    We show that a sum of dependent random variables is approximately compound Poisson when the variables are rarely nonzero and, given they are nonzero, their conditional distributions are nearly identical. We give several upper bounds on the total-variation distance between the distribution of such a sum and a compound Poisson distribution. Included is an example for Markovian occurrences of a rare event. Our bounds are consistent with those that are known for Poisson approximations for sums of...

  14. Probabilistic studies for safety at optimum cost

    International Nuclear Information System (INIS)

    Pitner, P.

    1999-01-01

    By definition, the risk of failure of very reliable components is difficult to evaluate. How can the best strategies for in-service inspection and maintenance be defined so as to limit this risk to an acceptable level at optimum cost? It is not sufficient to design structures with margins; it is also essential to understand how they age. The probabilistic approach has made it possible to develop well-proven concepts. (author)

  15. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  16. Benthic invertebrate exposure and chronic toxicity risk analysis for cyclic volatile methylsiloxanes: Comparison of hazard quotient and probabilistic risk assessment approaches.

    Science.gov (United States)

    Woodburn, Kent B; Seston, Rita M; Kim, Jaeshin; Powell, David E

    2018-02-01

    This study utilized probabilistic risk assessment techniques to compare field sediment concentrations of the cyclic volatile methylsiloxane (cVMS) materials octamethylcyclotetrasiloxane (D4, CAS # 556-67-2), decamethylcyclopentasiloxane (D5, CAS # 541-02-6), and dodecamethylcyclohexasiloxane (D6, CAS # 540-97-6) to effect levels for these compounds determined in laboratory chronic toxicity tests with benthic organisms. The concentration data for D4/D5/D6 in sediment were individually sorted and the 95th centile concentrations determined in sediment on an organic carbon (OC) fugacity basis. These concentrations were then compared to interpolated 5th centile benthic sediment no-observed effect concentration (NOEC) fugacity levels, calculated from a distribution of chronic D4/D5/D6 toxicologic assays per OECD guidelines using a variety of standard benthic species. The benthic invertebrate fugacity biota NOEC values were then compared to field-measured invertebrate biota fugacity levels to see if risk assessment evaluations were similar on a field sediment and field biota basis. No overlap was noted for D4 and D5 95th centile sediment and biota fugacity levels and their respective 5th centile benthic organism NOEC values. For D6, there was a small level of overlap at the exposure 95th centile sediment fugacity and the 5th centile benthic organism NOEC fugacity value; the sediment fugacities indicate that a negligible risk (1%) exists for benthic species exposed to D6. In contrast, there was no indication of risk when the field invertebrate exposure 95th centile biota fugacity and the 5th centile benthic organism NOEC fugacity values were compared. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. PCAT: Probabilistic Cataloger

    Science.gov (United States)

    Daylan, Tansu; Portillo, K. N. Stephen; Finkbeiner, Douglas P.

    2017-05-01

    PCAT (Probabilistic Cataloger) samples from the posterior distribution of a metamodel, i.e., union of models with different dimensionality, to compare the models. This is achieved via transdimensional proposals such as births, deaths, splits and merges in addition to the within-model proposals. This method avoids noisy estimates of the Bayesian evidence that may not reliably distinguish models when sampling from the posterior probability distribution of each model. The code has been applied in two different subfields of astronomy: high energy photometry, where transdimensional elements are gamma-ray point sources; and strong lensing, where light-deflecting dark matter subhalos take the role of transdimensional elements.

  18. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licensing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems encountered and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions on specific problems. The report at hand presents the essentials of a 'stocktaking' of systems reliability studies in the licensing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de]

  19. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment...

  20. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment...

  1. Poisson-Boltzmann versus Size-Modified Poisson-Boltzmann Electrostatics Applied to Lipid Bilayers.

    Science.gov (United States)

    Wang, Nuo; Zhou, Shenggao; Kekenes-Huskey, Peter M; Li, Bo; McCammon, J Andrew

    2014-12-26

    Mean-field methods, such as the Poisson-Boltzmann equation (PBE), are often used to calculate the electrostatic properties of molecular systems. In the past two decades, an enhancement of the PBE, the size-modified Poisson-Boltzmann equation (SMPBE), has been reported. Here, the PBE and the SMPBE are reevaluated for realistic molecular systems, namely, lipid bilayers, under eight different sets of input parameters. The SMPBE appears to reproduce the molecular dynamics simulation results better than the PBE only under specific parameter sets, but in general, it performs no better than the Stern layer correction of the PBE. These results emphasize the need for careful discussions of the accuracy of mean-field calculations on realistic systems with respect to the choice of parameters and call for reconsideration of the cost-efficiency and the significance of the current SMPBE formulation.

  2. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.

  3. Probabilistic population aging.

    Science.gov (United States)

    Sanderson, Warren C; Scherbov, Sergei; Gerland, Patrick

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent.

  4. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  5. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. Many methods were developed in the framework of this program. As interest in these studies increased and suitable methods became available, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant. [fr]

  6. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  7. Perturbation-induced emergence of Poisson-like behavior in non-Poisson systems

    International Nuclear Information System (INIS)

    Akin, Osman C; Grigolini, Paolo; Paradisi, Paolo

    2009-01-01

    The response of a system with ON–OFF intermittency to an external harmonic perturbation is discussed. ON–OFF intermittency is described by means of a sequence of random events, i.e., the transitions from the ON to the OFF state and vice versa. The unperturbed waiting times (WTs) between two events are assumed to satisfy a renewal condition, i.e., the WTs are statistically independent random variables. The response of a renewal model with non-Poisson ON–OFF intermittency, associated with non-exponential WT distribution, is analyzed by looking at the changes induced in the WT statistical distribution by the harmonic perturbation. The scaling properties are also studied by means of diffusion entropy analysis. It is found that, in the range of fast and relatively strong perturbation, the non-Poisson system displays a Poisson-like behavior in both WT distribution and scaling. In particular, the histogram of perturbed WTs becomes a sequence of equally spaced peaks, with intensity decaying exponentially in time. Further, the diffusion entropy detects an ordinary scaling (related to normal diffusion) instead of the expected unperturbed anomalous scaling related to the inverse power-law decay. Thus, an analysis based on the WT histogram and/or on scaling methods has to be considered with some care when dealing with perturbed intermittent systems

  8. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to implement the semi-analytical methods efficiently using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in a symbolic computing environment. The response function method belongs to the third group, where the interplay of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. In this context we recover the probabilistic structural response of engineering systems and show how to solve partial differential equations with Gaussian randomness in their coefficients.

  9. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.

  10. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world, mainly because of the strength of this approach in achieving safe and reliable operation of research reactors. There is also a need to assist developing countries in applying Probabilistic Safety Assessment to existing nuclear facilities, which are simpler and therefore less complicated to analyse than a large nuclear power plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on so-called systems analysis aspects rather than on in-plant or ex-plant consequences.

  11. Probabilistic precursor analysis - an application of PSA

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Gopika, V.; Sanyasi Rao, V.V.S.; Vaze, K.K.

    2011-01-01

    Incidents are inevitably part of the operational life of any complex industrial facility, and it is hard to predict how various contributing factors combine to cause the outcome. However, it should be possible to detect the existence of latent conditions that, together with the triggering failure(s), result in abnormal events. Such incidents are called precursors. Precursor study, by definition, focuses on how a particular event might have developed adversely. This paper focuses on events which can be analyzed to assess their potential to develop into a core damage situation, looks into extending Probabilistic Safety Assessment techniques to precursor studies, and explains the benefits through a typical case study. A preliminary probabilistic precursor analysis has been carried out for a typical NPP. The major advantage of this approach is its strong potential for augmenting event analysis, which is currently carried out on a purely deterministic basis. (author)

  12. Exact and Approximate Probabilistic Symbolic Execution

    Science.gov (United States)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  13. On the Coherence of Probabilistic Relational Formalisms

    Directory of Open Access Journals (Sweden)

    Glauber De Bona

    2018-03-01

    There are several formalisms that enhance Bayesian networks by including relations amongst individuals as modeling primitives. For instance, Probabilistic Relational Models (PRMs) use diagrams and relational databases to represent repetitive Bayesian networks, while Relational Bayesian Networks (RBNs) employ first-order probability formulas with the same purpose. We examine the coherence checking problem for those formalisms; that is, the problem of guaranteeing that any grounding of a well-formed set of sentences does produce a valid Bayesian network. This is a novel version of de Finetti's problem of coherence checking for probabilistic assessments. We show how to reduce the coherence checking problem in relational Bayesian networks to a validity problem in first-order logic augmented with a transitive closure operator, and how to combine this logic-based approach with faster, but incomplete, algorithms.

  14. Probabilistic graphlet transfer for photo cropping.

    Science.gov (United States)

    Zhang, Luming; Song, Mingli; Zhao, Qi; Liu, Xiao; Bu, Jiajun; Chen, Chun

    2013-02-01

    As one of the most basic photo manipulation processes, photo cropping is widely used in the printing, graphic design, and photography industries. In this paper, we introduce graphlets (i.e., small connected subgraphs) to represent a photo's aesthetic features, and propose a probabilistic model to transfer aesthetic features from the training photo onto the cropped photo. In particular, by segmenting each photo into a set of regions, we construct a region adjacency graph (RAG) to represent the global aesthetic feature of each photo. Graphlets are then extracted from the RAGs, and these graphlets capture the local aesthetic features of the photos. Finally, we cast photo cropping as a candidate-searching procedure on the basis of a probabilistic model, and infer the parameters of the cropped photos using Gibbs sampling. The proposed method is fully automatic. Subjective evaluations have shown that it is preferred over a number of existing approaches.

  15. Poisson cohomology of scalar multidimensional Dubrovin-Novikov brackets

    Science.gov (United States)

    Carlet, Guido; Casati, Matteo; Shadrin, Sergey

    2017-04-01

    We compute the Poisson cohomology of a scalar Poisson bracket of Dubrovin-Novikov type with D independent variables. We find that the second and third cohomology groups are generically non-vanishing in D > 1. Hence, in contrast with the D = 1 case, the deformation theory in the multivariable case is non-trivial.

  16. Avoiding negative populations in explicit Poisson tau-leaping.

    Science.gov (United States)

    Cao, Yang; Gillespie, Daniel T; Petzold, Linda R

    2005-08-01

    The explicit tau-leaping procedure attempts to speed up the stochastic simulation of a chemically reacting system by approximating the number of firings of each reaction channel during a chosen time increment tau as a Poisson random variable. Since the Poisson random variable can have arbitrarily large sample values, there is always the possibility that this procedure will cause one or more reaction channels to fire so many times during tau that the population of some reactant species will be driven negative. Two recent papers have shown how that unacceptable occurrence can be avoided by replacing the Poisson random variables with binomial random variables, whose values are naturally bounded. This paper describes a modified Poisson tau-leaping procedure that also avoids negative populations, but is easier to implement than the binomial procedure. The new Poisson procedure also introduces a second control parameter, whose value essentially dials the procedure from the original Poisson tau-leaping at one extreme to the exact stochastic simulation algorithm at the other; therefore, the modified Poisson procedure will generally be more accurate than the original Poisson procedure.
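
    A toy version of the idea, on the reversible dimerization system common in this literature: channels within an assumed threshold n_c of firings of exhausting a reactant are treated as critical and fired at most once per step, while the remaining channels are leapt with Poisson counts. The fixed tau and all parameters are assumptions, and the paper's adaptive step-size selection is omitted:

```python
# Sketch of a negative-population-avoiding Poisson tau-leap.
import numpy as np

rng = np.random.default_rng(0)
# S1 + S1 -> S2 (rate c1), S2 -> S1 + S1 (rate c2)
c1, c2 = 0.005, 0.1
x = np.array([400, 0])                    # copy numbers of S1, S2
stoich = np.array([[-2, 1], [2, -1]])     # state change per reaction firing
n_c, tau, t, t_end = 10, 0.05, 0.0, 5.0   # criticality threshold, fixed leap

def propensities(x):
    return np.array([c1 * x[0] * (x[0] - 1) / 2.0, c2 * x[1]])

while t < t_end:
    a = propensities(x)
    if a.sum() == 0.0:
        break
    # max firings of each reaction before some reactant is exhausted
    limit = np.min(np.where(stoich < 0, x / np.abs(stoich), np.inf), axis=1)
    critical = (a > 0) & (limit < n_c)
    k = rng.poisson(a * tau) * ~critical  # leap only non-critical channels
    if critical.any():                    # fire at most one critical reaction
        a_c = np.where(critical, a, 0.0)
        if rng.random() < 1.0 - np.exp(-a_c.sum() * tau):
            j = rng.choice(len(a), p=a_c / a_c.sum())
            k[j] += 1
    x = x + k @ stoich
    t += tau

print(f"t = {t:.2f}, S1 = {x[0]}, S2 = {x[1]}")
```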

  17. Estimation of a Non-homogeneous Poisson Model: An Empirical ...

    African Journals Online (AJOL)

    This article aims at applying the nonhomogeneous Poisson process to trends of economic development. For this purpose, a modified nonhomogeneous Poisson process is derived in which the intensity rate is taken to be the solution of a stochastic differential equation satisfying geometric Brownian motion. The mean ...
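
    One way to make the construction concrete is to simulate such a process by thinning: draw one path of the GBM intensity, then accept points of a dominating homogeneous process with probability λ(t)/λ_max. The drift, volatility and initial intensity below are assumed purely for illustration:

```python
# NHPP with a geometric-Brownian-motion intensity, via Lewis/Ogata thinning.
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, lam0, T = 0.05, 0.2, 10.0, 10.0   # GBM drift, vol, lambda(0), horizon

# one realized path of the GBM intensity on a fine grid
dt = 1e-3
t_grid = np.arange(0.0, T + dt, dt)
dW = rng.normal(0.0, np.sqrt(dt), len(t_grid) - 1)
bm = np.concatenate([[0.0], np.cumsum(dW)])
lam = lam0 * np.exp((mu - 0.5 * sigma**2) * t_grid + sigma * bm)

# thinning against a homogeneous dominating rate lam_max
lam_max = lam.max()
n_cand = rng.poisson(lam_max * T)
cand = np.sort(rng.uniform(0.0, T, n_cand))
keep = rng.random(n_cand) < np.interp(cand, t_grid, lam) / lam_max
events = cand[keep]
print(len(events), "events; integrated intensity ~", lam.sum() * dt)
```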

  18. Formulation of Hamiltonian mechanics with even and odd Poisson brackets

    International Nuclear Information System (INIS)

    Khudaverdyan, O.M.; Nersesyan, A.P.

    1987-01-01

    A possibility is studied of constructing, from the given dynamics in phase superspace (the even Poisson bracket and even Hamiltonian), an odd Poisson bracket and odd Hamiltonian such that the transition to the new structure does not change the equations of motion. 9 refs

  19. Cluster X-varieties, amalgamation, and Poisson-Lie groups

    DEFF Research Database (Denmark)

    Fock, V. V.; Goncharov, A. B.

    2006-01-01

    In this paper, starting from a split semisimple real Lie group G with trivial center, we define a family of varieties with additional structures. We describe them as cluster χ-varieties, as defined in [FG2]. In particular they are Poisson varieties. We define canonical Poisson maps of these varie...

  20. Derivation of relativistic wave equation from the Poisson process

    Indian Academy of Sciences (India)

    A Poisson process is one of the fundamental descriptions for relativistic particles: both fermions and bosons. A generalized linear photon wave equation in dispersive and homogeneous medium with dissipation is derived using the formulation of the Poisson process. This formulation provides a possible ...

  1. Probabilistic description of traffic flow

    International Nuclear Information System (INIS)

    Mahnke, R.; Kaupuzs, J.; Lubashevsky, I.

    2005-01-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given
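
    The one-step (birth-death) picture of cluster growth described above lends itself to a few-line Gillespie-type simulation. The attachment and detachment rate forms and the parameters below are illustrative assumptions, not fitted values from the paper:

```python
# One-step master-equation dynamics of a single car cluster on a ring.
import numpy as np

rng = np.random.default_rng(7)
N_cars = 60                  # cars on the closed ring
nu, tau0 = 0.02, 2.0         # assumed attachment constant and escape time
n, t, t_end = 0, 0.0, 2000.0

w_plus = lambda m: nu * (N_cars - m)                 # attachment needs free cars
w_minus = lambda m: (1.0 / tau0) if m > 0 else 0.0   # detachment from the jam

sizes = []
while t < t_end:
    rates = np.array([w_plus(n), w_minus(n)])
    total = rates.sum()
    t += rng.exponential(1.0 / total)                # time to next event
    n += 1 if rng.random() < rates[0] / total else -1
    sizes.append(n)

print("mean cluster size:", np.mean(sizes))          # ~35 with these rates
```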

  2. Unimodularity criteria for Poisson structures on foliated manifolds

    Science.gov (United States)

    Pedroza, Andrés; Velasco-Barreras, Eduardo; Vorobiev, Yury

    2018-03-01

    We study the behavior of the modular class of an orientable Poisson manifold and formulate some unimodularity criteria in the semilocal context, around a (singular) symplectic leaf. Our results generalize some known unimodularity criteria for regular Poisson manifolds related to the notion of the Reeb class. In particular, we show that the unimodularity of the transverse Poisson structure of the leaf is a necessary condition for the semilocal unimodular property. Our main tool is an explicit formula for a bigraded decomposition of modular vector fields of a coupling Poisson structure on a foliated manifold. Moreover, we also exploit the notion of the modular class of a Poisson foliation and its relationship with the Reeb class.

  3. Probabilistic safety assessment improves surveillance requirements in technical specifications

    International Nuclear Information System (INIS)

    Cepin, M.; Mavko, B.

    1997-01-01

    Probabilistic safety assessment is rapidly becoming the standard method for assessing, maintaining, assuring and improving nuclear power plant safety. To achieve one of its many potential benefits, an approach for optimizing surveillance requirements in technical specifications was developed. Surveillance requirements in technical specifications define the surveillance test intervals for the equipment to be tested and the testing strategy. This optimization approach, based mainly on probabilistic safety assessment results, consists of three levels: component level, system level and plant level. Application of this optimization approach at the system level has shown that the risk-based surveillance requirements differ from the existing ones in technical specifications.

  4. Probabilistic sensory recoding.

    Science.gov (United States)

    Jazayeri, Mehrdad

    2008-08-01

    A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility however necessitates a recoding scheme that would put sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicate that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.

  5. Probabilistic quantum multimeters

    Science.gov (United States)

    Fiurášek, Jaromír; Dušek, Miloslav

    2004-03-01

    We propose quantum devices that can realize probabilistically different projective measurements on a qubit. The desired measurement basis is selected by the quantum state of a program register. First we analyze the phase-covariant multimeters for a large class of program states and then the universal multimeters for a special choice of program. In both cases we start with deterministic but error-prone devices and then proceed to devices that never make a mistake but from time to time give an inconclusive result. These multimeters are optimized (for a given type of program) with respect to the minimum probability of an inconclusive result. This concept is further generalized to multimeters that minimize the error rate for a given probability of an inconclusive result (or vice versa). Finally, we propose a generalization for qudits.

  6. A probabilistic approach to assess antibiotic resistance development risks in environmental compartments and its application to an intensive aquaculture production scenario

    NARCIS (Netherlands)

    Rico, Andreu; Jacobs, Rianne; Brink, Van den Paul J.; Tello, Alfredo

    2017-01-01

    Estimating antibiotic pollution and antibiotic resistance development risks in environmental compartments is important to design management strategies that advance our stewardship of antibiotics. In this study we propose a modelling approach to estimate the risk of antibiotic resistance development

  7. A probabilistic approach to assess antibiotic resistance development risks in environmental compartments and its application to an intensive aquaculture production scenario.

    NARCIS (Netherlands)

    Rico, Andreu; Jacobs, Rianne; Van den Brink, Paul J; Tello, Alfredo

    2017-01-01

    Estimating antibiotic pollution and antibiotic resistance development risks in environmental compartments is important to design management strategies that advance our stewardship of antibiotics. In this study we propose a modelling approach to estimate the risk of antibiotic resistance development

  8. Non-isothermal Smoluchowski-Poisson equation as a singular limit of the Navier-Stokes-Fourier-Poisson system

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Laurençot, P.

    2007-01-01

    Roč. 88, - (2007), s. 325-349 ISSN 0021-7824 R&D Projects: GA ČR GA201/05/0164 Institutional research plan: CEZ:AV0Z10190503 Keywords: Navier-Stokes-Fourier-Poisson system * Smoluchowski-Poisson system * singular limit Subject RIV: BA - General Mathematics Impact factor: 1.118, year: 2007

  9. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  10. Canonical form of Nambu–Poisson bracket: A pedestrian approach

    Indian Academy of Sciences (India)

    Therefore, Hamilton's equations for the coordinates and, by the derivation property, for a general dynamical variable F are equivalent to the Nambu form. We also have the liberty of going back to the original canonical coordinates {x}, because the transformation {x} ↔ {x'} has unit Jacobian, and we see that Hamilton's ...

  11. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  12. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
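
    The aggregation scheme can be caricatured in a few lines: sample a synthetic catalog of disturbances from assumed parameter distributions, push each through a toy Proudman-style amplitude relation in place of the study's hydrodynamic model, and count annual exceedances. Every distribution and constant below is invented for illustration:

```python
# Monte Carlo hazard curve for meteotsunami-like events (schematic only).
import numpy as np

rng = np.random.default_rng(11)
rate_per_year = 6.0         # assumed Poisson rate of squall-line events
years = 10_000              # synthetic catalog length
n_events = rng.poisson(rate_per_year * years)

speed = rng.normal(22.0, 5.0, n_events)          # disturbance speed, m/s
dp = rng.lognormal(np.log(1.5), 0.5, n_events)   # pressure jump, hPa

# toy Proudman-style response: amplification peaks when the disturbance
# speed matches the long-wave speed sqrt(g*h) over an assumed shelf depth
g, h = 9.81, 50.0
froude = speed / np.sqrt(g * h)
amp = 0.01 * dp / np.clip(np.abs(1.0 - froude**2), 0.05, None)  # metres

levels = np.linspace(0.01, amp.max(), 200)
annual_rate = np.array([(amp > a).sum() / years for a in levels])
print("annualized rate of exceeding 0.3 m:",
      np.interp(0.3, levels, annual_rate))
```

    Resampling the whole catalog many times, as the paper does, would turn this single curve into mean and quantile hazard curves.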

  13. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, owing to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise for the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks component-wise using Poisson mixture regression models. PMID:27999611
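
    A compact EM sketch of the covariate-free special case of the models compared above, a two-component Poisson mixture fit by maximum likelihood with a BIC readout; the synthetic counts are illustrative, not the paper's heart-disease data.

        import numpy as np
        from scipy.special import gammaln

        rng = np.random.default_rng(0)
        # Synthetic counts from two latent risk groups (illustrative only).
        y = np.concatenate([rng.poisson(2.0, 300), rng.poisson(9.0, 200)])

        # EM for a two-component Poisson mixture.
        lam = np.array([1.0, 5.0])   # component means
        pi = np.array([0.5, 0.5])    # mixing weights
        for _ in range(200):
            # E-step: log joint density of each count under each component.
            log_p = (y[:, None] * np.log(lam) - lam
                     - gammaln(y[:, None] + 1) + np.log(pi))
            log_norm = np.logaddexp(log_p[:, 0], log_p[:, 1])
            resp = np.exp(log_p - log_norm[:, None])   # responsibilities
            # M-step: update weights and component means.
            pi = resp.mean(axis=0)
            lam = (resp * y[:, None]).sum(axis=0) / resp.sum(axis=0)

        loglik = log_norm.sum()
        bic = -2 * loglik + 3 * np.log(len(y))   # 3 free parameters
        print("weights:", pi.round(3), "means:", lam.round(3), "BIC:", round(bic, 1))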

  14. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle): cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
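
    A minimal Monte Carlo sketch of the MAIMS idea described above: sampled cost elements that come in under their allocated budget are rounded up to the budget, while overruns pass through to the project total. The three-parameter Weibull settings and budgets are invented, and the correlations between elements are omitted for brevity.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(42)
        n = 50_000

        # Three-parameter Weibull (shape, location, scale) per cost element,
        # with an allocated budget for each -- all numbers are made up.
        elements = [
            (1.8, 8.0, 4.0, 10.0),   # (shape, loc, scale, budget)
            (2.2, 15.0, 6.0, 18.0),
            (1.5, 4.0, 3.0, 6.0),
        ]

        total = np.zeros(n)
        for shape, loc, scale, budget in elements:
            cost = weibull_min.rvs(shape, loc=loc, scale=scale, size=n,
                                   random_state=rng)
            # MAIMS: money allocated is money spent -- underruns are not
            # recovered, overruns are passed on to the project total.
            total += np.maximum(cost, budget)

        allocated = sum(b for *_, b in elements)
        print(f"allocated total budget: {allocated:.1f}")
        print(f"median project cost:    {np.median(total):.1f}")
        print(f"80th percentile cost:   {np.percentile(total, 80):.1f}")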

  15. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  16. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
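
    As one concrete instance of the scoring-rule decomposition mentioned above, a short sketch of the binned Murphy decomposition of the Brier score into reliability, resolution, and uncertainty; the forecast data and binning are made up, and the decomposition is exact only when all forecasts within a bin share one value.

        import numpy as np

        rng = np.random.default_rng(7)
        # Made-up probabilistic forecasts and observed disease outcomes.
        p = rng.uniform(0, 1, 2000)
        y = (rng.uniform(0, 1, 2000) < 0.8 * p + 0.1).astype(float)

        brier = np.mean((p - y) ** 2)

        # Murphy decomposition over forecast bins: BS ~ REL - RES + UNC.
        bins = np.linspace(0, 1, 11)
        idx = np.digitize(p, bins[1:-1])
        ybar = y.mean()
        rel = res = 0.0
        for k in range(10):
            m = idx == k
            if m.any():
                w = m.mean()
                rel += w * (p[m].mean() - y[m].mean()) ** 2   # reliability
                res += w * (y[m].mean() - ybar) ** 2          # resolution
        unc = ybar * (1 - ybar)                               # uncertainty

        print(f"Brier {brier:.4f} ~ REL {rel:.4f} - RES {res:.4f} + UNC {unc:.4f}")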

  17. Computing Distances between Probabilistic Automata

    Directory of Open Access Journals (Sweden)

    Mathieu Tracol

    2011-07-01

    We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PAs) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is not expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.

  18. Boundary Lax pairs from non-ultra-local Poisson algebras

    International Nuclear Information System (INIS)

    Avan, Jean; Doikou, Anastasia

    2009-01-01

    We consider non-ultra-local linear Poisson algebras on a continuous line. Suitable combinations of representations of these algebras yield representations of novel generalized linear Poisson algebras or 'boundary' extensions. They are parametrized by a boundary scalar matrix and depend, in addition, on the choice of an antiautomorphism. The new algebras are the classical-linear counterparts of the known quadratic quantum boundary algebras. For any choice of parameters, the non-ultra-local contribution of the original Poisson algebra disappears. We also systematically construct the associated classical Lax pair. The classical boundary principal chiral model is examined as a physical example.

  19. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    Science.gov (United States)

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical

  20. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    A new type of Fuzzy Inference System is proposed: a Probabilistic Fuzzy Inference System which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out in both fuzzy sets and fuzzy rules, which gives rise to probabilistic fuzzy sets and probabilistic fuzzy rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for all of the uncertainty present in the system, such as rule uncertainties and measurement uncertainties, and leads to a design which performs optimally after training. In this paper a Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.

  1. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider threats, based on probabilistic modelling, verification, and synthesis techniques. The framework first identifies insiders' intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using probabilistic ...

  2. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  3. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    Science.gov (United States)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-data-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climatic change. By contrast, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
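
    A toy Monte Carlo sketch of the parameter-variability step described above, using a simple infinite-slope factor of safety rather than the Morgenstern-Price method implemented in ALICE®; all soil parameters, their distributions, and the geometry are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Illustrative parameter distributions (not BRGM values).
        phi = np.radians(rng.normal(30.0, 3.0, n))    # friction angle
        c = rng.lognormal(np.log(5000.0), 0.4, n)     # cohesion, Pa
        m = rng.uniform(0.0, 1.0, n)                  # saturation ratio
        gamma = 19_000.0                              # soil unit weight, N/m^3
        gamma_w = 9_810.0                             # water unit weight, N/m^3
        beta = np.radians(35.0)                       # slope angle
        z = 2.0                                       # slip-surface depth, m

        # Infinite-slope factor of safety with pore-pressure reduction.
        fos = (c + (gamma * z - m * z * gamma_w) * np.cos(beta) ** 2
               * np.tan(phi)) / (gamma * z * np.sin(beta) * np.cos(beta))

        print(f"P(failure) = P(FS < 1) = {(fos < 1).mean():.3f}")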

  4. Studies on a Double Poisson-Geometric Insurance Risk Model with Interference

    Directory of Open Access Journals (Sweden)

    Yujuan Huang

    2013-01-01

    This paper mainly studies a generalized double Poisson-Geometric insurance risk model. Using a martingale and stopping-time approach, we obtain the adjustment-coefficient equation, the Lundberg inequality, and the formula for the ruin probability. The Laplace transform of the time when the surplus reaches a given level for the first time is also discussed, and its expectation and variance are obtained. Finally, we give numerical examples.
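
    A small simulation sketch of ruin probability in the classical compound Poisson (Cramér-Lundberg) risk process, a simpler relative of the double Poisson-Geometric model studied in the paper; the premium rate, claim distribution, and horizon are illustrative choices.

        import numpy as np

        rng = np.random.default_rng(11)

        u0 = 10.0         # initial surplus
        c = 1.2           # premium income per unit time
        lam = 1.0         # Poisson claim arrival rate
        horizon = 100.0   # finite-horizon approximation of ruin
        n_paths = 10_000

        ruined = 0
        for _ in range(n_paths):
            t, u = 0.0, u0
            while True:
                dt = rng.exponential(1.0 / lam)    # time to next claim
                t += dt
                if t >= horizon:
                    break
                u += c * dt                        # premiums since last claim
                u -= rng.exponential(1.0)          # exponential(mean 1) claim
                if u < 0.0:
                    ruined += 1
                    break

        print(f"simulated P(ruin by t={horizon:g}): {ruined / n_paths:.4f}")

        # Closed form for exponential(1) claims in the classical model.
        theta = c / lam - 1.0                      # relative safety loading
        psi = np.exp(-theta * u0 / (1.0 + theta)) / (1.0 + theta)
        print(f"analytic infinite-horizon ruin probability: {psi:.4f}")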

  5. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study gives attention to indicators of predictive power for the Generalized Linear Model (GLM), which are widely used but often subject to some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables, E(Y|X), for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and root mean square error (RMSE).
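
    A brief sketch of the baseline quantity being modified: the regression correlation coefficient computed as the correlation between Y and the fitted E(Y|X) from a Poisson GLM. The synthetic data and the use of statsmodels are illustrative; the paper's modified estimator is not reproduced here.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 1000
        X = rng.normal(size=(n, 2))
        mu = np.exp(0.3 + 0.5 * X[:, 0] - 0.4 * X[:, 1])   # true mean, log link
        y = rng.poisson(mu)

        model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
        fitted = model.fittedvalues                         # estimate of E(Y|X)

        # Regression correlation coefficient: corr(Y, E(Y|X)).
        r = np.corrcoef(y, fitted)[0, 1]
        print(f"regression correlation coefficient: {r:.3f}")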

  6. A high order solver for the unbounded Poisson equation

    DEFF Research Database (Denmark)

    Hejlesen, Mads Mølholm; Rasmussen, Johannes Tophøj; Chatelain, Philippe

    2013-01-01

    A high order converging Poisson solver is presented, based on the Green's function solution to Poisson's equation subject to free-space boundary conditions. The high order convergence is achieved by formulating regularised integration kernels, analogous to a smoothing of the solution field. ... The method is extended to directly solve the derivatives of the solution to Poisson's equation. In this way differential operators such as the divergence or curl of the solution field can be solved to the same high order convergence without additional computational effort. The method is applied and validated, however not restricted, to the equations of fluid mechanics, and can be used in many applications to solve Poisson's equation on a rectangular unbounded domain. ...
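
    A hedged, low-order illustration of the underlying idea rather than the paper's regularised high-order kernels: convolve the source with the 2D free-space Green's function via FFTs on a zero-padded (doubled) grid so the convolution stays aperiodic, with a crude approximation for the singular self-cell.

        import numpy as np

        # Solve  lap(phi) = -rho  on an unbounded 2D domain by convolving rho
        # with the free-space Green's function G(r) = -ln(r) / (2*pi).
        n = 128
        h = 1.0 / n
        x = (np.arange(n) + 0.5) * h
        X, Y = np.meshgrid(x, x, indexing="ij")
        rho = np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / 0.01)  # smooth source

        m = 2 * n
        k = np.arange(m)
        d = np.minimum(k, m - k)                 # wrapped grid distances
        R = h * np.hypot(d[:, None], d[None, :])
        R[0, 0] = 1.0                            # placeholder, overwritten below
        G = -np.log(R) / (2 * np.pi)
        # Crude value for the singular self-cell; the paper's regularised
        # kernels handle this rigorously and to high order.
        G[0, 0] = (1.0 - np.log(h)) / (2 * np.pi)

        rho_pad = np.pad(rho, ((0, n), (0, n)))
        phi = np.real(np.fft.ifft2(np.fft.fft2(rho_pad)
                                   * np.fft.fft2(G)))[:n, :n] * h * h

        # Sanity check: a 5-point Laplacian of phi should approximate -rho.
        lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
               np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi) / h**2
        print("max interior residual:", np.abs(lap + rho)[2:-2, 2:-2].max())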

  7. On the Poisson's ratio of the nucleus pulposus.

    Science.gov (United States)

    Farrell, M D; Riches, P E

    2013-10-01

    Existing experimental data on the Poisson's ratio of nucleus pulposus (NP) tissue is limited. This study aims to determine whether the Poisson's ratio of NP tissue is strain-dependent, strain-rate-dependent, or varies with axial location in the disk. Thirty-two cylindrical plugs of bovine tail NP tissue were subjected to ramp-hold unconfined compression to 20% axial strain in 5% increments, at either 30 μm/s or 0.3 μm/s ramp speeds and the radial displacement determined using biaxial video extensometry. Following radial recoil, the true Poisson's ratio of the solid phase of NP tissue increased linearly with increasing strain and demonstrated strain-rate dependency. The latter finding suggests that the solid matrix undergoes stress relaxation during the test. For small strains, we suggest a Poisson's ratio of 0.125 to be used in biphasic models of the intervertebral disk.

  8. A framework to assess the impacts of Climate Change for different hazards at local and regional scale through probabilistic multi-model approaches

    Science.gov (United States)

    Mercogliano, P.; Reder, A.; Rianna, G.

    2017-12-01

    Extreme weather events (EWEs) are projected to become more frequent and severe across the globe because of global warming. This poses challenging problems for critical infrastructures (CIs), which can be dramatically affected by EWEs and need adaptation countermeasures against changing climatic conditions. In this work, we present the main results achieved in the framework of the FP7 European project INTACT, aimed at analyzing the resilience of CIs against shocks and stresses due to climate change. To identify variations in the hazard induced by climate change, appropriate Extreme Weather Indicators (EWIs) are defined for several case studies, and different approaches are analyzed to obtain local climate projections. These approaches, of increasing refinement depending on the local information available and the methodologies selected, are investigated considering raw versus bias-corrected data and weighted versus equiprobable ensemble-mean projections from the regional climate models of the Euro-CORDEX program. Specifically, this work focuses on two case studies selected from the five proposed within the INTACT project and for which local station data are available: • rainfall-induced landslides affecting the Campania region (Southern Italy), with a special view on the Nocera municipality; • storms and heavy rainfall/winds in the port of Rotterdam (Netherlands). In general, our results show a small sensitivity to the weighting approach and a large sensitivity to bias correction in the future projections. For landslides in the Campania region, the Euro-CORDEX simulations projected a generalized worsening of safety conditions, depending on the scenario (RCP4.5/8.5) and period (2011-2040/2041-2070/2071-2100) considered. For the port of Rotterdam, the Euro-CORDEX simulations projected an increase in intense daily and weekly precipitation events, again depending on the scenario and period considered. Considering framework, methodologies and results, the
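
    A minimal sketch of one common flavour of the bias correction contrasted above with raw model output, empirical quantile mapping; the daily-precipitation arrays are synthetic stand-ins for station records and Euro-CORDEX series, and this is not necessarily the scheme used in INTACT.

        import numpy as np

        rng = np.random.default_rng(9)
        # Synthetic daily precipitation: observations and a biased model run.
        obs = rng.gamma(0.4, 8.0, 10_000)        # station record (mm/day)
        mod_hist = rng.gamma(0.4, 5.0, 10_000)   # model, historical period
        mod_fut = rng.gamma(0.45, 5.5, 10_000)   # model, future scenario

        def quantile_map(x, model_ref, obs_ref, n_q=99):
            """Empirical quantile mapping: replace each model value with the
            observed value at the same quantile of the historical model CDF."""
            q = np.linspace(1, 99, n_q)
            mq = np.percentile(model_ref, q)
            oq = np.percentile(obs_ref, q)
            return np.interp(x, mq, oq)          # linear between quantiles

        fut_corrected = quantile_map(mod_fut, mod_hist, obs)
        print(f"raw future mean:       {mod_fut.mean():.2f} mm/day")
        print(f"corrected future mean: {fut_corrected.mean():.2f} mm/day")
        print(f"observed mean:         {obs.mean():.2f} mm/day")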

  9. Spatial organisation of the fish assemblage in the Bandama ...

    African Journals Online (AJOL)

    The evolution of fish assemblages in the Bandama River was studied by considering four sampling zones: upstream of Lake Kossou, within Lakes Kossou and Taabo, between Lakes Kossou and Taabo, and downstream of Lake Taabo. In total, 74 fish species distributed among 49 genera and 28 families ...

  10. Formality theory from Poisson structures to deformation quantization

    CERN Document Server

    Esposito, Chiara

    2015-01-01

    This book is a survey of the theory of formal deformation quantization of Poisson manifolds, in the formalism developed by Kontsevich. It is intended as an educational introduction for mathematical physicists who are dealing with the subject for the first time. The main topics covered are the theory of Poisson manifolds, star products and their classification, deformations of associative algebras and the formality theorem. Readers will also be familiarized with the relevant physical motivations underlying the purely mathematical construction.

  11. Poisson structure of the equations of ideal multispecies fluid electrodynamics

    International Nuclear Information System (INIS)

    Spencer, R.G.

    1984-01-01

    The equations of the two- (or multi-) fluid model of plasma physics are recast in Hamiltonian form, following general methods of symplectic geometry. The dynamical variables are the fields of physical interest, but are noncanonical, so that the Poisson bracket in the theory is not the standard one. However, it is a skew-symmetric bilinear form which, from the method of derivation, automatically satisfies the Jacobi identity; therefore, this noncanonical structure has all the essential properties of a canonical Poisson bracket.

  12. On the Fedosov deformation quantization beyond the regular Poisson manifolds

    International Nuclear Information System (INIS)

    Dolgushev, V.A.; Isaev, A.P.; Lyakhovich, S.L.; Sharapov, A.A.

    2002-01-01

    A simple iterative procedure is suggested for the deformation quantization of (irregular) Poisson brackets associated to the classical Yang-Baxter equation. The construction is shown to admit a pure algebraic reformulation giving the Universal Deformation Formula (UDF) for any triangular Lie bialgebra. A simple proof of the classification theorem for inequivalent UDFs is given. As an example, the explicit quantization formula is presented for the quasi-homogeneous Poisson brackets on the two-plane.

  13. A Note On the Estimation of the Poisson Parameter

    Directory of Open Access Journals (Sweden)

    S. S. Chitgopekar

    1985-01-01

    This note studies the estimation of the parameter of a Poisson distribution when there are errors in observing the zeros and ones, and obtains both the maximum likelihood and moment estimates of the Poisson mean and the error probabilities. It is interesting to note that either method fails to give unique estimates of these parameters unless the error probabilities are functionally related. However, it is equally interesting to observe that the estimate of the Poisson mean does not depend on the functional relationship between the error probabilities.

  14. Probabilistic analysis of fires in nuclear plants

    International Nuclear Information System (INIS)

    Unione, A.; Teichmann, T.

    1985-01-01

    The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) which maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependencies) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs

  15. A trading-space-for-time approach to probabilistic continuous streamflow predictions in a changing climate – accounting for changing watershed behavior

    Directory of Open Access Journals (Sweden)

    R. Singh

    2011-11-01

    Projecting how future climatic change might impact streamflow is an important challenge for hydrologic science. The common approach to solve this problem is by forcing a hydrologic model, calibrated on historical data or using a priori parameter estimates, with future scenarios of precipitation and temperature. However, several recent studies suggest that the climatic regime of the calibration period is reflected in the resulting parameter estimates and model performance can be negatively impacted if the climate for which projections are made is significantly different from that during calibration. So how can we calibrate a hydrologic model for historically unobserved climatic conditions? To address this issue, we propose a new trading-space-for-time framework that utilizes the similarity between the predictions under change (PUC) and predictions in ungauged basins (PUB) problems. In this new framework we first regionalize climate-dependent streamflow characteristics using 394 US watersheds. We then assume that this spatial relationship between climate and streamflow characteristics is similar to the one we would observe between climate and streamflow over long time periods at a single location. This assumption is what we refer to as trading space for time. Therefore, we change the limits for extrapolation to future climatic situations from the restricted locally observed historical variability to the variability observed across all watersheds used to derive the regression relationships. A typical watershed model is subsequently calibrated (conditioned) on the predicted signatures for any future climate scenario to account for the impact of climate on model parameters within a Bayesian framework. As a result, we can obtain ensemble predictions of continuous streamflow at both gauged and ungauged locations. The new method is tested in five US watersheds located in historically different climates using synthetic climate scenarios generated by

  16. A probabilistic approach to assess antibiotic resistance development risks in environmental compartments and its application to an intensive aquaculture production scenario.

    Science.gov (United States)

    Rico, Andreu; Jacobs, Rianne; Van den Brink, Paul J; Tello, Alfredo

    2017-12-01

    Estimating antibiotic pollution and antibiotic resistance development risks in environmental compartments is important to design management strategies that advance our stewardship of antibiotics. In this study we propose a modelling approach to estimate the risk of antibiotic resistance development in environmental compartments and demonstrate its application in aquaculture production systems. We modelled exposure concentrations for 12 antibiotics used in Vietnamese Pangasius catfish production using the ERA-AQUA model. Minimum selective concentration (MSC) distributions that characterize the selective pressure of antibiotics on bacterial communities were derived from the European Committee on Antimicrobial Susceptibility Testing (EUCAST) Minimum Inhibitory Concentration dataset. The antibiotic resistance development risk (RDR) for each antibiotic was calculated as the probability that the antibiotic exposure distribution exceeds the MSC distribution representing the bacterial community. RDRs in pond sediments were nearly 100% for all antibiotics. Median RDR values in pond water were high for the majority of the antibiotics, with rifampicin, levofloxacin and ampicillin having highest values. In the effluent mixing area, RDRs were low for most antibiotics, with the exception of amoxicillin, ampicillin and trimethoprim, which presented moderate risks, and rifampicin and levofloxacin, which presented high risks. The RDR provides an efficient means to benchmark multiple antibiotics and treatment regimes in the initial phase of a risk assessment with regards to their potential to develop resistance in different environmental compartments, and can be used to derive resistance threshold concentrations. Copyright © 2017 Elsevier Ltd. All rights reserved.
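
    A small sketch of the RDR computation described above: the probability that a sampled exposure concentration exceeds a sampled minimum selective concentration. Both lognormal-style distributions are illustrative placeholders, not outputs of ERA-AQUA or the EUCAST dataset.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        n = 1_000_000

        # Illustrative placeholder distributions on a log10 scale (ug/L).
        exposure = rng.normal(-0.5, 0.6, n)   # predicted environmental exposure
        msc = rng.normal(0.0, 0.8, n)         # minimum selective concentration

        # Resistance development risk: probability that exposure exceeds MSC.
        print(f"RDR (Monte Carlo): {(exposure > msc).mean():.3f}")

        # With two normals on the log scale this has a closed form.
        rdr = 1 - norm.cdf(0, loc=-0.5 - 0.0, scale=np.hypot(0.6, 0.8))
        print(f"RDR (analytic):    {rdr:.3f}")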

  17. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery, and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided in this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and the Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking, and false-positive errors were lower in probabilistic than in deterministic tracking.
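
    A short sketch of the two overlap measures named above, computed between a tracked binary mask and a ground-truth mask; the random volumes are stand-ins for real tractography output.

        import numpy as np

        rng = np.random.default_rng(4)
        truth = rng.random((64, 64, 64)) < 0.05              # ground-truth mask
        tracked = truth ^ (rng.random(truth.shape) < 0.01)   # noisy depiction

        def dice(a, b):
            """Dice similarity coefficient: 2|A and B| / (|A| + |B|)."""
            return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def false_positive_error(pred, truth):
            """Fraction of the predicted mask lying outside the ground truth."""
            return np.logical_and(pred, ~truth).sum() / pred.sum()

        print(f"Dice: {dice(tracked, truth):.3f}")
        print(f"false-positive error: {false_positive_error(tracked, truth):.3f}")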

  18. A Probabilistic Approach for Analysis of Modeling Uncertainties in Quantification of Trading Ratios in Nonpoint to Point Source Nutrient Trading Programs

    Science.gov (United States)

    Tasdighi, A.; Arabi, M.

    2015-12-01

    Quantifying nonpoint source pollutant loads and assessing the water quality benefits of conservation practices (BMPs) are prone to different types of uncertainty, which have to be taken into account when developing nutrient trading programs. Although various types of modeling uncertainty (parameter, input and structure) have been examined to varying degrees in the literature, the impact of modeling uncertainties on the evaluation of BMPs has not been addressed sufficiently. Currently, "trading ratios" are used within nutrient trading programs to account for the variability of nonpoint source loads. However, we were not able to find any rigorous scientific approach that accounts for any type of uncertainty in trading ratios. In this study, Bayesian inference was applied to incorporate input, parameter and structural uncertainties using a statistically valid likelihood function. IPEAT (Integrated Parameter Estimation and Uncertainty Analysis Tool), a framework developed for simultaneous evaluation of parameterization, input data, model structure, and observation data uncertainty and their contribution to predictive uncertainty, was used to quantify the uncertainties in the effectiveness of agricultural BMPs while propagating different sources of uncertainty. SWAT was used as the simulation model. SWAT parameterization was done for three different model structures (SCS CN I, SCS CN II and G&A methods) using a Bayesian Markov Chain Monte Carlo (MCMC) method named Differential Evolution Adaptive Metropolis (DREAM). For each model structure, the Integrated Bayesian Uncertainty Estimator (IBUNE) was employed to generate latent variables from input data. Bayesian Model Averaging (BMA) was then used to combine the models, and the Expectation-Maximization (EM) optimization technique was used to estimate the BMA weights. Using this framework, the impact of different sources of uncertainty on nutrient loads from nonpoint sources and subsequently on the effectiveness of BMPs in
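
    A compact sketch of the EM step named above for estimating BMA weights over competing model predictions, assuming Gaussian predictive densities with a common spread (a standard simplification); the three synthetic "model structure" outputs merely stand in for the SCS CN I, SCS CN II and G&A SWAT runs.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(6)
        n = 500
        obs = rng.normal(10.0, 2.0, n)
        # Predictions from three hypothetical model structures.
        preds = np.stack([obs + rng.normal(0.0, 1.0, n),
                          obs + rng.normal(0.5, 1.5, n),
                          obs + rng.normal(-1.0, 3.0, n)])

        # EM for BMA: weights w_k and a common predictive std sigma.
        K = preds.shape[0]
        w = np.full(K, 1.0 / K)
        sigma = 2.0
        for _ in range(100):
            # E-step: responsibility of model k for each observation.
            dens = w[:, None] * norm.pdf(obs, loc=preds, scale=sigma)
            resp = dens / dens.sum(axis=0)
            # M-step: update weights and predictive spread.
            w = resp.mean(axis=1)
            sigma = np.sqrt((resp * (obs - preds) ** 2).sum() / n)

        print("BMA weights:", w.round(3), " sigma:", round(float(sigma), 3))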

  19. How does Poisson kriging compare to the popular BYM model for mapping disease risks?

    Directory of Open Access Journals (Sweden)

    Gebreab Samson

    2008-02-01

    Background: Geostatistical techniques are now available to account for spatially varying population sizes and spatial patterns in the mapping of disease rates. At first glance, Poisson kriging represents an attractive alternative to increasingly popular Bayesian spatial models in that: (1) it is easier to implement and less CPU intensive, and (2) it accounts for the size and shape of geographical units, avoiding the limitations of conditional auto-regressive (CAR) models commonly used in Bayesian algorithms while allowing for the creation of isopleth risk maps. Both approaches, however, have never been compared in simulation studies, and there is a need to better understand their merits in terms of accuracy and precision of disease risk estimates. Results: Besag, York and Mollie's (BYM) model and Poisson kriging (point and area-to-area implementations) were applied to age-adjusted lung and cervix cancer mortality rates recorded for white females in two contrasted county geographies: (1) the state of Indiana, which consists of 92 counties of fairly similar size and shape, and (2) four states in the Western US (Arizona, California, Nevada and Utah) forming a set of 118 counties that are vastly different geographical units. The spatial support (i.e. point versus area) has a much smaller impact on the results than the statistical methodology (i.e. geostatistical versus Bayesian models). Differences between methods are particularly pronounced in the Western US dataset: the BYM model yields a smoother risk surface and a prediction variance that changes mainly as a function of the predicted risk, while the Poisson kriging variance increases in large sparsely populated counties. Simulation studies showed that the geostatistical approach yields smaller prediction errors, more precise and accurate probability intervals, and allows a better discrimination between counties with high and low mortality risks. The benefit of area-to-area Poisson kriging increases as the county

  20. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
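
    A minimal sketch of the response-surface idea described above: fit a quadratic surrogate to a handful of expensive "FEA" evaluations, then Monte Carlo sample the cheap surrogate for a probabilistic stress estimate. The stress function, variable ranges, distributions, and allowable limit are invented stand-ins for the FEA model, not values from PRODAF.

        import numpy as np

        rng = np.random.default_rng(8)

        def fea_stress(chord, height):
            """Invented stand-in for an expensive FEA run (MPa)."""
            return 300.0 + 40.0 * chord - 25.0 * height + 6.0 * chord * height

        # Design of experiments: a small grid of "FEA" runs.
        c = np.linspace(0.9, 1.1, 5)
        h = np.linspace(1.8, 2.2, 5)
        C, H = [a.ravel() for a in np.meshgrid(c, h)]
        S = fea_stress(C, H)

        # Quadratic response surface fitted by least squares.
        A = np.column_stack([np.ones_like(C), C, H, C * H, C**2, H**2])
        coef, *_ = np.linalg.lstsq(A, S, rcond=None)

        # Monte Carlo on the surrogate instead of the FEA model.
        n = 200_000
        cs = rng.normal(1.0, 0.03, n)     # chord length variation
        hs = rng.normal(2.0, 0.05, n)     # blade height variation
        As = np.column_stack([np.ones(n), cs, hs, cs * hs, cs**2, hs**2])
        stress = As @ coef

        limit = 305.0  # illustrative allowable stress, MPa
        print(f"P(stress > {limit} MPa) = {(stress > limit).mean():.4f}")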