WorldWideScience

Sample records for poisson probabilistic approaches

  1. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    Science.gov (United States)

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates, spanning virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
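
    A property underlying this result: random thinning of a Poisson spike train (each spike transmitted independently with probability p, as with probabilistic synapses) is again Poisson, so the Fano factor stays near 1. Below is a minimal numerical sketch of this thinning argument, with assumed rate and release probability; it is illustrative only and not the authors' recurrent network model.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, p, T, trials = 50.0, 0.3, 1.0, 10_000   # Hz, release prob., window (s)

# Presynaptic spike counts are Poisson; each spike is transmitted
# independently with probability p (binomial thinning of a Poisson count).
pre = rng.poisson(rate * T, size=trials)
post = rng.binomial(pre, p)

fano = post.var() / post.mean()
print(f"mean={post.mean():.2f}, Fano factor={fano:.3f}")   # ~1: Poisson-like
```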

  2. POISSON, Analysis Solution of Poisson Problems in Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1986-01-01

    1 - Description of program or function: Purpose of program: Analytic treatment of the two-stage Poisson problem in Probabilistic Risk Assessment. Input: estimated a-priori mean failure rate and error factor of the system considered (for calculation of the stage-1 prior), number of failures and operating times for similar systems (for calculation of the stage-2 prior). Output: a-posteriori probability distributions on linear and logarithmic time scales (on a specified time grid) and expectation values of failure rates and error factors are calculated for: - stage-1 a-priori distribution, - stage-1 a-posteriori distribution, - stage-2 a-priori distribution, - stage-2 a-posteriori distribution. 2 - Method of solution: Bayesian approach with a conjugate stage-1 prior, improved with experience from similar systems to yield the stage-2 prior, and a likelihood function from experience with the system under study (documentation see below under 10.). 3 - Restrictions on the complexity of the problem: Up to 100 similar systems (including the system considered), arbitrary number of problems (failure types) with the same grid
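
    The conjugate machinery behind such an analysis is the gamma-Poisson update: with a Gamma(a, b) prior on the failure rate and n failures observed in total operating time T, the posterior is Gamma(a + n, b + T). A minimal single-stage sketch with hypothetical numbers follows; the program's actual two-stage scheme additionally folds in data from similar systems.

```python
from scipy import stats

# Gamma(a, b) prior on the failure rate (b is a rate parameter, units of time).
a, b = 0.5, 2000.0        # hypothetical prior: mean a/b = 2.5e-4 per hour
n, T = 3, 12000.0         # hypothetical data: 3 failures in 12000 hours

post = stats.gamma(a + n, scale=1.0 / (b + T))   # posterior Gamma(a+n, b+T)
print("posterior mean failure rate:", post.mean())
print("95% credible interval:", post.ppf([0.025, 0.975]))
```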

  3. Dependent Neyman type A processes based on common shock Poisson approach

    Science.gov (United States)

    Kadilar, Gamze Özel; Kadilar, Cem

    2016-04-01

    The Neyman type A process is used for describing clustered data, since the Poisson process is insufficient for clustering of events. In a multivariate setting, there may be dependencies between multivariate Neyman type A processes. In this study, the dependent form of the Neyman type A process is considered under the common shock approach, and the joint probability function is derived for dependent Neyman type A Poisson processes. An application to forest fires in Turkey is then given. The results show that the joint probability function of the dependent Neyman type A processes, which is obtained in this study, can be a good tool for probabilistic modeling of the total number of burned trees in Turkey.
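
    A Neyman type A variable is a Poisson-stopped sum of Poisson counts: the number of clusters M is Poisson(lambda) and each cluster contributes a Poisson(nu) count. The sketch below simulates the univariate case with assumed parameters; the paper's contribution, the dependent multivariate extension via common shocks, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, nu, n = 2.0, 5.0, 100_000    # cluster rate, mean cluster size, draws

m = rng.poisson(lam, size=n)      # number of clusters per observation
x = rng.poisson(nu * m)           # sum of m iid Poisson(nu) cluster sizes

# Neyman type A is overdispersed relative to Poisson: Var/mean = 1 + nu.
print("mean:", x.mean(), "Var/mean:", x.var() / x.mean())   # ~10 and ~6
```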

  4. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
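
    The link emphasized in such lecture notes, that memoryless (Markovian) arrivals have exponential waiting times and Poisson counts, is easy to verify numerically. A small sketch with an assumed arrival rate (not taken from the lecture materials):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rate, T, trials = 4.0, 1.0, 50_000   # arrivals per unit time, window length

# Exponential inter-arrival times; count arrivals falling within [0, T].
gaps = rng.exponential(1.0 / rate, size=(trials, 40))   # 40 gaps is ample here
counts = (np.cumsum(gaps, axis=1) < T).sum(axis=1)

# Empirical count frequencies match the Poisson(rate*T) pmf.
for k in range(6):
    print(k, round((counts == k).mean(), 4), round(stats.poisson.pmf(k, rate * T), 4))
```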

  5. Analytic Bayesian solution of the two-stage poisson-type problem in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Frohner, F.H.

    1985-01-01

    The basic purpose of probabilistic risk analysis is to make inferences about the probabilities of various postulated events, with an account of all relevant information such as prior knowledge and operating experience with the specific system under study, as well as experience with other similar systems. Estimation of the failure rate of a Poisson-type system leads to an especially simple Bayesian solution in closed form if the prior probability implied by the invariance properties of the problem is properly taken into account. This basic simplicity persists if a more realistic prior, representing order of magnitude knowledge of the rate parameter, is employed instead. Moreover, the more realistic prior allows direct incorporation of experience gained from other similar systems, without need to postulate a statistical model for an underlying ensemble. The analytic formalism is applied to actual nuclear reactor data.

  6. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.

  7. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using the limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements them in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and spare them the details of the underlying reliability algorithms. Readers, including non-specialists, will find useful information for determining project-specific statistics of geotechnical properties and for performing probabilistic analyses of slope stability.

  8. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  9. Probabilistic methods in exotic option pricing

    NARCIS (Netherlands)

    Anderluh, J.H.M.

    2007-01-01

    The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover, options on commodities are considered, as well as double-sided barrier options in a compound Poisson framework.

  10. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
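
    A toy sketch of the two-step idea with entirely hypothetical subpopulations, parameters, and weights: archetypal (here lognormal) body-weight distributions are fitted per demographic subset in step 1, then sampled according to a scenario-specific demographic mix in step 2.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Step 1: archetypal body-weight (kg) distributions per subpopulation
# (hypothetical fitted parameters standing in for the population analysis).
archetypes = {
    "adult_male":   stats.lognorm(s=0.18, scale=85.0),
    "adult_female": stats.lognorm(s=0.20, scale=72.0),
    "child":        stats.lognorm(s=0.25, scale=28.0),
}

# Step 2: scenario-specific demographic mix (hypothetical weights).
weights = {"adult_male": 0.4, "adult_female": 0.4, "child": 0.2}

groups = rng.choice(list(weights), p=list(weights.values()), size=10_000)
bw = np.array([archetypes[g].rvs(random_state=rng) for g in groups])
print(f"scenario BW input: mean {bw.mean():.1f} kg, p95 {np.percentile(bw, 95):.1f} kg")
```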

  11. Nambu–Poisson gauge theory

    Energy Technology Data Exchange (ETDEWEB)

    Jurčo, Branislav, E-mail: jurco@karlin.mff.cuni.cz [Charles University in Prague, Faculty of Mathematics and Physics, Mathematical Institute, Prague 186 75 (Czech Republic); Schupp, Peter, E-mail: p.schupp@jacobs-university.de [Jacobs University Bremen, 28759 Bremen (Germany); Vysoký, Jan, E-mail: vysokjan@fjfi.cvut.cz [Jacobs University Bremen, 28759 Bremen (Germany); Czech Technical University in Prague, Faculty of Nuclear Sciences and Physical Engineering, Prague 115 19 (Czech Republic)

    2014-06-02

    We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.

  12. Nambu–Poisson gauge theory

    International Nuclear Information System (INIS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-01-01

    We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.

  13. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  14. Test of Poisson Process for Earthquakes in and around Korea

    International Nuclear Information System (INIS)

    Noh, Myunghyun; Choi, Hoseon

    2015-01-01

    Since Cornell's work on probabilistic seismic hazard analysis (hereafter, PSHA), the majority of PSHA computer codes assume that earthquake occurrence is Poissonian. To the authors' knowledge, it is uncertain who first raised the issue of the Poisson process for earthquake occurrence. Systematic PSHA in Korea, led by the nuclear industry, has been carried out for more than 25 years under the assumption of the Poisson process. However, this assumption has never been tested, so such a test is of significance. We tested whether Korean earthquakes follow the Poisson process or not. The Chi-square test with a significance level of 5% was applied. The test showed that the Poisson process could not be rejected for earthquakes of magnitude 2.9 or larger. However, it was still observed in the graphical comparison that some portion of the observed distribution deviated significantly from the Poisson distribution. We think this is due to the small amount of earthquake data: earthquakes of magnitude 2.9 or larger occurred only 376 times during 34 years. Therefore, the judgment on the Poisson process derived in the present study is not conclusive.
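
    A sketch of such a goodness-of-fit test on hypothetical yearly counts (the study's own catalogue is not reproduced here): tabulate the frequency of k events per year, compare with Poisson expectations, and lose one chi-square degree of freedom for the estimated rate.

```python
import numpy as np
from scipy import stats

# Hypothetical counts of M >= 2.9 earthquakes per year over 34 years.
counts = np.array([8, 12, 11, 9, 14, 10, 13, 7, 12, 11, 9, 15, 10, 8, 13,
                   11, 12, 9, 10, 14, 11, 8, 12, 10, 13, 9, 11, 12, 10, 11,
                   9, 13, 10, 12])
lam = counts.mean()                      # estimated Poisson rate per year

kmax = 16
obs = np.bincount(counts, minlength=kmax + 1)[: kmax + 1].astype(float)
exp = stats.poisson.pmf(np.arange(kmax + 1), lam) * len(counts)
exp[-1] += stats.poisson.sf(kmax, lam) * len(counts)   # fold tail into last bin

# In practice sparse bins should be pooled before applying the test.
chi2, p = stats.chisquare(obs, exp, ddof=1)   # 1 df lost: lambda estimated
print(f"chi2={chi2:.2f}, p-value={p:.3f}")    # reject Poisson if p < 0.05
```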

  15. Poisson Coordinates.

    Science.gov (United States)

    Li, Xian-Ying; Hu, Shi-Min

    2013-02-01

    Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
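
    For reference, the Poisson integral formula on the unit disk, which the coordinates discretize (the standard formula, not reproduced from the paper): the harmonic extension u of boundary values f to an interior point z = r e^{i\theta}, r < 1, is

```latex
u\!\left(re^{i\theta}\right) \;=\; \frac{1}{2\pi}\int_{0}^{2\pi}
    \frac{1-r^{2}}{\,1-2r\cos(\theta-t)+r^{2}\,}\, f\!\left(e^{it}\right)\mathrm{d}t .
```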

  16. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    DEFF Research Database (Denmark)

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

    This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through numerical studies on PV data from Denmark.

  17. Understanding Poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
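
    A minimal sketch of the progression the article describes, fitting a Poisson GLM to simulated overdispersed counts and then a negative binomial alternative (statsmodels; the data and variable names are hypothetical, not the ENSPIRE data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)

# Overdispersed counts: a gamma-Poisson mixture (i.e., negative binomial).
mu = np.exp(0.5 + 0.8 * x)
y = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=n))

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()

# Pearson chi2 / residual df far above 1 flags overdispersion under Poisson.
print("dispersion:", poisson_fit.pearson_chi2 / poisson_fit.df_resid)
print("Poisson coefs:", poisson_fit.params, "NB coefs:", negbin_fit.params)
```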

  18. Poisson regression approach for modeling fatal injury rates amongst Malaysian workers

    International Nuclear Information System (INIS)

    Kamarulzaman Ibrahim; Heng Khai Theng

    2005-01-01

    Many safety studies are based on the analysis of injury surveillance data. The injury surveillance data gathered for the analysis include information on the number of employees at risk of injury in each of several strata, where the strata are defined in terms of a series of important predictor variables. Further insight into the relationship between fatal injury rates and predictor variables may be obtained by the Poisson regression approach. Poisson regression is widely used in analyzing count data. In this study, Poisson regression is used to model the relationship between fatal injury rates and predictor variables, which are year (1995-2002), gender, recording system and industry type. Data for the analysis were obtained from PERKESO and Jabatan Perangkaan Malaysia. It is found that the assumption that the data follow a Poisson distribution has been violated. After correction for the problem of overdispersion, the predictor variables found to be significant in the model are gender, system of recording, industry type, and two interaction effects (between recording system and industry type, and between year and industry type).

  19. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    International Nuclear Information System (INIS)

    Zolezzi, M.; Nicolella, C.; Tarazona, J.V.

    2005-01-01

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms with the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.

  20. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    2009-01-01

    In this article we consider geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregression. In the linear case, the conditional mean is linked linearly to its past values, as well as to the observed values of the Poisson process. This also applies to the conditional variance, making possible interpretation as an integer-valued generalized autoregressive conditional heteroscedasticity process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and past observations. As a particular example, we consider an exponential autoregressive Poisson model for time series. Under geometric ergodicity, the maximum likelihood estimators are shown to be asymptotically Gaussian in the linear model. In addition, we provide a consistent estimator of their asymptotic covariance matrix. Our approach to verifying geometric...
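
    A simulation sketch of the linear model described, an INGARCH(1,1)-type recursion with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(5)
d, a, b, n = 1.0, 0.3, 0.4, 1000   # intercept, feedback and observation weights

lam = np.empty(n)
y = np.empty(n, dtype=int)
lam[0] = d / (1 - a - b)           # start at the stationary mean
y[0] = rng.poisson(lam[0])

# Linear Poisson autoregression: lambda_t = d + a*lambda_{t-1} + b*y_{t-1},
# with y_t | past ~ Poisson(lambda_t); a + b < 1 keeps the chain ergodic.
for t in range(1, n):
    lam[t] = d + a * lam[t - 1] + b * y[t - 1]
    y[t] = rng.poisson(lam[t])

print("sample mean:", y.mean(), "theoretical mean:", d / (1 - a - b))
```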

  1. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  2. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10 healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...

  3. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto

    2017-05-16

    In this article, the first hitting times of generalized Poisson processes N^f(t), related to Bernstein functions f, are studied. For the space-fractional Poisson processes N^α(t), t > 0 (corresponding to f(x) = x^α), the hitting probabilities P{T_k^(α) < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(Σ_{j=1}^n H^{f_j}(t)) and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form N(G_{H,ν}(t)), where G_{H,ν}(t) are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.

  4. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
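
    The Poisson-Binomial distribution referred to is the distribution of the number of occurrences among independent events with unequal probabilities; its pmf can be built by sequential convolution. A sketch of how it could enter a reliability check (the forecast probabilities and observed count are hypothetical):

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """PMF of the number of occurrences among independent events with
    the given success probabilities, built by sequential convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

# Hypothetical issued probabilities for 10 flood events; 4 events occurred.
probs = [0.1, 0.3, 0.7, 0.2, 0.5, 0.4, 0.6, 0.2, 0.8, 0.3]
observed = 4

pmf = poisson_binomial_pmf(probs)
# Exact test: sum the probability of all outcomes no more likely than the
# observed one; a small p-value suggests the issued probabilities are
# unreliable.
p_value = pmf[pmf <= pmf[observed]].sum()
print(f"P(K={observed}) = {pmf[observed]:.3f}, p-value = {p_value:.3f}")
```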

  5. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    Science.gov (United States)

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
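
    A sketch of the approach being evaluated, modified Poisson regression for a binary outcome fitted with GEE to account for clustering, on simulated data (all names and parameter values are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_clusters, m = 100, 6                        # clusters, subjects per cluster
cluster = np.repeat(np.arange(n_clusters), m)
treat = np.repeat(rng.integers(0, 2, n_clusters), m)   # cluster-level exposure

# Binary outcome with true relative risk exp(0.4) and a cluster effect.
u = np.repeat(rng.normal(0.0, 0.2, n_clusters), m)
prob = np.clip(np.exp(-1.5 + 0.4 * treat + u), 0.0, 1.0)
y = rng.binomial(1, prob)

X = sm.add_constant(treat)
fit = sm.GEE(y, X, groups=cluster, family=sm.families.Poisson(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()   # robust SEs
print("estimated relative risk:", np.exp(fit.params[1]))      # ~exp(0.4)
```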

  6. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for a Poisson observable at a given confidence level, with inclusion of systematic uncertainties in the background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation
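
    BPULE itself is a FORTRAN code; a conceptually similar calculation can be sketched by numerically marginalizing the systematic uncertainties. The priors and numbers below are assumptions for illustration, not BPULE's actual defaults.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_obs = 5                                   # observed event count
s = np.linspace(0.0, 30.0, 601)             # signal grid, flat prior assumed

# Systematic uncertainties: assumed priors on efficiency and background.
eff = rng.normal(0.9, 0.1, 5000).clip(1e-3)
bkg = rng.gamma(4.0, 0.5, 5000)

# Marginal likelihood L(s) = E_{eff,bkg}[ Poisson(n_obs | eff*s + bkg) ].
like = stats.poisson.pmf(n_obs, eff[:, None] * s + bkg[:, None]).mean(axis=0)
ds = s[1] - s[0]
post = like / (like.sum() * ds)             # normalized posterior density

upper = s[np.searchsorted(np.cumsum(post) * ds, 0.90)]
print(f"90% credible upper limit on the signal: {upper:.2f}")
```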

  7. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids, together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight from the learned results, compared with the PILP models learned from non-probabilistic examples.

  8. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto; Orsingher, Enzo; Scavino, Marco

    2017-01-01

    P{T_k^(α) < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(Σ_{j=1}^n H^{f_j}(t)) and obtain probabilistic features...

  9. The probabilistic approach in the licensing process and the development of probabilistic risk assessment methodology in Japan

    International Nuclear Information System (INIS)

    Togo, Y.; Sato, K.

    1981-01-01

    The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing are based on the deterministic concept. However, there have been a few examples to which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles, and one may find other examples of such applications. A much more important role is now to be played by this concept in implementing the 52 recommendations drawn from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to the Japanese situation, a five-year programme plan has been adopted, to be conducted by the Japan Atomic Energy Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)

  10. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination.

  11. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno

    2013-02-26

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.

  12. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data.

    Science.gov (United States)

    Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G

    2013-02-26

    The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.
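
    Marginally, the Poisson-Gamma model is a negative binomial for per-window coverage. The toy sketch below fits that marginal by the method of moments to simulated coverage and flags windows in its extreme tails; all numbers are hypothetical and this is not the authors' full hierarchical procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Simulated per-window coverage: copy-neutral background with one
# amplified and one deleted region (overdispersed relative to Poisson).
cov = rng.negative_binomial(20, 20 / 120.0, size=1000)   # mean ~100
cov[300:320] *= 2          # amplification
cov[700:720] //= 4         # deletion

# Method-of-moments fit of the negative binomial marginal.
m, v = cov.mean(), cov.var()
p = m / v                  # valid when v > m (overdispersion)
r = m * p / (1.0 - p)
nb = stats.nbinom(r, p)

lo, hi = nb.ppf(0.005), nb.ppf(0.995)       # flag extreme-tail windows
print("flagged windows:", np.where((cov < lo) | (cov > hi))[0])
```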

  13. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G

    2013-01-01

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.

  14. A random probabilistic approach to seismic nuclear power plant analysis

    International Nuclear Information System (INIS)

    Romo, M.P.

    1985-01-01

    A probabilistic method for the seismic analysis of structures which takes into account the random nature of earthquakes and the uncertainties in the soil parameters is presented in this paper. The method was developed by combining elements of perturbation theory, random vibration theory and the complex response method. The probabilistic method is evaluated by comparing the responses of a single degree of freedom system computed with this approach and with the Monte Carlo method. (orig.)

  15. Application of probabilistic risk based optimization approaches in environmental restoration

    International Nuclear Information System (INIS)

    Goldammer, W.

    1995-01-01

    The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to consequences of low probability events such as earthquakes or major floods. Risks and financial costs are combined into an overall figure of detriment, allowing one to distinguish between the benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding reclamation planning, using an example from the German reclamation program for uranium mining and milling sites.

  16. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donations" and "number of blood deferrals": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to model the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimation of the model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
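
    A small simulation sketch of the dependence structure described: zero inflation (donors who never return) combined with a shared common-shock component that induces the positive correlation between the two counts. Parameters are hypothetical, and the MCMC fit itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(9)
n, psi = 20_000, 0.4        # sample size; P(donor never returns) = psi

# Common-shock construction: x1 = u1 + u0 and x2 = u2 + u0 share u0.
u0 = rng.poisson(1.0, n)
x1 = rng.poisson(2.0, n) + u0    # "number of blood donations"
x2 = rng.poisson(0.5, n) + u0    # "number of blood deferrals"

# Joint zero inflation: with probability psi both counts are zero.
z = rng.random(n) < psi
x1[z] = 0
x2[z] = 0

print("zero fraction:", (x1 == 0).mean(),
      "correlation:", np.corrcoef(x1, x2)[0, 1])   # positive, as in the data
```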

  17. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-07-01

    Purpose: The presented research results aim to improve the theoretical basics of computer vision and artificial intelligence for dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation, and computer-based simulation at the verification stage of development. Results: The proposed approach for object detection and recognition in video stream processing has shown several advantages in comparison with existing methods, owing to its simple realization and short data processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be a part of the artificial intelligence in navigation and control systems.

  18. On Partial Defaults in Portfolio Credit Risk : A Poisson Mixture Model Approach

    OpenAIRE

    Weißbach, Rafael; von Lieres und Wilkau, Carsten

    2005-01-01

    Most credit portfolio models exclusively calculate the loss distribution for a portfolio of performing counterparts. Conservative default definitions cause considerable uncertainty about the loss for a long time after the default. We present three approaches to account for defaulted counterparts in the calculation of the economic capital. Two of the approaches are based on the Poisson mixture model CreditRisk+ and derive a loss distribution for an integrated portfolio. The third method treats ...

  19. Operational intervention levels in a nuclear emergency, general concepts and a probabilistic approach

    International Nuclear Information System (INIS)

    Lauritzen, B.; Baeverstam, U.; Naadland Holo, E.; Sinkko, K.

    1997-12-01

    This report deals with Operational Intervention Levels (OILs) in a nuclear or radiation emergency. OILs are defined as the values of environmental measurements, in particular dose rate measurements, above which specific protective actions should be carried out in emergency exposure situations. The derivation and the application of OILs are discussed, and an overview of the presently adopted values is provided, with emphasis on the situation in the Nordic countries. A new, probabilistic approach to deriving OILs is presented, and the method is illustrated by calculating dose rate OILs in a simplified setting. Contrary to the standard approach, the probabilistic approach allows for optimization of OILs. It is argued that optimized OILs may be much larger than the presently adopted or suggested values. It is recommended that the probabilistic approach be further developed and employed in determining site-specific OILs and in optimizing environmental measuring strategies. (au)

  20. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistic methods to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch of gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  1. Background stratified Poisson regression analysis of cohort data.

    Science.gov (United States)

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
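
    The 'conditional' device described rests on a standard fact: within a stratum, Poisson counts conditioned on their total are multinomial, and the stratum intercept cancels from the cell probabilities. A minimal sketch with a log-linear dose effect on hypothetical data (this is not the Life Span Study analysis):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(10)

# Hypothetical cohort: 50 background strata x 4 dose groups.
S, beta_true = 50, 0.5
dose = np.tile([0.0, 0.5, 1.0, 2.0], (S, 1))
pt = rng.uniform(500.0, 2000.0, size=dose.shape)        # person-time
alpha = rng.normal(np.log(1e-3), 0.5, size=(S, 1))      # stratum nuisance rates
events = rng.poisson(pt * np.exp(alpha + beta_true * dose))

def neg_cond_loglik(beta):
    # Within each stratum, counts given their total are multinomial with
    # probabilities pt*exp(beta*dose) / row sum: the alphas cancel out.
    w = pt * np.exp(beta * dose)
    logp = np.log(w) - np.log(w.sum(axis=1, keepdims=True))
    return -(events * logp).sum()

fit = minimize_scalar(neg_cond_loglik, bounds=(-2.0, 2.0), method="bounded")
print("conditional MLE of beta:", fit.x)   # ~0.5, no stratum coefficients fit
```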

  2. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures

  3. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data, where single estimates for the moments can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has generally been considered an NP-hard problem, because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms scale polynomially with an increasing number of intervals. Using the bounds on moments computed with the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of an empirical p-box in the literature. Several sets of interval data with different numbers of intervals and types of overlap are presented to demonstrate the proposed methods. In contrast to the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.

  4. A sampling-based approach to probabilistic pursuit evasion

    KAUST Repository

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.

  5. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    2003-11-01

    The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations have introduced a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10^-6 for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and more generally as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment

  6. On poisson-stopped-sums that are mixed poisson

    OpenAIRE

    Valero Baya, Jordi; Pérez Casany, Marta; Ginebra Molins, Josep

    2013-01-01

    Maceda (1948) characterized the mixed Poisson distributions that are Poisson-stopped-sum distributions in terms of the mixing distribution. In an alternative characterization of the same set of distributions, the Poisson-stopped-sum distributions that are mixed Poisson distributions are here proved to be the Poisson-stopped-sums of either a mixture of zero-truncated Poisson distributions or a zero-modification of such a mixture.

  7. Background stratified Poisson regression analysis of cohort data

    International Nuclear Information System (INIS)

    Richardson, David B.; Langholz, Bryan

    2012-01-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)

  8. A probabilistic approach for RIA fuel failure criteria

    International Nuclear Information System (INIS)

    Vitanza, Carlo

    2008-01-01

    Substantial experimental data have been produced in support of the definition of RIA safety limits for water reactor fuels at high burnup. Based on these data, fuel failure enthalpy limits can be derived using methods of varying complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non-failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper, where in addition to a best estimate evaluation of the failure enthalpy, a RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non-failure data, and account for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will increase sharply within a narrow band around the best estimate value. For deterministic predictions of lower quality, the resulting probability distribution will instead be broad and coarse.

  10. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  11. Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds

    Science.gov (United States)

    Martínez-Torres, David; Miranda, Eva

    2018-01-01

    We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.

  12. A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.

    Science.gov (United States)

    Tatsuoka, Kikumi K.

    A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…

  13. Polynomial Poisson algebras: Gel'fand-Kirillov problem and Poisson spectra

    OpenAIRE

    Lecoutre, César

    2014-01-01

    We study the fields of fractions and the Poisson spectra of polynomial Poisson algebras. First we investigate a Poisson birational equivalence problem for polynomial Poisson algebras over a field of arbitrary characteristic. Namely, the quadratic Poisson Gel'fand-Kirillov problem asks whether the field of fractions of a Poisson algebra is isomorphic to the field of fractions of a Poisson affine space, i.e. a polynomial algebra such that the Poisson bracket of two generators is equal to...

  14. PROBABILISTIC APPROACH OF STABILIZED ELECTROMAGNETIC FIELD EFFECTS

    Directory of Open Access Journals (Sweden)

    FELEA, I.

    2017-09-01

    The effects of the omnipresence of the electromagnetic field are certain and recognized. Assessing these effects as accurately as possible, since they characterize random phenomena, requires the use of statistical-probabilistic calculation. This paper aims at assessing the probability of exceeding the admissible values of the characteristic quantities of the electromagnetic field: magnetic induction and electric field strength. The first part justifies the need for concern and specifies how to approach it. The mathematical model of approach and treatment is presented in the second part of the paper, and the results obtained with reference to 14 power stations are synthesized in the third part. The conclusions of the evaluations are formulated in the last part.

  15. Information fusion in signal and image processing: major probabilistic and non-probabilistic numerical approaches

    CERN Document Server

    Bloch, Isabelle

    2010-01-01

    The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this book offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory and belief functions).

  16. A test of inflated zeros for Poisson regression models.

    Science.gov (United States)

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require fitting a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
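
    The flavor of testing for inflated zeros without fitting a zero-inflated alternative can be conveyed with a parametric bootstrap under the fitted Poisson model, comparing the observed number of zeros with its null distribution. This Python sketch is a generic illustration, not the specific test statistic developed in the paper.

    # Parametric bootstrap check for excess zeros under a Poisson fit.
    # A fuller version would re-estimate lambda in each bootstrap
    # replicate; data are simulated with 20% injected extra zeros.
    import numpy as np

    rng = np.random.default_rng(2)
    y = rng.poisson(1.5, 300)
    y[rng.random(300) < 0.2] = 0          # inject extra zeros

    lam_hat = y.mean()                    # Poisson MLE
    obs_zeros = np.sum(y == 0)

    boot = np.array([np.sum(rng.poisson(lam_hat, y.size) == 0) for _ in range(5000)])
    p_value = np.mean(boot >= obs_zeros)
    print(f"observed zeros: {obs_zeros}, bootstrap p-value for zero inflation: {p_value:.4f}")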

  17. Network Traffic Monitoring Using Poisson Dynamic Linear Models

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-05-09

    In this article, we discuss an approach for network forensics using a class of nonstationary Poisson processes with embedded dynamic linear models. As a modeling strategy, the Poisson DLM (PoDLM) provides a very flexible framework for specifying structured effects that may influence the evolution of the underlying Poisson rate parameter, including diurnal and weekly usage patterns. We develop a novel particle learning algorithm for online smoothing and prediction for the PoDLM, and demonstrate the suitability of the approach to real-time deployment settings via a new application to computer network traffic monitoring.
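
    A toy version of such a model is easy to simulate: let the latent log-rate follow a random walk plus a daily harmonic, observe Poisson counts, and track the rate online with a bootstrap particle filter. This Python sketch is a generic state-space illustration, not the particle learning algorithm of the paper; the time resolution and noise scales are made up.

    # Poisson counts driven by a latent log-rate (random walk + daily
    # harmonic), filtered online with a bootstrap particle filter.
    import numpy as np

    rng = np.random.default_rng(3)
    T, n_part = 288, 2000                 # e.g. 5-minute bins over one day
    t = np.arange(T)
    diurnal = 0.8 * np.sin(2 * np.pi * t / T)
    log_rate = 2.0 + np.cumsum(rng.normal(0, 0.05, T)) + diurnal
    y = rng.poisson(np.exp(log_rate))

    particles = np.full(n_part, 2.0)      # initial log-rate particles
    est = np.empty(T)
    for k in range(T):
        particles += rng.normal(0, 0.05, n_part)   # propagate the random walk
        lam = np.exp(particles + diurnal[k])
        logw = y[k] * np.log(lam) - lam            # Poisson log-likelihood
        w = np.exp(logw - logw.max()); w /= w.sum()
        particles = particles[rng.choice(n_part, n_part, p=w)]  # resample
        est[k] = np.exp(particles + diurnal[k]).mean()

    print("last observed count:", y[-1], " filtered rate:", round(est[-1], 1))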

  18. A probabilistic approach to Radiological Environmental Impact Assessment

    International Nuclear Information System (INIS)

    Avila, Rodolfo; Larsson, Carl-Magnus

    2001-01-01

    Since a radiological environmental impact assessment typically relies on limited data and poorly based extrapolation methods, point estimations, as implied by a deterministic approach, do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks to non-human biota of a radioactive contamination. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined

  19. Poisson-Hopf limit of quantum algebras

    International Nuclear Information System (INIS)

    Ballesteros, A; Celeghini, E; Olmo, M A del

    2009-01-01

    The Poisson-Hopf analogue of an arbitrary quantum algebra U_z(g) is constructed by introducing a one-parameter family of quantizations U_{z,ℎ}(g) depending explicitly on ℎ and by taking the appropriate ℎ → 0 limit. The q-Poisson analogues of the su(2) algebra are discussed and the novel su_q^P(3) case is introduced. The q-Serre relations are also extended to the Poisson limit. This approach opens the perspective for possible applications of higher-rank q-deformed Hopf algebras in semiclassical contexts.

  20. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  1. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using a maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  2. Analyzing hospitalization data: potential limitations of Poisson regression.

    Science.gov (United States)

    Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R

    2015-08-01

    Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data.
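
    The comparison described above can be reproduced in miniature with statsmodels: simulate overdispersed counts through a subject-level frailty and compare Poisson and negative binomial fits. The simulated data, the 0/1 treatment coding, and the use of AIC below are illustrative assumptions, not the paper's data or its four selection criteria.

    # Overdispersed counts via a gamma frailty; Poisson vs. negative
    # binomial fits compared by AIC.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 313
    hd = rng.integers(0, 2, n)                   # 1 = HD, 0 = PD (assumed coding)
    frailty = rng.gamma(0.5, 2.0, n)             # subject-level heterogeneity
    y = rng.poisson(frailty * np.exp(0.1 * hd))  # marginally overdispersed

    X = sm.add_constant(hd.astype(float))
    poisson_fit = sm.Poisson(y, X).fit(disp=False)
    nb_fit = sm.NegativeBinomial(y, X).fit(disp=False)

    print(f"Poisson AIC: {poisson_fit.aic:.1f}   NB AIC: {nb_fit.aic:.1f}")
    print(f"NB rate ratio (treatment): {np.exp(nb_fit.params[1]):.2f}")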

  3. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    Science.gov (United States)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  4. Comparison of Poisson structures and Poisson-Lie dynamical r-matrices

    OpenAIRE

    Enriquez, B.; Etingof, P.; Marshall, I.

    2004-01-01

    We construct a Poisson isomorphism between the formal Poisson manifolds g^* and G^*, where g is a finite dimensional quasitriangular Lie bialgebra. Here g^* is equipped with its Lie-Poisson (or Kostant-Kirillov-Souriau) structure, and G^* with its Poisson-Lie structure. We also quantize Poisson-Lie dynamical r-matrices of Balog-Feher-Palla.

  5. Probabilistic molecular dynamics evaluation of the stress-strain behavior of polyethylene

    International Nuclear Information System (INIS)

    Stowe, J.Q.; Predecki, P.K.; Laz, P.J.; Burks, B.M.; Kumosa, M.

    2009-01-01

    The primary goal of this study was to utilize molecular dynamics to predict the mechanical behavior of polyethylene. In particular, stress-strain relationships, the Young's modulus and the Poisson ratio were predicted for low-density polyethylene at several molecular weights and polymer configurations, with the number of united CH2 atoms ranging between 500 and 5000. Probabilistic Monte Carlo methods were also used to identify the extent of uncertainty in the mechanical property predictions. In general, asymptotic behavior was observed for the stress and the Young's modulus as the molecular weight of the models increased. At the same time, significant variability, of the order of 1000% of the mean, in the stress-strain relationships and the Young's modulus predictions was observed, especially for low molecular weight models. The variability in the Young's modulus predictions ranged from 17.9 to 3.2 GPa for the models ranging from 100 to 5000 CH2 atoms. However, it was also found that the mean value of the Young's modulus approached a physically possible value of 194 MPa for the 5000 atom model. Poisson ratio predictions also resulted in significant variability, from 200% to 425% of the mean, and ranged from 0.75 to 1.30. The mean value of the Poisson ratios calculated in this study ranged from 0.32 to 0.44 for the 100 to 5000 atom models, respectively.

  6. On Poisson functions

    OpenAIRE

    Terashima, Yuji

    2008-01-01

    In this paper, defining Poisson functions on super manifolds, we show that the graphs of Poisson functions are Dirac structures, and find Poisson functions which include as special cases both quasi-Poisson structures and twisted Poisson structures.

  7. An approach for obtaining integrable Hamiltonians from Poisson-commuting polynomial families

    Science.gov (United States)

    Leyvraz, F.

    2017-07-01

    We discuss a general approach permitting the identification of a broad class of sets of Poisson-commuting Hamiltonians, which are integrable in the sense of Liouville. It is shown that all such Hamiltonians can be solved explicitly by a separation of variables ansatz. The method leads in particular to a proof that the so-called "goldfish" Hamiltonian is maximally superintegrable and leads to an elementary identification of a full set of integrals of motion. The Hamiltonians in involution with the "goldfish" Hamiltonian are also explicitly integrated. New integrable Hamiltonians are identified, among which some have the property of being isochronous, that is, all their orbits have the same period. Finally, a peculiar structure is identified in the Poisson brackets between the elementary symmetric functions and the set of Hamiltonians commuting with the "goldfish" Hamiltonian: these can be expressed as products between elementary symmetric functions and Hamiltonians. The structure displays an invariance property with respect to one element and has both a symmetry and a closure property. The meaning of this structure is not altogether clear to the author, but it turns out to be a powerful tool.

  8. The Fractional Poisson Process and the Inverse Stable Subordinator

    OpenAIRE

    Meerschaert, Mark; Nane, Erkan; Vellaisamy, P.

    2011-01-01

    The fractional Poisson process is a renewal process with Mittag-Leffler waiting times. Its distributions solve a time-fractional analogue of the Kolmogorov forward equation for a Poisson process. This paper shows that a traditional Poisson process, with the time variable replaced by an independent inverse stable subordinator, is also a fractional Poisson process. This result unifies the two main approaches in the stochastic theory of time-fractional diffusion equations. The equivalence extend...

  9. Universal Generating Function Based Probabilistic Production Simulation Approach Considering Wind Speed Correlation

    Directory of Open Access Journals (Sweden)

    Yan Li

    2017-11-01

    Due to the volatile and correlated nature of wind speed, a high share of wind power penetration poses challenges to power system production simulation. Existing power system probabilistic production simulation approaches fall short of considering the time-varying characteristics of wind power and load, as well as the correlation between wind speeds at the same time, which causes problems in planning and analysis for power systems with high wind power penetration. Based on the universal generating function (UGF), this paper proposes a novel probabilistic production simulation approach considering wind speed correlation. UGF is utilized to develop chronological models of wind power that characterize wind speed correlation, as well as chronological models of conventional generation sources and load. The supply and demand are matched chronologically to obtain not only generation schedules but also reliability indices, both at each simulation interval and over the whole period. The proposed approach has been tested on the improved IEEE-RTS 79 test system and is compared with the Monte Carlo approach and the sequence operation theory approach. The results verify the proposed approach and its merits of computational simplicity and accuracy.
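
    The core UGF operation is easy to state: each unit's UGF is a polynomial in z whose exponents are capacity states and whose coefficients are probabilities, and the system UGF is the product of the unit UGFs. The Python sketch below (unit capacities and availabilities are made up, and the chronological matching described in the paper is omitted) builds the system capacity distribution this way and reads off a loss-of-load probability.

    # Universal generating function for generation adequacy: represent
    # each unit as {capacity_state: probability} and multiply.
    from collections import defaultdict

    def ugf_product(u1, u2):
        # product of two UGFs: exponents add, coefficients multiply
        out = defaultdict(float)
        for c1, p1 in u1.items():
            for c2, p2 in u2.items():
                out[c1 + c2] += p1 * p2
        return dict(out)

    units = [(100, 0.95), (100, 0.95), (50, 0.90)]   # (capacity in MW, availability)
    system = {0: 1.0}
    for cap, a in units:
        system = ugf_product(system, {cap: a, 0: 1 - a})

    load = 150
    lolp = sum(p for c, p in system.items() if c < load)
    print("capacity distribution:", system)
    print(f"loss-of-load probability at {load} MW demand: {lolp:.4f}")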

  10. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie

  11. Formal equivalence of Poisson structures around Poisson submanifolds

    NARCIS (Netherlands)

    Marcut, I.T.

    2012-01-01

    Let (M,π) be a Poisson manifold. A Poisson submanifold P ⊂ M gives rise to a Lie algebroid AP → P. Formal deformations of π around P are controlled by certain cohomology groups associated to AP. Assuming that these groups vanish, we prove that π is formally rigid around P; that is, any other Poisson

  12. Probabilistic model for sterilization of food

    International Nuclear Information System (INIS)

    Chepurko, V.V.; Malinovskij, O.V.

    1986-01-01

    A probabilistic model for radiation sterilization is proposed based on the following suppositions: (1) the initial contamination of a volume unit of the sterilized product, m, is described by the probability distribution q(m); (2) the inactivation of a population of m microorganisms is approximated by a Bernoulli trial scheme; and (3) the contamination levels of different units of the sterilized product are independent. The possibility of approximating q(m) by a Poisson distribution is demonstrated. Diagrams are presented that permit evaluation of the dose which provides a specified reliability of sterilization of food for chicken-gnotobionts.
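
    Under the stated suppositions the arithmetic is short: if the initial count per unit is Poisson(λ) and each organism independently survives the dose with probability p (Bernoulli thinning), the surviving count is Poisson(λp), so the probability that a unit is sterile is exp(−λp). The contamination level and the D10 value in this Python sketch are assumed purely for illustration.

    # Sterility probability from Poisson contamination thinned by a
    # per-organism survival probability p = 10**(-dose / D10).
    import numpy as np

    lam = 1e4          # mean initial organisms per unit (assumed)
    D10 = 2.0          # dose (kGy) per decade of inactivation (assumed)

    def sterility_probability(dose_kGy):
        p_survive = 10.0 ** (-dose_kGy / D10)
        return np.exp(-lam * p_survive)

    for dose in (8, 12, 16, 20):
        print(f"dose {dose:2d} kGy -> P(sterile unit) = {sterility_probability(dose):.6f}")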

  13. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    Science.gov (United States)

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.

  14. (Quasi-)Poisson enveloping algebras

    OpenAIRE

    Yang, Yan-Hong; Yao, Yuan; Ye, Yu

    2010-01-01

    We introduce the quasi-Poisson enveloping algebra and Poisson enveloping algebra for a non-commutative Poisson algebra. We prove that for a non-commutative Poisson algebra, the category of quasi-Poisson modules is equivalent to the category of left modules over its quasi-Poisson enveloping algebra, and the category of Poisson modules is equivalent to the category of left modules over its Poisson enveloping algebra.

  15. Dietary Exposure Assessment of Danish Consumers to Dithiocarbamate Residues in Food: a Comparison of the Deterministic and Probabilistic Approach

    DEFF Research Database (Denmark)

    Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette

    2008-01-01

    Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish population to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using the probabilistic approach, the daily acute intakes at the 99.9th percentile for adults and children were 11.2 and 28.2 μg kg⁻¹ body weight day⁻¹, representing 5.6% and 14.1% of the ARfD for maneb, respectively. When comparing the point estimate approach with the probabilistic approach, the outcome...

  16. Estimation of Poisson noise in spatial domain

    Science.gov (United States)

    Švihlík, Jan; Fliegel, Karel; Vítek, Stanislav; Kukal, Jaromír.; Krbcová, Zuzana

    2017-09-01

    This paper deals with modeling of astronomical images in the spatial domain. We consider astronomical light images contaminated by dark current, which is modeled by a Poisson random process. The dark frame image maps the thermally generated charge of the CCD sensor. In this paper, we solve the problem of the addition of two Poisson random variables. At first, a noise analysis of images obtained from the astronomical camera is performed. It allows estimating the parameters of the Poisson probability mass functions in every pixel of the acquired dark frame. Then the resulting distributions of the light image can be found. If the distributions of the light image pixels are identified, then a denoising algorithm can be applied. The performance of the Bayesian approach in the spatial domain is compared with the direct approach based on the method of moments and dark frame subtraction.
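
    The additive-Poisson setting can be sketched numerically: the recorded light frame is signal plus dark current, each Poisson, so their sum is again Poisson with summed rates; per-pixel dark rates are estimated from a stack of dark frames by the method of moments (the sample mean) and subtracted in expectation. The image shapes and rates in this Python sketch are illustrative, not the paper's data.

    # Estimate per-pixel Poisson dark rates from a dark-frame stack,
    # then correct a light frame in expectation.
    import numpy as np

    rng = np.random.default_rng(5)
    h, w, n_dark = 64, 64, 50
    dark_rate = rng.uniform(0.5, 3.0, (h, w))           # thermal charge per pixel
    signal_rate = np.zeros((h, w)); signal_rate[20:40, 20:40] = 10.0

    dark_stack = rng.poisson(dark_rate, (n_dark, h, w))
    light_frame = rng.poisson(signal_rate + dark_rate)  # sum of Poissons is Poisson

    dark_hat = dark_stack.mean(axis=0)                  # method-of-moments estimate
    signal_hat = light_frame - dark_hat                 # unbiased in expectation
    print("mean estimated signal in object region:",
          round(float(signal_hat[20:40, 20:40].mean()), 2))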

  17. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional variance, implying an interpretation as an integer-valued GARCH process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and a nonlinear function of past observations. As a particular example an exponential autoregressive Poisson model for time...

  19. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  20. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    Science.gov (United States)

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

    Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
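
    The mixture construction is easy to verify by simulation: a Poisson count whose mean is gamma-distributed across subjects is marginally negative binomial, with r equal to the gamma shape and p = 1/(1 + scale). The parameters in this Python sketch are arbitrary.

    # Gamma-mixed Poisson counts match the negative binomial pmf.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    shape, scale, n = 2.0, 3.0, 200_000
    lam = rng.gamma(shape, scale, n)      # random subject-level rates
    y = rng.poisson(lam)                  # gamma-mixed Poisson counts

    r, p = shape, 1.0 / (1.0 + scale)     # negative binomial parameters
    print("empirical :", [round(float(np.mean(y == k)), 4) for k in range(6)])
    print("neg. binom:", [round(float(stats.nbinom.pmf(k, r, p)), 4) for k in range(6)])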

  1. Use of adjoint methods in the probabilistic finite element approach to fracture mechanics

    Science.gov (United States)

    Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted

    1988-01-01

    The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.

  2. Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures

    International Nuclear Information System (INIS)

    Caracciolo, Paola

    2014-01-01

    In this work a reliability-based approach to the design and analysis of composite structures is developed and its feasibility demonstrated. As a metric, robust and reliability-based designs are compared with the traditional design to demonstrate the gain that can be achieved with a probabilistic approach. The use of a stochastic treatment of the uncertain parameters in combination with multi-scale analysis is the main objective of this paper. The work analyzes the uncertainties in the design, tests, and manufacturing process, and in key gates such as material characteristics.

  4. Efficient maximal Poisson-disk sampling and remeshing on surfaces

    KAUST Repository

    Guo, Jianwei; Yan, Dongming; Jia, Xiaohong; Zhang, Xiaopeng

    2015-01-01

    Poisson-disk sampling is one of the fundamental research problems in computer graphics that has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves the state-of-the-art approach in efficiency, quality and the memory consumption.
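
    For intuition, the planar version of the problem can be written in a few lines as dart throwing with a background grid for conflict checking; a cell size of r/√2 guarantees at most one accepted sample per cell. This Python sketch does not reproduce the paper's subdivided-mesh machinery or its maximality guarantee.

    # 2D Poisson-disk sampling in the unit square by dart throwing,
    # with a uniform grid to restrict distance checks to nearby cells.
    import numpy as np

    rng = np.random.default_rng(7)
    r, max_tries = 0.05, 20_000
    cell = r / np.sqrt(2)                 # at most one sample per cell
    ncell = int(np.ceil(1.0 / cell))
    grid = -np.ones((ncell, ncell), dtype=int)
    samples = []

    for _ in range(max_tries):
        p = rng.random(2)
        gx, gy = int(p[0] / cell), int(p[1] / cell)
        ok = True
        for i in range(max(gx - 2, 0), min(gx + 3, ncell)):
            for j in range(max(gy - 2, 0), min(gy + 3, ncell)):
                k = grid[i, j]
                if k >= 0 and np.hypot(*(samples[k] - p)) < r:
                    ok = False
        if ok:
            grid[gx, gy] = len(samples)
            samples.append(p)

    print(f"accepted {len(samples)} samples at radius {r}")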

  6. Selective Contrast Adjustment by Poisson Equation

    Directory of Open Access Journals (Sweden)

    Ana-Belen Petro

    2013-09-01

    Poisson Image Editing is a technique that permits modifying the gradient vector field of an image and then recovering an image whose gradient approximates this modified gradient field. This amounts to solving a Poisson equation, an operation which can be performed efficiently by the Fast Fourier Transform (FFT). This paper describes an algorithm applying this technique, with two different variants. The first variant enhances the contrast by increasing the gradient in the dark regions of the image. This method is well adapted to images with backlight or strong shadows, and reveals details in the shadows. The second variant of the same Poisson technique enhances all small gradients in the image, thus also sometimes revealing details and texture.
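
    The mechanics of the second variant can be sketched as follows: amplify small image gradients, form the divergence of the modified field, and recover the image by solving the Poisson equation in Fourier space. Periodic boundaries and the particular boost function in this Python sketch are simplifying assumptions, not the paper's exact algorithm.

    # Gradient-domain contrast enhancement: boost small gradients, then
    # solve laplacian(u) = div(g) with the FFT (periodic boundaries).
    import numpy as np

    def poisson_solve_periodic(f):
        h, w = f.shape
        wx = 2 * np.cos(2 * np.pi * np.arange(w) / w) - 2
        wy = 2 * np.cos(2 * np.pi * np.arange(h) / h) - 2
        denom = wy[:, None] + wx[None, :]
        denom[0, 0] = 1.0                 # the mean is unconstrained
        u_hat = np.fft.fft2(f) / denom
        u_hat[0, 0] = 0.0
        return np.real(np.fft.ifft2(u_hat))

    rng = np.random.default_rng(8)
    img = np.clip(np.add.outer(np.linspace(0, 1, 128), np.linspace(0, 1, 128)) / 2
                  + 0.05 * rng.random((128, 128)), 0, 1)

    gx = np.roll(img, -1, axis=1) - img   # forward differences
    gy = np.roll(img, -1, axis=0) - img
    boost = 1.0 + 2.0 * np.exp(-(gx**2 + gy**2) / 0.01)   # amplify small gradients
    gx, gy = gx * boost, gy * boost
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))

    out = poisson_solve_periodic(div)
    out = (out - out.min()) / (out.max() - out.min())     # rescale to [0, 1]
    print("enhanced image range:", float(out.min()), float(out.max()))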

  7. Topological Poisson Sigma models on Poisson-Lie groups

    International Nuclear Information System (INIS)

    Calvo, Ivan; Falceto, Fernando; Garcia-Alvarez, David

    2003-01-01

    We solve the topological Poisson Sigma model for a Poisson-Lie group G and its dual G*. We show that the gauge symmetry for each model is given by its dual group that acts by dressing transformations on the target. The resolution of both models in the open geometry reveals that there exists a map from the reduced phase space of each model (P and P*) to the main symplectic leaf of the Heisenberg double (D_0) such that the symplectic forms on P, P* are obtained as the pull-back by those maps of the symplectic structure on D_0. This uncovers a duality between P and P* under the exchange of bulk degrees of freedom of one model with boundary degrees of freedom of the other one. We finally solve the Poisson Sigma model for the Poisson structure on G given by a pair of r-matrices that generalizes the Poisson-Lie case. The Hamiltonian analysis of the theory requires the introduction of a deformation of the Heisenberg double. (author)

  8. Probabilistic interpretation of data: a physicist's approach

    CERN Document Server

    Miller, Guthrie

    2013-01-01

    This book is a physicist's approach to the interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).

  9. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass-stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and the semi-probabilistic approach for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes are established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even though the latter is certainly very site-sensitive.

  10. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.

  11. Poisson distribution

    NARCIS (Netherlands)

    Hallin, M.; Piegorsch, W.; El Shaarawi, A.

    2012-01-01

    The random variable X taking values 0, 1, 2, …, x, … with probabilities p_λ(x) = e^{-λ} λ^x / x!, where λ ∈ ℝ₀⁺, is called a Poisson variable, and its distribution a Poisson distribution, with parameter λ. The Poisson distribution with parameter λ can be obtained as the limit, as n → ∞ and p → 0 in such a way that np → λ, of the binomial distribution with parameters n and p.
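
    The limit statement is easy to check numerically; a minimal Python sketch, assuming nothing beyond scipy's standard distributions:

    # Total variation distance between Binomial(n, lambda/n) and
    # Poisson(lambda) shrinks as n grows.
    from scipy import stats

    lam = 3.0
    for n in (10, 100, 10_000):
        p = lam / n
        tv = 0.5 * sum(abs(stats.binom.pmf(k, n, p) - stats.poisson.pmf(k, lam))
                       for k in range(200))
        print(f"n = {n:>6}: total variation distance ~ {tv:.5f}")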

  12. A probabilistic approach to quantifying spatial patterns of flow regimes and network-scale connectivity

    Science.gov (United States)

    Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca

    2017-04-01

    The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams with notable implication for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network-scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson Process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of

  13. POISSON SUPERFISH, Poisson Equation Solver for Radio Frequency Cavity

    International Nuclear Information System (INIS)

    Colman, J.

    2001-01-01

    1 - Description of program or function: POISSON, SUPERFISH is a group of (1) codes that solve Poisson's equation and are used to compute field quality for both magnets and fixed electric potentials and (2) RF cavity codes that calculate resonant frequencies and field distributions of the fundamental and higher modes. The group includes: POISSON, PANDIRA, SUPERFISH, AUTOMESH, LATTICE, FORCE, MIRT, PAN-T, TEKPLOT, SF01, and SHY. POISSON solves Poisson's (or Laplace's) equation for the vector (scalar) potential with nonlinear isotropic iron (dielectric) and electric current (charge) distributions for two-dimensional Cartesian or three-dimensional cylindrical symmetry. It calculates the derivatives of the potential, the stored energy, and performs harmonic (multipole) analysis of the potential. PANDIRA is similar to POISSON except it allows anisotropic and permanent magnet materials and uses a different numerical method to obtain the potential. SUPERFISH solves for the accelerating (TM) and deflecting (TE) resonant frequencies and field distributions in an RF cavity with two-dimensional Cartesian or three-dimensional cylindrical symmetry. Only the azimuthally symmetric modes are found for cylindrically symmetric cavities. AUTOMESH prepares input for LATTICE from geometrical data describing the problem, (i.e., it constructs the 'logical' mesh and generates (x,y) coordinate data for straight lines, arcs of circles, and segments of hyperbolas). LATTICE generates an irregular triangular (physical) mesh from the input data, calculates the 'point current' terms at each mesh point in regions with distributed current density, and sets up the mesh point relaxation order needed to write the binary problem file for the equation-solving POISSON, PANDIRA, or SUPERFISH. FORCE calculates forces and torques on coils and iron regions from POISSON or PANDIRA solutions for the potential. MIRT optimizes magnet profiles, coil shapes, and current densities from POISSON output based on a

  14. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  15. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us

  16. Probabilistic Approaches to Video Retrieval

    NARCIS (Netherlands)

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models to video retrieval”. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples” results when

  17. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of the geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  18. Convex models and probabilistic approach of nonlinear fatigue failure

    International Nuclear Information System (INIS)

    Qiu Zhiping; Lin Qiang; Wang Xiaojun

    2008-01-01

    This paper is concerned with the nonlinear fatigue failure problem with uncertainties in structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra, together with ideas from interval analysis, is applied. In terms of the inclusion-monotonic property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of a 25-bar truss structure is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach.

  19. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs), depending on the available data, is given. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspect of PRAs.

  20. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments

    International Nuclear Information System (INIS)

    Fisicaro, G.; Goedecker, S.; Genovese, L.; Andreussi, O.; Marzari, N.

    2016-01-01

    The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann, allowing to solve iteratively the minimization problem with some ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes

  3. Fast and Accurate Poisson Denoising With Trainable Nonlinear Diffusion.

    Science.gov (United States)

    Feng, Wensen; Qiao, Peng; Chen, Yunjin

    2018-06-01

    The degradation of the acquired signal by Poisson noise is a common problem for various imaging applications, such as medical imaging, night vision, and microscopy. Up to now, many state-of-the-art Poisson denoising techniques have mainly concentrated on achieving utmost performance, with little consideration for computational efficiency. Therefore, in this paper we aim to propose an efficient Poisson denoising model with both high computational efficiency and recovery quality. To this end, we exploit the newly developed trainable nonlinear reaction diffusion (TNRD) model, which has proven to be an extremely fast image restoration approach with performance surpassing recent state-of-the-art methods. However, the straightforward direct gradient descent employed in the original TNRD-based denoising task is not applicable in this paper. To solve this problem, we resort to the proximal gradient descent method. We retrain the model parameters, including the linear filters and influence functions, by taking into account the Poisson noise statistics, and end up with a well-trained nonlinear diffusion model specialized for Poisson denoising. The trained model provides strongly competitive results against state-of-the-art approaches, while bearing the properties of simple structure and high efficiency. Furthermore, our proposed model comes along with an additional advantage: the diffusion process is well-suited for parallel computation on graphics processing units (GPUs). For images of size , our GPU implementation takes less than 0.1 s to produce state-of-the-art Poisson denoising performance.

  4. Efficient triangulation of Poisson-disk sampled point sets

    KAUST Repository

    Guo, Jianwei

    2014-05-06

    In this paper, we present a simple yet efficient algorithm for triangulating a 2D input domain containing a Poisson-disk sampled point set. The proposed algorithm combines a regular grid and a discrete clustering approach to speed up the triangulation. Moreover, our triangulation algorithm is flexible and performs well on more general point sets, such as adaptive, non-maximal Poisson-disk sets. The experimental results demonstrate that our algorithm is robust for a wide range of input domains and achieves significant performance improvement compared to the current state-of-the-art approaches. © 2014 Springer-Verlag Berlin Heidelberg.

  5. An approach to handle Real Time and Probabilistic behaviors in e-commerce

    DEFF Research Database (Denmark)

    Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.

    2005-01-01

    In this work we describe an approach to deal with systems having at the same time probabilistic and real-time behaviors. The main goal in the paper is to show the automatic translation from a real time model based on the UPPAAL tool, which makes automatic verification of Real Time Systems, to the R...

  6. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust

  7. A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints

    Science.gov (United States)

    Wei, Helin; Wang, Kuisheng

    2011-11-01

    Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of ball grid array packages (PBGA) is demonstrated. The key solder joint parameters and the cyclic temperature range relevant to reliability are considered. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.
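
    A minimal Monte Carlo version of such a probabilistic life assessment can be sketched as follows. The Engelmaier-type strain and life equations are used in a commonly quoted simplified form, and every input distribution and coefficient below is an illustrative assumption, not a value from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative lognormal input distributions (assumed, not from the paper):
LD = rng.lognormal(mean=np.log(10.0e-3), sigma=0.05, size=n)  # diagonal length [m]
h  = rng.lognormal(mean=np.log(0.5e-3),  sigma=0.08, size=n)  # joint height [m]
dT = rng.lognormal(mean=np.log(60.0),    sigma=0.15, size=n)  # cyclic range [K]

d_alpha = 14e-6   # CTE mismatch [1/K], assumed
F       = 1.2     # empirical correction factor, assumed
eps_f   = 0.325   # fatigue ductility coefficient, assumed
c       = -0.442  # fatigue ductility exponent, assumed

# Engelmaier-type model: cyclic shear strain range and mean cycles to failure.
dgamma = F * LD * d_alpha * dT / h
Nf = 0.5 * (dgamma / (2.0 * eps_f)) ** (1.0 / c)

print(f"median life: {np.median(Nf):.0f} cycles")
print(f"5th-95th percentile: {np.percentile(Nf, 5):.0f} - {np.percentile(Nf, 95):.0f}")
print(f"P(Nf < 1000 cycles) = {np.mean(Nf < 1000):.4f}")
```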

  8. Poisson's ratio and Young's modulus of lipid bilayers in different phases

    Directory of Open Access Journals (Sweden)

    Tayebeh eJadidi

    2014-04-01

    A general computational method is introduced to estimate the Poisson's ratio for membranes with small thickness. In this method, the Poisson's ratio is calculated by utilizing a rescaling of inter-particle distances in one lateral direction under periodic boundary conditions. As an example, for the coarse-grained lipid model introduced by Lenz and Schmid, we calculate the Poisson's ratio in the gel, fluid, and interdigitated phases. Knowing the Poisson's ratio enables us to obtain the Young's modulus for the membranes in different phases. The approach may be applied to other membranes, such as graphene and tethered membranes, in order to predict the temperature dependence of their Poisson's ratio and Young's modulus.

  9. Poisson processes

    NARCIS (Netherlands)

    Boxma, O.J.; Yechiali, U.; Ruggeri, F.; Kenett, R.S.; Faltin, F.W.

    2007-01-01

    The Poisson process is a stochastic counting process that arises naturally in a large variety of daily life situations. We present a few definitions of the Poisson process and discuss several properties as well as relations to some well-known probability distributions. We further briefly discuss the
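
    As a concrete companion to those definitions, the following sketch simulates a homogeneous Poisson process from i.i.d. exponential inter-arrival times and empirically checks two textbook properties; the rate and horizon are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T = 2.0, 1000.0  # rate [events per unit time] and time horizon

# A homogeneous Poisson process: i.i.d. exponential inter-arrival times.
gaps = rng.exponential(scale=1.0 / lam, size=int(3 * lam * T))
times = np.cumsum(gaps)
times = times[times <= T]

# Two defining properties, checked empirically:
# 1) counts in disjoint unit intervals are Poisson(lam): mean == variance;
counts = np.histogram(times, bins=np.arange(0.0, T + 1.0))[0]
print("mean count:", counts.mean(), " variance:", counts.var())  # both ~ lam

# 2) given N(T) = n, the n arrival times are uniform on [0, T].
print("mean arrival time:", times.mean(), " (expect ~T/2 =", T / 2, ")")
```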

  10. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This probabilistic approach consists of 4 steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, and a multi-compartment dosimetric model is used for internal transport. In this approach, surrogate models of the original system are constructed using response surfaces and neural networks, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. The uncertainty and sensitivity analysis of the model parameters is carried out in this process. Dominant contributors to the dose in each organ are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure.
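
    The surrogate idea in steps (2)-(4) can be illustrated with a toy stand-in: fit a cheap response surface to a handful of runs of the "original" model, then run the Monte Carlo propagation through the surrogate. Everything below, the dose model, the input distributions, and the quadratic-in-log-inputs response surface, is a hypothetical sketch, not the paper's dosimetric model.

```python
import numpy as np

rng = np.random.default_rng(6)

def dose_model(theta):
    """Stand-in for the multi-compartment internal-dose code (hypothetical)."""
    k1, k2, f = theta.T
    return 50.0 * f * k1 / (k1 + k2) * np.exp(-0.1 * k2)

def features(theta):
    """Quadratic response surface in log-inputs."""
    L = np.log(theta)
    return np.column_stack([np.ones(len(L)), L, L ** 2,
                            L[:, 0] * L[:, 1], L[:, 0] * L[:, 2],
                            L[:, 1] * L[:, 2]])

mu = [np.log(0.2), np.log(0.05), np.log(0.6)]  # assumed input uncertainties

# Steps 2-3: quantify input uncertainties, run a small design sample.
theta_train = rng.lognormal(mean=mu, sigma=0.3, size=(200, 3))
coef, *_ = np.linalg.lstsq(features(theta_train),
                           dose_model(theta_train), rcond=None)

# Step 4: cheap Monte Carlo through the surrogate, checked against the model.
theta_mc = rng.lognormal(mean=mu, sigma=0.3, size=(100_000, 3))
print("surrogate 95th pct:", np.percentile(features(theta_mc) @ coef, 95))
print("model     95th pct:", np.percentile(dose_model(theta_mc), 95))
```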

  11. Singular Poisson tensors

    International Nuclear Information System (INIS)

    Littlejohn, R.G.

    1982-01-01

    The Hamiltonian structures discovered by Morrison and Greene for various fluid equations were obtained by guessing a Hamiltonian and a suitable Poisson bracket formula, expressed in terms of noncanonical (but physical) coordinates. In general, such a procedure for obtaining a Hamiltonian system does not produce a Hamiltonian phase space in the usual sense (a symplectic manifold), but rather a family of symplectic manifolds. To state the matter in terms of a system with a finite number of degrees of freedom, the family of symplectic manifolds is parametrized by a set of Casimir functions, which are characterized by having vanishing Poisson brackets with all other functions. The number of independent Casimir functions is the corank of the Poisson tensor J^{ij}, the components of which are the Poisson brackets of the coordinates among themselves. Thus, these Casimir functions exist only when the Poisson tensor is singular

  12. Poisson sigma model with branes and hyperelliptic Riemann surfaces

    International Nuclear Information System (INIS)

    Ferrario, Andrea

    2008-01-01

    We derive the explicit form of the superpropagators in the presence of general boundary conditions (coisotropic branes) for the Poisson sigma model. This generalizes the results presented by Cattaneo and Felder ["A path integral approach to the Kontsevich quantization formula," Commun. Math. Phys. 212, 591 (2000)] and Cattaneo and Felder ["Coisotropic submanifolds in Poisson geometry and branes in the Poisson sigma model," Lett. Math. Phys. 69, 157 (2004)] for Kontsevich's angle function [Kontsevich, M., "Deformation quantization of Poisson manifolds I," e-print arXiv:hep.th/0101170] used in the deformation quantization program of Poisson manifolds. The relevant superpropagators for n branes are defined as gauge-fixed homotopy operators of a complex of differential forms on n-sided polygons P_n with particular "alternating" boundary conditions. In the presence of more than three branes we use first-order Riemann theta functions with odd singular characteristics on the Jacobian variety of a hyperelliptic Riemann surface (canonical setting). In genus g the superpropagators exhibit g zero-mode contributions

  13. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  14. Identification of failure type in corroded pipelines: a Bayesian probabilistic approach.

    Science.gov (United States)

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented toward reducing these hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.

  15. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    Science.gov (United States)

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.

  16. NHPoisson: An R Package for Fitting and Validating Nonhomogeneous Poisson Processes

    Directory of Open Access Journals (Sweden)

    Ana C. Cebrián

    2015-03-01

    NHPoisson is an R package for the modeling of nonhomogeneous Poisson processes in one dimension. It includes functions for data preparation, maximum likelihood estimation, covariate selection, and inference based on asymptotic distributions and simulation methods. It also provides specific methods for the estimation of Poisson processes resulting from a peak-over-threshold approach. In addition, the package supports a wide range of model validation tools and functions for generating nonhomogeneous Poisson process trajectories. This paper is a description of the package and aims to help those interested in modeling data using nonhomogeneous Poisson processes.
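
    The package itself is written in R; the underlying computation is maximization of the nonhomogeneous Poisson process log-likelihood, sum_i log lambda(t_i) minus the integral of lambda(t) over the observation window. A self-contained Python sketch of the same idea for a log-linear intensity, with events simulated by thinning, might look as follows (all parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
T = 100.0
b0_true, b1_true = 0.0, 0.02          # lambda(t) = exp(b0 + b1*t)

lam = lambda t, b0, b1: np.exp(b0 + b1 * t)

# Simulate the NHPP by thinning a homogeneous process with rate lam_max.
lam_max = lam(T, b0_true, b1_true)
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=int(3 * lam_max * T)))
cand = cand[cand <= T]
events = cand[rng.random(cand.size) < lam(cand, b0_true, b1_true) / lam_max]

def neg_loglik(beta):
    b0, b1 = beta
    # closed-form integral of exp(b0 + b1*t) over [0, T]
    integral = np.exp(b0) * (np.expm1(b1 * T) / b1 if abs(b1) > 1e-12 else T)
    return -(np.sum(b0 + b1 * events) - integral)

fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print("true:", (b0_true, b1_true))
print("MLE: ", fit.x, " (", events.size, "events )")
```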

  17. Comparison of Negative Binomial Regression and Conway-Maxwell-Poisson Regression in Handling Overdispersion in Poisson Regression

    Directory of Open Access Journals (Sweden)

    Lusi Eka Afri

    2017-03-01

    Negative Binomial regression and Conway-Maxwell-Poisson regression are solutions for handling overdispersion in Poisson regression. Both models are extensions of the Poisson regression model. According to Hinde and Demetrio (2007), overdispersion in Poisson regression can arise in several ways: variability of observations and individual heterogeneity not explained by the model, correlation between individual responses, clustering within the population, and omitted observed variables. As a consequence, standard errors may be underestimated, producing parameter estimates that are biased downward (underestimated). This study aims to compare the Negative Binomial regression model and the Conway-Maxwell-Poisson (COM-Poisson) regression model in handling overdispersion in Poisson-distributed data, based on the deviance test statistic. The data used in this study come from two sources: simulated data and an applied case. The simulated data were generated as overdispersed Poisson-distributed data using the R programming language, with data characteristics given by the mean, the probability of zero values (p), and the sample size (n). The generated data are used to obtain estimates of the parameter coefficients for Negative Binomial and COM-Poisson regression. Keywords: overdispersion, negative binomial regression, Conway-Maxwell-Poisson regression
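
    A compact illustration of the overdispersion problem, in Python with statsmodels rather than the R setup used in the paper, generates Poisson-gamma (negative binomial) counts and compares the deviance-based fit of Poisson and negative binomial regressions; all parameter choices are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
x = rng.uniform(0, 1, n)
mu = np.exp(0.5 + 1.0 * x)

# Overdispersed counts: Poisson-gamma mixture, i.e. negative binomial with
# Var(y) = mu + alpha*mu^2 (here alpha = 0.5).
alpha = 0.5
y = rng.poisson(mu * rng.gamma(shape=1 / alpha, scale=alpha, size=n))

X = sm.add_constant(x)
pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()

# Deviance per degree of freedom is ~1 for a well-specified model; the
# Poisson fit shows clear overdispersion, the negative binomial absorbs it.
print("Poisson deviance/df:", pois.deviance / pois.df_resid)
print("NegBin  deviance/df:", nb.deviance / nb.df_resid)
```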

  18. A probabilistic approach to controlling crevice chemistry

    International Nuclear Information System (INIS)

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-01-01

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved

  19. A probabilistic approach for debris impact risk with numerical simulations of debris behaviors

    International Nuclear Information System (INIS)

    Kihara, Naoto; Matsuyama, Masafumi; Fujii, Naoki

    2013-01-01

    We propose a probabilistic approach for evaluating the impact risk of tsunami debris through Monte Carlo simulations with a combined system comprising a depth-averaged two-dimensional shallow water model and a discrete element model customized to simulate the motions of floating objects such as vessels. In the proposed method, first, probabilistic tsunami hazard analysis is carried out, and the exceedance probability of tsunami height and numerous tsunami time series for various hazard levels on the offshore side of a target site are estimated. Second, a characteristic tsunami time series for each hazard level is created by cluster analysis. Third, using the Monte Carlo simulation model the debris impact probability with the buildings of interest and the exceedance probability of debris impact speed are evaluated. (author)

  20. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. The Markov Chain Monte Carlo (MCMC) technique is employed to provide posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness, and convergence are rigorously evaluated in addition to the qualitative visual comparison. Following this, potential developments and improvements of the prognostics-based metrics are discussed in detail.
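
    The Bayesian half of such an updating scheme can be sketched with a random-walk Metropolis-Hastings sampler. The degradation model, prior, and noise level below are hypothetical stand-ins for the paper's fatigue-damage setup, chosen only to show measurements updating a posterior, and the posterior updating a life prognosis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical degradation model: crack size a(N) = a0 * exp(m * N).
a0, m_true, sigma = 1.0, 2e-4, 0.05
N_obs = np.arange(0, 5001, 500)
a_obs = a0 * np.exp(m_true * N_obs) * (1 + sigma * rng.standard_normal(N_obs.size))

def log_post(m):
    if m <= 0:
        return -np.inf
    pred = a0 * np.exp(m * N_obs)
    # Gaussian measurement likelihood + lognormal prior on the growth rate m.
    loglik = -0.5 * np.sum(((a_obs - pred) / (sigma * pred)) ** 2)
    logprior = -0.5 * ((np.log(m) - np.log(1e-4)) / 1.0) ** 2
    return loglik + logprior

# Random-walk Metropolis-Hastings on m (multiplicative, log-scale proposal).
samples, m = [], 1e-4
lp = log_post(m)
for _ in range(20000):
    m_new = m * np.exp(0.1 * rng.standard_normal())
    lp_new = log_post(m_new)
    # log(m_new/m) corrects for the asymmetry of the multiplicative proposal
    if np.log(rng.random()) < lp_new - lp + np.log(m_new / m):
        m, lp = m_new, lp_new
    samples.append(m)
post = np.array(samples[5000:])        # discard burn-in

# Updated prognosis: cycles until the crack reaches a critical size a_c.
a_c = 5.0
N_fail = np.log(a_c / a0) / post
print("posterior mean of m:", post.mean(), " (true:", m_true, ")")
print("90% credible interval for failure cycle:", np.percentile(N_fail, [5, 95]))
```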

  1. An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-03-08

    Poisson disk sampling plays an important role in a variety of visual computing applications, owing to its useful statistical distribution properties and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has addressed the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.

  2. A Probabilistic Approach to Fitting Period–luminosity Relations and Validating Gaia Parallaxes

    Energy Technology Data Exchange (ETDEWEB)

    Sesar, Branimir; Fouesneau, Morgan; Bailer-Jones, Coryn A. L.; Gould, Andy; Rix, Hans-Walter [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Price-Whelan, Adrian M., E-mail: bsesar@mpia.de [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2017-04-01

    Pulsating stars, such as Cepheids, Miras, and RR Lyrae stars, are important distance indicators and calibrators of the “cosmic distance ladder,” and yet their period–luminosity–metallicity (PLZ) relations are still constrained using simple statistical methods that cannot take full advantage of available data. To enable optimal usage of data provided by the Gaia mission, we present a probabilistic approach that simultaneously constrains parameters of PLZ relations and uncertainties in Gaia parallax measurements. We demonstrate this approach by constraining PLZ relations of type ab RR Lyrae stars in near-infrared W1 and W2 bands, using Tycho-Gaia Astrometric Solution (TGAS) parallax measurements for a sample of ≈100 type ab RR Lyrae stars located within 2.5 kpc of the Sun. The fitted PLZ relations are consistent with previous studies, and in combination with other data, deliver distances precise to 6% (once various sources of uncertainty are taken into account). To a precision of 0.05 mas (1σ), we do not find a statistically significant offset in TGAS parallaxes for this sample of distant RR Lyrae stars (median parallax of 0.8 mas and distance of 1.4 kpc). With only minor modifications, our probabilistic approach can be used to constrain PLZ relations of other pulsating stars, and we intend to apply it to Cepheid and Mira stars in the near future.

  3. DL_MG: A Parallel Multigrid Poisson and Poisson-Boltzmann Solver for Electronic Structure Calculations in Vacuum and Solution.

    Science.gov (United States)

    Womack, James C; Anton, Lucian; Dziedzic, Jacek; Hasnip, Phil J; Probert, Matt I J; Skylaris, Chris-Kriton

    2018-03-13

    The solution of the Poisson equation is a crucial step in electronic structure calculations, yielding the electrostatic potential, a key component of the quantum mechanical Hamiltonian. In recent decades, theoretical advances and increases in computer performance have made it possible to simulate the electronic structure of extended systems in complex environments. This requires the solution of more complicated variants of the Poisson equation, featuring nonhomogeneous dielectric permittivities, ionic concentrations with nonlinear dependencies, and diverse boundary conditions. The analytic solutions generally used to solve the Poisson equation in vacuum (or with homogeneous permittivity) are not applicable in these circumstances, and numerical methods must be used. In this work, we present DL_MG, a flexible, scalable, and accurate solver library, developed specifically to tackle the challenges of solving the Poisson equation in modern large-scale electronic structure calculations on parallel computers. Our solver is based on the multigrid approach and uses an iterative high-order defect correction method to improve the accuracy of solutions. Using two chemically relevant model systems, we tested the accuracy and computational performance of DL_MG when solving the generalized Poisson and Poisson-Boltzmann equations, demonstrating excellent agreement with analytic solutions and efficient scaling to ∼10^9 unknowns and hundreds of CPU cores. We also applied DL_MG in actual large-scale electronic structure calculations, using the ONETEP linear-scaling electronic structure package to study a 2615-atom protein-ligand complex with routinely available computational resources. In these calculations, the overall execution time with DL_MG was not significantly greater than the time required for calculations using a conventional FFT-based solver.

  4. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

    Science.gov (United States)

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-09-01

    Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported on the problem of generating Poisson disks on surfaces, due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
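
    The priority-based conflict resolution shared by both versions of this work can be emulated serially. The sketch below is a hedged 2D Euclidean stand-in for the intrinsic surface setting: each dart receives a unique random priority, and a candidate is kept exactly when no conflicting (closer-than-radius) candidate of higher priority is kept. Radius and candidate count are arbitrary.

```python
import numpy as np
from scipy.spatial import cKDTree

def priority_poisson_disk(n_candidates, radius, rng):
    """Dart throwing with random unique priorities: a candidate is kept iff
    no conflicting candidate of higher priority (lower value) is kept.
    A serial sweep resolves what the paper does with parallel threads."""
    pts = rng.random((n_candidates, 2))
    priority = rng.permutation(n_candidates)        # unique, distribution-free
    pairs = cKDTree(pts).query_pairs(radius, output_type="ndarray")
    neigh = [[] for _ in range(n_candidates)]
    for i, j in pairs:                              # conflict graph (< radius)
        neigh[i].append(j)
        neigh[j].append(i)
    status = np.zeros(n_candidates, dtype=np.int8)  # 0 undecided, 1 kept, -1 cut
    for i in np.argsort(priority):                  # best (lowest) priority first
        if status[i] == 0:
            status[i] = 1
            for j in neigh[i]:
                if status[j] == 0:
                    status[j] = -1                  # loses to a kept neighbor
    return pts[status == 1]

rng = np.random.default_rng(11)
samples = priority_poisson_disk(20000, radius=0.02, rng=rng)
d, _ = cKDTree(samples).query(samples, k=2)
print(len(samples), "samples kept; min pairwise distance:", d[:, 1].min())
```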

  5. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic causal failure mode models development, and statistical data models development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  6. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors, using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed, and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management, with better convergence performance and robustness.

  7. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g., Systematic Evaluation Program) or probabilistic (e.g., Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally, one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow a sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g., identification of weak links in support systems). Also, the implementation of upgrades on a system-per-system basis is not necessarily cost-effective. (author)

  8. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  9. Transmission capacity assessment by probabilistic planning. An approach

    International Nuclear Information System (INIS)

    Lammintausta, M.

    2002-01-01

    The Finnish electricity market participates in the Scandinavian market, Nord-Pool. The Finnish market is free for marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid, which has full responsibility for the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach for transfer limit and risk assessment. The method is based on predictions of economic benefit and risk. It also uses the existing results of deterministic data, and it could be used side by side with the deterministic method. The basic concept and necessary equations for the expected risks of various market players have been derived for further development. The outage costs, and thereby the risks of the market participants, depend on how the system operator reacts to faults. In the Finnish power system consumers usually experience no costs due to faults because of the meshed network and the counter-trade method preferred by the system operator. The costs to producers and dealers are also low because of the counter-trade method. The network company will lose the cost of repair, additional losses, and the cost of regulation power because of counter trades. If power flows are drastically rearranged because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is for the network operator to buy regulation power for short-term problems and reinforce the network in long-term situations. The reinforcement is done if the network cannot be

  10. Elastic properties of a material composed of alternating layers of negative and positive Poisson's ratio

    International Nuclear Information System (INIS)

    Kocer, C.; McKenzie, D.R.; Bilek, M.M.

    2009-01-01

    The theory of elasticity predicts a variety of phenomena associated with solids that possess a negative Poisson's ratio. The fabrication of metamaterials with a 'designed' microstructure that exhibit a Poisson's ratio approaching the thermodynamic limits of 1/2 and -1 increases the likelihood of realising these phenomena for applications. In this work, we investigate the properties of a layered composite, with alternating layers of materials with negative and positive Poisson's ratio approaching the thermodynamic limits. Using the finite element method to simulate uniaxial loading and indentation of a free standing composite, we observed an increase in the resistance to mechanical deformation above the average value of the two materials. Even though the greatest increase in stiffness is gained as the thermodynamic limits are approached, a significant amount of added stiffness can be attained, provided that the Young's modulus of the negative Poisson's ratio material is not less than that of the positive Poisson's ratio material

  11. Estimated Quality of Multistage Process on the Basis of Probabilistic Approach with Continuous Response Functions

    Directory of Open Access Journals (Sweden)

    Yuri B. Tebekin

    2011-11-01

    The article is devoted to the problem of quality management for multiphase processes on the basis of a probabilistic approach. A method with continuous response functions, derived via the method of Lagrange multipliers, is proposed.

  12. Poisson image reconstruction with Hessian Schatten-norm regularization.

    Science.gov (United States)

    Lefkimmiatis, Stamatios; Unser, Michael

    2013-11-01

    Poisson inverse problems arise in many modern imaging applications, including biomedical and astronomical ones. The main challenge is to obtain an estimate of the underlying image from a set of measurements degraded by a linear operator and further corrupted by Poisson noise. In this paper, we propose an efficient framework for Poisson image reconstruction, under a regularization approach, which depends on matrix-valued regularization operators. In particular, the employed regularizers involve the Hessian as the regularization operator and Schatten matrix norms as the potential functions. For the solution of the problem, we propose two optimization algorithms that are specifically tailored to the Poisson nature of the noise. These algorithms are based on an augmented-Lagrangian formulation of the problem and correspond to two variants of the alternating direction method of multipliers. Further, we derive a link that relates the proximal map of an l_p norm with the proximal map of a Schatten matrix norm of order p. This link plays a key role in the development of one of the proposed algorithms. Finally, we provide experimental results on natural and biological images for the task of Poisson image deblurring and demonstrate the practical relevance and effectiveness of the proposed framework.
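
    That link has a direct computational payoff: a Schatten-norm proximal map reduces to applying the corresponding scalar l_p proximal map to the singular values. The sketch below shows the p = 1 case (nuclear norm, i.e., soft thresholding of singular values); it illustrates the link only and is not the paper's full reconstruction algorithm.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t*||.||_1 (soft thresholding): the p = 1 case."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_schatten1(M, t):
    """Proximal map of t*||.||_S1 (nuclear norm): apply the l1 prox to the
    singular values, per the l_p <-> Schatten-p link discussed above."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(prox_l1(s, t)) @ Vt

rng = np.random.default_rng(8)
M = rng.standard_normal((5, 4))
P = prox_schatten1(M, t=0.7)
print("singular values before:", np.round(np.linalg.svd(M, compute_uv=False), 3))
print("singular values after: ", np.round(np.linalg.svd(P, compute_uv=False), 3))
```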

  13. On a Poisson homogeneous space of bilinear forms with a Poisson-Lie action

    Science.gov (United States)

    Chekhov, L. O.; Mazzocco, M.

    2017-12-01

    Let \\mathscr A be the space of bilinear forms on C^N with defining matrices A endowed with a quadratic Poisson structure of reflection equation type. The paper begins with a short description of previous studies of the structure, and then this structure is extended to systems of bilinear forms whose dynamics is governed by the natural action A\\mapsto B ABT} of the {GL}_N Poisson-Lie group on \\mathscr A. A classification is given of all possible quadratic brackets on (B, A)\\in {GL}_N× \\mathscr A preserving the Poisson property of the action, thus endowing \\mathscr A with the structure of a Poisson homogeneous space. Besides the product Poisson structure on {GL}_N× \\mathscr A, there are two other (mutually dual) structures, which (unlike the product Poisson structure) admit reductions by the Dirac procedure to a space of bilinear forms with block upper triangular defining matrices. Further generalisations of this construction are considered, to triples (B,C, A)\\in {GL}_N× {GL}_N× \\mathscr A with the Poisson action A\\mapsto B ACT}, and it is shown that \\mathscr A then acquires the structure of a Poisson symmetric space. Generalisations to chains of transformations and to the quantum and quantum affine algebras are investigated, as well as the relations between constructions of Poisson symmetric spaces and the Poisson groupoid. Bibliography: 30 titles.

  14. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    International Nuclear Information System (INIS)

    Pecha, P.; Hofman, R.; Kuca, P.

    2009-01-01

    We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulation of the propagation of uncertainties through the model is a basic, inevitable task, providing data for advanced techniques of probabilistic consequence assessment and further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the section of information systems of the institute NRPI. The subject of investigation is the evaluation of the consequences of radioactivity propagation after an accidental radioactivity release from a nuclear facility. Transport of activity is studied from the initial atmospheric propagation, through deposition of radionuclides on the terrain, to spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food chain activity transport are modeled. In the final step, a hazard estimation based on doses to the population is integrated into the software system HARP. Extension to the probabilistic approach has increased the complexity substantially, but offers a much more informative background for modern methods of estimation accounting for the inherent stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on uncertainty analysis of the input parameters of the SGPM model. Predicted background fields of Cs-137 deposition are labelled with index p as $\chi_p^{SGPM}$. The final goal is estimation of a certain unknown true background vector $\chi^{true}$, which accounts also for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must bear in mind that even if we know the true values of all input parameters $\theta_m^{true}$ $(m = 1, \ldots, M)$ of the SGPM model, $\chi^{true}$ still remains uncertain. One possibility of approaching reality consists

  15. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    International Nuclear Information System (INIS)

    Pecha, P.; Hofman, R.; Kuca, P.

    2008-01-01

    We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulation of the propagation of uncertainties through the model is a basic, inevitable task, providing data for advanced techniques of probabilistic consequence assessment and further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the section of information systems of the institute NRPI. The subject of investigation is the evaluation of the consequences of radioactivity propagation after an accidental radioactivity release from a nuclear facility. Transport of activity is studied from the initial atmospheric propagation, through deposition of radionuclides on the terrain, to spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food chain activity transport are modeled. In the final step, a hazard estimation based on doses to the population is integrated into the software system HARP. Extension to the probabilistic approach has increased the complexity substantially, but offers a much more informative background for modern methods of estimation accounting for the inherent stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on uncertainty analysis of the input parameters of the SGPM model. Predicted background fields of Cs-137 deposition are labelled with index p as $\chi_p^{SGPM}$. The final goal is estimation of a certain unknown true background vector $\chi^{true}$, which accounts also for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must bear in mind that even if we know the true values of all input parameters $\theta_m^{true}$ $(m = 1, \ldots, M)$ of the SGPM model, $\chi^{true}$ still remains uncertain. One possibility of approaching reality consists

  16. On covariant Poisson brackets in classical field theory

    International Nuclear Information System (INIS)

    Forger, Michael; Salles, Mário O.

    2015-01-01

    How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows one to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.

  17. On covariant Poisson brackets in classical field theory

    Energy Technology Data Exchange (ETDEWEB)

    Forger, Michael [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Salles, Mário O. [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Centro de Ciências Exatas e da Terra, Universidade Federal do Rio Grande do Norte, Campus Universitário – Lagoa Nova, BR–59078-970 Natal, RN (Brazil)

    2015-10-15

    How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows one to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.

  18. Homogeneous Poisson structures

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.; Malek, F.

    1993-09-01

    We provide an algebraic definition for the Schouten product and give a decomposition for any homogeneous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R^k is also characterized. (author). 4 refs

  19. Simplified probabilistic approach to determine safety factors in deterministic flaw acceptance criteria

    International Nuclear Information System (INIS)

    Barthelet, B.; Ardillon, E.

    1997-01-01

    The flaw acceptance rules for nuclear components rely on deterministic criteria supposed to ensure the safe operation of plants. The interest of having a reliable method for evaluating the safety margins and the integrity of components led Electricite de France to launch a study to link safety factors with requested reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in Fracture Mechanics. Assuming lognormal distributions of the main random variables, it is possible, considering a simple Linear Elastic Fracture Mechanics model, to determine the failure probability as a function of mean values and logarithmic standard deviations. The 'design' failure point can be analytically calculated. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation with reliability target values. The approach is generalized to elastic-plastic Fracture Mechanics (piping) by fitting J as a power-law function of stress, crack size, and yield strength. The simplified approach is validated by detailed probabilistic computations with the PROBAN computer program. Assuming reasonable coefficients of variation (logarithmic standard deviations), the method helps to calibrate safety factors for different components, taking into account reliability target values in normal, emergency, and faulted conditions. Statistical data for the mechanical properties of the main basic materials complement the study. The work involves laboratory results and manufacturing data. The results of this study are discussed within a working group of the French in-service inspection code RSE-M. (authors)

  20. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described

  1. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  2. Modifications to POISSON

    International Nuclear Information System (INIS)

    Harwood, L.H.

    1981-01-01

    At MSU we have used the POISSON family of programs extensively for magnetic field calculations. In the presently super-saturated computer situation, reducing the run time for the program is imperative. Thus, a series of modifications have been made to POISSON to speed up convergence. Two of the modifications aim at having the first guess solution as close as possible to the final solution. The other two aim at increasing the convergence rate. In this discussion, a working knowledge of POISSON is assumed. The amount of new code and expected time saving for each modification is discussed

  3. Probabilistic representation of fermionic lattice systems

    International Nuclear Information System (INIS)

    Beccaria, Matteo; Presilla, Carlo; De Angelis, Gian Fabrizio; Jona-Lasinio, Giovanni

    2000-01-01

    We describe an exact Feynman-Kac type formula to represent the dynamics of fermionic lattice systems. In this approach the real time or Euclidean time dynamics is expressed in terms of the stochastic evolution of a collection of Poisson processes. From this formula we derive a family of algorithms for Monte Carlo simulations, parametrized by the jump rates of the Poisson processes

  4. A dynamic probabilistic safety margin characterization approach in support of Integrated Deterministic and Probabilistic Safety Analysis

    International Nuclear Information System (INIS)

    Di Maio, Francesco; Rai, Ajit; Zio, Enrico

    2016-01-01

    The challenge of Risk-Informed Safety Margin Characterization (RISMC) is to develop a methodology for estimating system safety margins in the presence of stochastic and epistemic uncertainties affecting the system dynamic behavior. This is useful to support decision-making for licensing purposes. In the present work, safety margin uncertainties are handled by Order Statistics (OS) (with both Bracketing and Coverage approaches) to jointly estimate percentiles of the distributions of the safety parameter and of the time required for it to reach these percentile values during its dynamic evolution. The novelty of the proposed approach consists in the integration of dynamic aspects (i.e., timing of events) into the definition of a dynamic safety margin for a probabilistic Quantification of Margin and Uncertainties (QMU). The system considered here for demonstration purposes is the Lead-Bismuth Eutectic eXperimental Accelerator Driven System (LBE-XADS). - Highlights: • We integrate dynamic aspects into the definition of safety margins. • We consider stochastic and epistemic uncertainties affecting the system dynamics. • Uncertainties are handled by Order Statistics (OS). • We estimate the system grace time during accidental scenarios. • We apply the approach to an LBE-XADS accidental scenario.
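
    Order Statistics estimation of this kind is often implemented with Wilks-type run counts. The following is a hedged first-order illustration, not the paper's joint Bracketing/Coverage procedure: it computes how many code runs give a 95%/95% one-sided bound and applies the rule to a stand-in transient model whose outputs (a peak safety parameter and its timing) are purely hypothetical.

```python
import numpy as np

# Wilks' formula: the smallest N with 1 - gamma**N >= beta gives a
# beta-confidence, gamma-coverage one-sided tolerance bound from the
# maximum of N runs (N = 59 for 95%/95%).
gamma = beta = 0.95
N = int(np.ceil(np.log(1.0 - beta) / np.log(gamma)))
print("runs needed (95/95, first order):", N)   # -> 59

rng = np.random.default_rng(2)

def run_code(seed_rng):
    """Stand-in for one best-estimate transient simulation: returns the peak
    value of the safety parameter and the time at which it is reached."""
    peak = seed_rng.normal(1200.0, 60.0)              # e.g. peak temperature [K]
    t_peak = seed_rng.lognormal(np.log(50.0), 0.3)    # time to peak [s]
    return peak, t_peak

results = np.array([run_code(rng) for _ in range(N)])
peak9595 = results[:, 0].max()     # 95/95 upper bound on the peak
t_earliest = results[:, 1].min()   # conservative (earliest) timing estimate
safety_limit = 1500.0              # hypothetical acceptance limit
print("95/95 peak estimate:", round(peak9595, 1), "K")
print("earliest time-to-peak over runs:", round(t_earliest, 1), "s")
print("margin to limit:", round(safety_limit - peak9595, 1), "K")
```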

  5. Variational approach to probabilistic finite elements

    Science.gov (United States)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.

  6. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop an advanced mean field method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP, where knowledge of the distribution of couplings between the random variables is required, our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with nonglassy behavior, by replica calculations for a wide class of models as well as by simulations for a real data set.

  7. Non-equal-time Poisson brackets

    OpenAIRE

    Nikolic, H.

    1998-01-01

    The standard definition of the Poisson brackets is generalized to the non-equal-time Poisson brackets. Their relationship to the equal-time Poisson brackets, as well as to the equal- and non-equal-time commutators, is discussed.

  8. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area saving and design flexibility between performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders with the performance and energy trade-off.

  9. Branes in Poisson sigma models

    International Nuclear Information System (INIS)

    Falceto, Fernando

    2010-01-01

    In this review we discuss possible boundary conditions (branes) for the Poisson sigma model. We show how to carry out the perturbative quantization in the presence of a general pre-Poisson brane and how this is related to the deformation quantization of Poisson structures. We conclude with an open problem: the perturbative quantization of the system when the boundary has several connected components and we use a different pre-Poisson brane in every component.

  10. Extended Poisson Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Anum Fatima

    2015-09-01

    A new mixture of the Modified Exponential (ME) and Poisson distributions is introduced in this paper. Taking the maximum of Modified Exponential random variables when the sample size follows a zero-truncated Poisson distribution, we derive the new distribution, named the Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential, and Exponential distributions are special cases of this distribution. We also investigate some mathematical properties of the distribution, along with information entropies and order statistics. The estimation of parameters is obtained using the Maximum Likelihood Estimation procedure. Finally, we illustrate a real data application of our distribution.
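
    The construction, the maximum of a random number of lifetimes where the count is zero-truncated Poisson, translates directly into a sampler. The sketch below uses a plain exponential lifetime as a stand-in for the paper's Modified Exponential (whose exact form is not reproduced here); given N = n, the maximum has CDF (1 - exp(-rate*x))^n, which inverts in closed form.

```python
import numpy as np

rng = np.random.default_rng(9)

def zero_truncated_poisson(lam, size, rng):
    """Sample N ~ Poisson(lam) conditioned on N >= 1 (simple rejection)."""
    out = np.zeros(size, dtype=int)
    todo = np.arange(size)
    while todo.size:
        draw = rng.poisson(lam, todo.size)
        ok = draw > 0
        out[todo[ok]] = draw[ok]
        todo = todo[~ok]
    return out

def poisson_exponential_max(lam, rate, size, rng):
    """X = max of N exponential lifetimes, N zero-truncated Poisson.
    Given N = n the max has CDF (1 - exp(-rate*x))^n, inverted directly
    so the n individual lifetimes never need to be materialized."""
    n = zero_truncated_poisson(lam, size, rng)
    u = rng.random(size)
    return -np.log(1.0 - u ** (1.0 / n)) / rate

x = poisson_exponential_max(lam=3.0, rate=0.5, size=100_000, rng=rng)
print("sample mean:", x.mean())
```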

  11. A probabilistic approach to the evaluation of the PTS issue

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Selby, D.L.

    1991-01-01

    An integrated probabilistic approach for the evaluation of the pressurized-thermal-shock (PTS) issue was developed at the Oak Ridge National Laboratory (ORNL) at the request of the Nuclear Regulatory Commission (NRC). The purpose was to provide a method for identifying dominant plant design and operating features, evaluating possible remedial measures and the validity of the NRC PTS screening criteria, and to provide an additional tool for estimating vessel life expectancy. The approach was to be integrated in the sense that it would include the postulation of transients; estimates of their frequencies of occurrence; systems analyses to obtain the corresponding primary-system pressure, down-comer coolant temperature, and fluid-film heat-transfer coefficient adjacent to the vessel wall; and a probabilistic fracture-mechanics analysis, using the latter data as input. A summation of the products of frequency of transient and conditional probability of failure for all postulated transients provides an estimate of frequency of vessel failure. In the process of developing the integrated pressurized-thermal-shock (IPTS) methodology, three specific plant analyses were conducted. The results indicate that the NRC screening criteria may not be appropriate for all US pressurized water reactor (PWR) plants; that is, for some PWRs, the calculated mean frequency of vessel failure corresponding to the screening criteria may be greater than the maximum permissible value in Regulatory Guide 1.154. A recent review of the ORNL IPTS study, which was completed in 1985, indicates that there are a number of areas in which the methodology can and should be updated, but it is not clear whether the update will increase or decrease the calculated probabilities. 31 refs., 2 tabs
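
    The aggregation step described above, a sum over postulated transients of (frequency of transient) × (conditional probability of vessel failure), is simple enough to sketch directly. The transient list and all numbers below are hypothetical placeholders, not values from the ORNL IPTS study.

        # Hypothetical transients: (frequency per reactor-year,
        # conditional probability of vessel failure given the transient)
        transients = {
            "small-break LOCA":        (1e-3, 2e-4),
            "main steam line break":   (1e-4, 5e-3),
            "stuck-open relief valve": (5e-4, 8e-4),
        }

        # Frequency of vessel failure = sum of frequency x conditional probability
        failure_frequency = sum(f * p for f, p in transients.values())
        print(f"estimated vessel failure frequency: {failure_frequency:.2e} per reactor-year")

        # The estimate can then be compared against a regulatory limit (e.g. the
        # value given in Regulatory Guide 1.154; the number is not reproduced here).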

  12. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers

  13. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code the timber structure has to be evaluated with respect to the following criteria, where at least one shall be fulfilled. According to criteria a) and b), the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or significant parts of it is obtained. Therefore the structure can be considered robust.

  14. Determining reserve requirements in DK1 area of Nord Pool using a probabilistic approach

    DEFF Research Database (Denmark)

    Saez Gallego, Javier; Morales González, Juan Miguel; Madsen, Henrik

    2014-01-01

    The reserve requirements are computed within a probabilistic framework based on scenarios of wind power forecast errors, load forecast errors, and power plant outages. The approach is first motivated by the increasing wind power penetration in power systems worldwide, as well as by the current market design of the DK1 area (… System Operator). © 2014 Elsevier Ltd. All rights reserved.

  15. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
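
    The Monte Carlo scheme described above can be sketched as follows, assuming a lognormal distribution for the disturbance parameter and a purely illustrative one-line amplitude model; the real study replaces that line with a numerical hydrodynamic simulation, and every parameter value here is hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)

        rate = 4.0            # Poisson rate of squall lines per year (hypothetical)
        years = 10_000        # length of the synthetic catalog

        # Sample a synthetic catalog: number of events, then one parameter per event
        # (e.g. pressure-disturbance amplitude in hPa), here lognormally distributed.
        n_events = rng.poisson(rate * years)
        pressure = rng.lognormal(mean=0.5, sigma=0.6, size=n_events)

        # Placeholder for the hydrodynamic model: maximum coastal wave amplitude (m)
        # as a crude increasing function of the pressure disturbance.
        amplitude = 0.05 * pressure ** 1.2

        # Hazard curve: annualized rate of exceedance versus amplitude threshold.
        thresholds = np.linspace(0.05, amplitude.max(), 50)
        exceed_rate = [(amplitude > a).sum() / years for a in thresholds]
        for a, r in list(zip(thresholds, exceed_rate))[::10]:
            print(f"amplitude > {a:5.2f} m : {r:.4f} events/year")

    Repeating the whole procedure with resampled catalogs, as the abstract describes, would yield mean and quantile hazard curves.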

  16. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Science.gov (United States)

    2012-05-17

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...

  17. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  18. On (co)homology of Frobenius Poisson algebras

    OpenAIRE

    Zhu, Can; Van Oystaeyen, Fred; ZHANG, Yinhuo

    2014-01-01

    In this paper, we study Poisson (co)homology of a Frobenius Poisson algebra. More precisely, we show that there exists a duality between Poisson homology and Poisson cohomology of Frobenius Poisson algebras, similar to that between Hochschild homology and Hochschild cohomology of Frobenius algebras. Then we use the non-degenerate bilinear form on a unimodular Frobenius Poisson algebra to construct a Batalin-Vilkovisky structure on the Poisson cohomology ring making it into a Batalin-Vilkovisk...

  19. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and often a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each AF commonly used. Random sampling, using Monte Carlo techniques, is then used for propagating the identified uncertainties and computing the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely range, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support in decision-making and improves transparency. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
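
    A compact sketch of the Monte Carlo propagation step described above: each assessment factor is drawn from its own distribution, and the OEL distribution is obtained as a point of departure divided by the product of the sampled factors. The lognormal shapes, their parameters, and the NOAEL value are illustrative assumptions, not the defaults proposed in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        noael = 50.0   # hypothetical point of departure, mg/m^3

        # Assessment factors as random variables (illustrative lognormal choices):
        af_interspecies = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
        af_intraspecies = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)
        af_duration     = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)

        # Propagate: OEL = point of departure / product of assessment factors
        oel = noael / (af_interspecies * af_intraspecies * af_duration)

        # Summarize the resulting OEL distribution
        for q in (5, 50, 95):
            print(f"{q:2d}th percentile: {np.percentile(oel, q):.3f} mg/m^3")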

  20. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  1. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  2. Survey on application of probabilistic fracture mechanics approach to nuclear piping

    International Nuclear Information System (INIS)

    Kashima, Koichi

    1987-01-01

    Probabilistic fracture mechanics (PFM) approach is newly developed as one of the tools to evaluate the structural integrity of nuclear components. This report describes the current status of PFM studies for pressure vessel and piping systems in light water reactors and focuses on the investigations of the piping failure probability which have been undertaken by the USNRC. The USNRC reevaluates the double-ended guillotine break (DEGB) of reactor coolant piping as a design basis event for nuclear power plants by using the PFM approach. For PWR piping systems designed by Westinghouse, two causes of pipe break are considered: pipe failure due to crack growth and pipe failure indirectly caused by failure of component supports due to an earthquake. The PFM approach shows that the probability of DEGB from either cause is very low and that the effect of earthquakes on pipe failure can be neglected. (author)

  3. Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.

    Science.gov (United States)

    Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah

    2012-01-01

    Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
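
    The COM-Poisson pmf is P(Y = y) = λ^y / ((y!)^ν Z(λ, ν)) with normalizing constant Z(λ, ν) = Σ_j λ^j / (j!)^ν; ν < 1 gives overdispersion and ν > 1 underdispersion relative to the Poisson case (ν = 1). A minimal maximum-likelihood fit of this two-parameter distribution (not the full GLM with covariates assessed in the article) might look as follows, with Z truncated at a fixed number of terms.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import gammaln

        def com_poisson_logpmf(y, log_lam, nu, n_terms=200):
            """Log pmf of the COM-Poisson distribution; the normalizing
            constant Z is truncated at n_terms (adequate for moderate lambda)."""
            j = np.arange(n_terms)
            log_z = np.logaddexp.reduce(j * log_lam - nu * gammaln(j + 1))
            return y * log_lam - nu * gammaln(y + 1) - log_z

        def fit_com_poisson(y):
            """Maximum likelihood estimate of (lambda, nu) from counts y."""
            def nll(params):
                log_lam, log_nu = params
                return -com_poisson_logpmf(y, log_lam, np.exp(log_nu)).sum()
            res = minimize(nll, x0=[np.log(y.mean() + 0.5), 0.0], method="Nelder-Mead")
            log_lam, log_nu = res.x
            return np.exp(log_lam), np.exp(log_nu)

        rng = np.random.default_rng(7)
        y = rng.poisson(3.0, size=2000)          # equidispersed test data
        lam_hat, nu_hat = fit_com_poisson(y)
        print(f"lambda ~ {lam_hat:.2f}, nu ~ {nu_hat:.2f}  (nu near 1 for Poisson data)")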

  4. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    Science.gov (United States)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

    In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes $N_\alpha(t)$, $N_\beta(t)$, $t>0$, we have that $N_\alpha(N_\beta(t)) \stackrel{d}{=} \sum_{j=1}^{N_\beta(t)} X_j$, where the $X_j$ are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form $N_\alpha(\tau_k^\nu)$, $\nu \in (0,1]$, where $\tau_k^\nu$ is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form $\Theta(N(t))$, $t>0$, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
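
    The first identity above is easy to check numerically: conditional on $N_\beta(t) = n$, the outer count $N_\alpha(n)$ is a sum of $n$ i.i.d. Poisson($\alpha$) variables. A simulation sketch (the values of alpha, beta and t are arbitrary):

        import numpy as np

        rng = np.random.default_rng(3)
        alpha, beta, t, n_sim = 1.7, 0.9, 5.0, 200_000

        # Left side: N_alpha evaluated at the random time N_beta(t)
        n_beta = rng.poisson(beta * t, n_sim)
        lhs = rng.poisson(alpha * n_beta)          # N_alpha(n) ~ Poisson(alpha * n)

        # Right side: random sum of N_beta(t) i.i.d. Poisson(alpha) variables
        rhs = np.array([rng.poisson(alpha, n).sum() for n in rng.poisson(beta * t, n_sim)])

        print("means:    ", lhs.mean(), rhs.mean())   # both ~ alpha * beta * t
        print("variances:", lhs.var(), rhs.var())     # should agree closely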

  5. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    Science.gov (United States)

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.

  6. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

    The original spatial point process X is superposed with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt, Yt) which converges towards the distribution of (X, Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well-known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking.

  7. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    International Nuclear Information System (INIS)

    Krajnc, B.; Mavko, B.

    1996-01-01

    Severe accident sequences resulting in loss of the core geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences to public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Due to the fact that there is a large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of probabilistic and deterministic approaches should be used. In fact, the results of the deterministic analyses of severe accidents are treated in a probabilistic manner due to the large uncertainty of results as a consequence of a lack of detailed knowledge. This paper discusses the approach used in many IPEs, which assures that the assigned probability for a certain question in the event tree represents the probability that the event will or will not happen and that this probability also includes its uncertainty, which is mainly a result of lack of knowledge. (author)

  8. Cellular solutions for the Poisson equation in extended systems

    International Nuclear Information System (INIS)

    Zhang, X.; Butler, W.H.; MacLaren, J.M.; van Ek, J.

    1994-01-01

    The Poisson equation for the electrostatic potential in a solid is solved using three different cellular techniques. The relative merits of these different approaches are discussed for two test charge densities for which an analytic solution to the Poisson equation is known. The first approach uses full-cell multiple-scattering theory and results in the familiar structure constant and multipole moment expansion. This solution is shown to be valid everywhere inside the cell, although for points outside the muffin-tin sphere but inside the cell the sums must be performed in the correct order to yield meaningful results. A modification of the multiple-scattering-theory approach yields a second method, a Green-function cellular method, which only requires the solution of a nearest-neighbor linear system of equations. A third approach, a related variational cellular method, is also derived. The variational cellular approach is shown to be the most accurate and reliable, and to have the best convergence in angular momentum of the three methods. Coulomb energies accurate to within 10^-6 hartree are easily achieved with the variational cellular approach, demonstrating the practicality of the approach in electronic structure calculations

  9. A twisted generalization of Novikov-Poisson algebras

    OpenAIRE

    Yau, Donald

    2010-01-01

    Hom-Novikov-Poisson algebras, which are twisted generalizations of Novikov-Poisson algebras, are studied. Hom-Novikov-Poisson algebras are shown to be closed under tensor products and several kinds of twistings. Necessary and sufficient conditions are given under which Hom-Novikov-Poisson algebras give rise to Hom-Poisson algebras.

  10. Poisson hierarchy of discrete strings

    International Nuclear Information System (INIS)

    Ioannidou, Theodora; Niemi, Antti J.

    2016-01-01

    The Poisson geometry of a discrete string in three dimensional Euclidean space is investigated. For this the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra that determines reparametrisation diffeomorphism along a continuous string, is identified as a particular sub-algebra, in the hierarchy of the discrete string Poisson algebra. - Highlights: • Witt (classical Virasoro) algebra is derived in the case of discrete string. • Infinite dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • Spinor representation of discrete Frenet equations is developed.

  12. Four-dimensional gravity as an almost-Poisson system

    Science.gov (United States)

    Ita, Eyo Eyo

    2015-04-01

    In this paper, we examine the phase space structure of a noncanonical formulation of four-dimensional gravity referred to as the Instanton representation of Plebanski gravity (IRPG). The typical Hamiltonian (symplectic) approach leads to an obstruction to the definition of a symplectic structure on the full phase space of the IRPG. We circumvent this obstruction, using the Lagrange equations of motion, to find the appropriate generalization of the Poisson bracket. It is shown that the IRPG does not support a Poisson bracket except on the vector constraint surface. Yet there exists a fundamental bilinear operation on its phase space which produces the correct equations of motion and induces the correct transformation properties of the basic fields. This bilinear operation is known as the almost-Poisson bracket, which fails to satisfy the Jacobi identity and in this case also the condition of antisymmetry. We place these results into the overall context of nonsymplectic systems.
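
    For context, a short LaTeX note on what "almost-Poisson" means here: the bracket still supplies equations of motion, but its Jacobiator need not vanish (and, per the abstract, antisymmetry also fails in this case), whereas a genuine Poisson bracket satisfies both conditions.

        % A Poisson bracket is bilinear, antisymmetric, satisfies the Leibniz
        % rule, and obeys the Jacobi identity:
        \{f,\{g,h\}\} + \{g,\{h,f\}\} + \{h,\{f,g\}\} = 0

        % An almost-Poisson bracket still generates dynamics,
        \dot{F} = \{F, H\},
        % but its Jacobiator
        J(f,g,h) := \{f,\{g,h\}\} + \{g,\{h,f\}\} + \{h,\{f,g\}\}
        % need not vanish; here antisymmetry \{f,g\} = -\{g,f\} is also relaxed.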

  13. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Science.gov (United States)

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
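
    A bare-bones sketch of the exceedance-probability computation described above, with the PEC modelled by the two candidate families mentioned (lognormal and Gamma, matched in mean and variance) and a fixed PNEC. All parameter values are invented for illustration and have no connection to the Loire/Moselle case studies.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 200_000
        pnec = 1.4  # hypothetical bioavailable PNEC, ug/L

        # Two candidate models for the (bioavailable) PEC, same mean and variance
        mean, var = 1.0, 0.36
        sigma2 = np.log(1 + var / mean**2)
        pec_lognorm = rng.lognormal(np.log(mean) - sigma2 / 2, np.sqrt(sigma2), n)
        pec_gamma = rng.gamma(shape=mean**2 / var, scale=var / mean, size=n)

        for name, pec in [("lognormal", pec_lognorm), ("gamma", pec_gamma)]:
            p_exceed = np.mean(pec / pnec > 1.0)
            print(f"{name:9s}: p(PEC/PNEC > 1) = {p_exceed:.4f}")

    Comparing the two exceedance probabilities gives a feel for how sensitive p(PEC/PNEC > 1) is to the distributional choice, one of the points the article raises.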

  14. A probabilistic approach to crack instability

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  15. A fourth order PDE based fuzzy c- means approach for segmentation of microscopic biopsy images in presence of Poisson noise for cancer detection.

    Science.gov (United States)

    Kumar, Rajesh; Srivastava, Subodh; Srivastava, Rajeev

    2017-07-01

    For cancer detection from microscopic biopsy images, the image segmentation step used for segmentation of cells and nuclei plays an important role, and the accuracy of the segmentation approach dominates the final results. Microscopic biopsy images also have intrinsic Poisson noise, and if it is present in the image the segmentation results may not be accurate. The objective is to propose an efficient fuzzy c-means based segmentation approach which can also handle the noise present in the image during the segmentation process itself, i.e., noise removal and segmentation are combined in one step. To address the above issues, in this paper a fourth-order partial differential equation (FPDE) based nonlinear filter adapted to Poisson noise, combined with a fuzzy c-means segmentation method, is proposed. This approach is capable of effectively handling the segmentation problem of blocky artifacts while achieving a good tradeoff between Poisson noise removal and edge preservation of the microscopic biopsy images during the segmentation process for cancer detection from cells. The proposed approach is tested on a breast cancer microscopic biopsy data set with region of interest (ROI) segmented ground truth images. The microscopic biopsy data set contains 31 benign and 27 malignant images of size 896 × 768. The region-of-interest ground truth of all 58 images is also available for this data set. Finally, the results obtained from the proposed approach are compared with the results of popular segmentation algorithms: fuzzy c-means, color k-means, texture-based segmentation, and total variation fuzzy c-means approaches. The experimental results show that the proposed approach provides better results in terms of various performance measures such as Jaccard coefficient, dice index, Tanimoto coefficient, area under curve, accuracy, true positive rate, true negative rate, false positive rate, false negative rate, random index, global consistency error, and variance of information as compared to other

  16. Quantization of the Poisson SU(2) and its Poisson homogeneous space - the 2-sphere

    International Nuclear Information System (INIS)

    Sheu, A.J.L.

    1991-01-01

    We show that deformation quantizations of the Poisson structures on the Poisson Lie group SU(2) and its homogeneous space, the 2-sphere, are compatible with Woronowicz's deformation quantization of SU(2)'s group structure and Podles' deformation quantization of 2-sphere's homogeneous structure, respectively. So in a certain sense the multiplicativity of the Lie Poisson structure on SU(2) at the classical level is preserved under quantization. (orig.)

  17. Cumulative Poisson Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for chi-square (χ²) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
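
    The relation the program exploits, that the cumulative Poisson distribution equals a regularized upper incomplete gamma function and hence a chi-square survival function with even degrees of freedom, can be checked in a few lines with SciPy (a modern stand-in for the original C program):

        from scipy import stats, special

        mu, k = 4.2, 6  # Poisson mean and count threshold

        poisson_cdf = stats.poisson.cdf(k, mu)
        gamma_form = special.gammaincc(k + 1, mu)          # regularized upper incomplete gamma
        chi2_form = stats.chi2.sf(2 * mu, df=2 * (k + 1))  # chi-square with even dof

        print(poisson_cdf, gamma_form, chi2_form)  # all three values agree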

  18. Nonlinear Poisson equation for heterogeneous media.

    Science.gov (United States)

    Hu, Langhua; Wei, Guo-Wei

    2012-08-22

    The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration of hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. Variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation for the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using the geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for a comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical prediction, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  19. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project

  20. A probabilistic Poisson-based model accounts for an extensive set of absolute auditory threshold measurements.

    Science.gov (United States)

    Heil, Peter; Matysiak, Artur; Neubauer, Heinrich

    2017-09-01

    Thresholds for detecting sounds in quiet decrease with increasing sound duration in every species studied. The neural mechanisms underlying this trade-off, often referred to as temporal integration, are not fully understood. Here, we probe the human auditory system with a large set of tone stimuli differing in duration, shape of the temporal amplitude envelope, duration of silent gaps between bursts, and frequency. Duration was varied by varying the plateau duration of plateau-burst (PB) stimuli, the duration of the onsets and offsets of onset-offset (OO) stimuli, and the number of identical bursts of multiple-burst (MB) stimuli. Absolute thresholds for a large number of ears (>230) were measured using a 3-interval-3-alternative forced choice (3I-3AFC) procedure. Thresholds decreased with increasing sound duration in a manner that depended on the temporal envelope. Most commonly, thresholds for MB stimuli were highest followed by thresholds for OO and PB stimuli of corresponding durations. Differences in the thresholds for MB and OO stimuli and in the thresholds for MB and PB stimuli, however, varied widely across ears, were negative in some ears, and were tightly correlated. We show that the variation and correlation of MB-OO and MB-PB threshold differences are linked to threshold microstructure, which affects the relative detectability of the sidebands of the MB stimuli and affects estimates of the bandwidth of auditory filters. We also found that thresholds for MB stimuli increased with increasing duration of the silent gaps between bursts. We propose a new model and show that it accurately accounts for our results and does so considerably better than a leaky-integrator-of-intensity model and a probabilistic model proposed by others. Our model is based on the assumption that sensory events are generated by a Poisson point process with a low rate in the absence of stimulation and higher, time-varying rates in the presence of stimulation. A subject in a 3I-3AFC

  1. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the area of seismic, hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  2. Poisson's ratio of fiber-reinforced composites

    Science.gov (United States)

    Christiansson, Henrik; Helsing, Johan

    1996-05-01

    Poisson's ratio flow diagrams, that is, the Poisson's ratio versus the fiber fraction, are obtained numerically for hexagonal arrays of elastic circular fibers in an elastic matrix. High numerical accuracy is achieved through the use of an interface integral equation method. Questions concerning fixed point theorems and the validity of existing asymptotic relations are investigated and partially resolved. Our findings for the transverse effective Poisson's ratio, together with earlier results for random systems by other authors, make it possible to formulate a general statement for Poisson's ratio flow diagrams: For composites with circular fibers and where the phase Poisson's ratios are equal to 1/3, the system with the lowest stiffness ratio has the highest Poisson's ratio. For other choices of the elastic moduli for the phases, no simple statement can be made.

  3. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management with the aim to improve economics and reliability, and sustain safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO) wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  4. Deterministic and probabilistic approach to determine seismic risk of nuclear power plants; a practical example

    International Nuclear Information System (INIS)

    Soriano Pena, A.; Lopez Arroyo, A.; Roesset, J.M.

    1976-01-01

    The probabilistic and deterministic approaches for calculating the seismic risk of nuclear power plants are both applied to a particular case in Southern Spain. The results obtained by both methods, when varying the input data, are presented and some conclusions drawn in relation to the applicability of the methods, their reliability and their sensitivity to change

  5. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Directory of Open Access Journals (Sweden)

    Gannouni Asmae

    2017-01-01

    … closed-form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.

  6. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  8. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    Gugushvili, S.; van der Meulen, F.; Spreij, P.

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context,

  10. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC which emerges as a consequence of the uncertainties in a power system is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and multi-area representation of the transmission network. (UK)

  11. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over or under dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  12. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.

  13. Assessing dynamic postural control during exergaming in older adults: A probabilistic approach.

    Science.gov (United States)

    Soancatl Aguilar, V; Lamoth, C J C; Maurits, N M; Roerdink, J B T M

    2018-02-01

    Digital games controlled by body movements (exergames) have been proposed as a way to improve postural control among older adults. Exergames are meant to be played at home in an unsupervised way. However, only few studies have investigated the effect of unsupervised home-exergaming on postural control. Moreover, suitable methods to dynamically assess postural control during exergaming are still scarce. Dynamic postural control (DPC) assessment could be used to provide both meaningful feedback and automatic adjustment of exergame difficulty. These features could potentially foster unsupervised exergaming at home and improve the effectiveness of exergames as tools to improve balance control. The main aim of this study is to investigate the effect of six weeks of unsupervised home-exergaming on DPC as assessed by a recently developed probabilistic model. High probability values suggest 'deteriorated' postural control, whereas low probability values suggest 'good' postural control. In a pilot study, ten healthy older adults (average 77.9, SD 7.2 years) played an ice-skating exergame at home half an hour per day, three times a week during six weeks. The intervention effect on DPC was assessed using exergaming trials recorded by Kinect at baseline and every other week. Visualization of the results suggests that the probabilistic model is suitable for real-time DPC assessment. Moreover, linear mixed model analysis and parametric bootstrapping suggest a significant intervention effect on DPC. In conclusion, these results suggest that unsupervised exergaming for improving DPC among older adults is indeed feasible and that probabilistic models could be a new approach to assess DPC. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Wavelets, ridgelets, and curvelets for Poisson noise removal.

    Science.gov (United States)

    Zhang, Bo; Fadili, Jalal M; Starck, Jean-Luc

    2008-07-01

    In order to denoise Poisson count data, we introduce a variance stabilizing transform (VST) applied on a filtered discrete Poisson process, yielding a near Gaussian process with asymptotic constant variance. This new transform, which can be deemed as an extension of the Anscombe transform to filtered data, is simple, fast, and efficient in (very) low-count situations. We combine this VST with the filter banks of wavelets, ridgelets and curvelets, leading to multiscale VSTs (MS-VSTs) and nonlinear decomposition schemes. By doing so, the noise-contaminated coefficients of these MS-VST-modified transforms are asymptotically normally distributed with known variances. A classical hypothesis-testing framework is adopted to detect the significant coefficients, and a sparsity-driven iterative scheme reconstructs properly the final estimate. A range of examples show the power of this MS-VST approach for recovering important structures of various morphologies in (very) low-count images. These results also demonstrate that the MS-VST approach is competitive relative to many existing denoising methods.
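
    The classical Anscombe transform, which the MS-VST generalizes to filtered data, is A(x) = 2·sqrt(x + 3/8); for Poisson counts with moderate-to-large mean it yields an approximately Gaussian variable with variance close to 1. A quick numerical check:

        import numpy as np

        rng = np.random.default_rng(5)

        def anscombe(x):
            """Classical variance-stabilizing transform for Poisson data."""
            return 2.0 * np.sqrt(x + 3.0 / 8.0)

        for lam in (1, 5, 20, 100):
            x = rng.poisson(lam, 100_000)
            print(f"lambda={lam:3d}: var before={x.var():7.2f}, after={anscombe(x).var():.3f}")

    The drift away from unit variance at small lambda is exactly the low-count regime the MS-VST construction is designed to handle.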

  15. Coordination of Conditional Poisson Samples

    Directory of Open Access Journals (Sweden)

    Grafström Anton

    2015-12-01

    Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP) sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
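
    The permanent-random-number mechanism underlying the first method can be sketched for classical (unconditional) Poisson sampling, the simpler case the abstract uses as its benchmark: each unit keeps one uniform number across occasions and is selected whenever that number falls below its current inclusion probability, which maximizes the overlap of successive samples. The list-sequential CP step that fixes the sample size is omitted, and the inclusion probabilities below are arbitrary.

        import numpy as np

        rng = np.random.default_rng(9)
        n_units = 1000

        u = rng.uniform(size=n_units)          # permanent random numbers, one per unit

        # Inclusion probabilities on two occasions (arbitrary illustrative values)
        pi_1 = rng.uniform(0.05, 0.30, n_units)
        pi_2 = np.clip(pi_1 * rng.uniform(0.8, 1.2, n_units), 0, 1)

        sample_1 = u < pi_1                    # Poisson sample, occasion 1
        sample_2 = u < pi_2                    # Poisson sample, occasion 2 (same PRNs)

        overlap = np.sum(sample_1 & sample_2)
        expected_indep = np.sum(pi_1 * pi_2)   # expected overlap without coordination
        print(f"overlap with PRNs: {overlap}, expected if independent: {expected_indep:.1f}")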

  16. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance

  17. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

    [Indexed excerpt] The approach consists of three processes: map learning (ML), localization and PP [73–76].

  18. Poisson regression for modeling count and frequency outcomes in trauma research.

    Science.gov (United States)

    Gagnon, David R; Doron-LaMarca, Susan; Bell, Margret; O'Farrell, Timothy J; Taft, Casey T

    2008-10-01

    The authors describe how the Poisson regression method for analyzing count or frequency outcome variables can be applied in trauma studies. The outcome of interest in trauma research may represent a count of the number of incidents of behavior occurring in a given time interval, such as acts of physical aggression or substance abuse. Traditional linear regression approaches assume a normally distributed outcome variable with equal variances over the range of predictor variables, and may not be optimal for modeling count outcomes. An application of Poisson regression is presented using data from a study of intimate partner aggression among male patients in an alcohol treatment program and their female partners. Results of Poisson regression and linear regression models are compared.
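    As an illustration of the model class described (not the study's data), a minimal Poisson regression on simulated counts can be fitted with the statsmodels GLM interface; the coefficients are on the log-rate scale, so exponentiating them gives rate ratios. All variable names and effect sizes below are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    treatment = rng.integers(0, 2, n)       # hypothetical binary predictor
    severity = rng.normal(0.0, 1.0, n)      # hypothetical continuous covariate
    rate = np.exp(0.5 + 0.8 * treatment + 0.3 * severity)
    counts = rng.poisson(rate)              # count outcome, e.g. incident tallies

    X = sm.add_constant(np.column_stack([treatment, severity]))
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(fit.params)   # should recover roughly [0.5, 0.8, 0.3]
    ```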

  19. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing them against the Support Vector Machines (SVM) and the Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical …

  20. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees for which precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment.

  1. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  2. Overview of the probabilistic risk assessment approach

    International Nuclear Information System (INIS)

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given, with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method; the PRA framework is, however, also applicable to wind and external flooding. 3 references, 8 figures, 1 table

  3. Non-holonomic dynamics and Poisson geometry

    International Nuclear Information System (INIS)

    Borisov, A V; Mamaev, I S; Tsiganov, A V

    2014-01-01

    This is a survey of basic facts presently known about non-linear Poisson structures in the analysis of integrable systems in non-holonomic mechanics. It is shown that by using the theory of Poisson deformations it is possible to reduce various non-holonomic systems to dynamical systems on well-understood phase spaces equipped with linear Lie-Poisson brackets. As a result, not only can different non-holonomic systems be compared, but also fairly advanced methods of Poisson geometry and topology can be used for investigating them. Bibliography: 95 titles

  4. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
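    The core computation is an integral of a bivariate Gaussian density over a disk that need not be centered on the error ellipse. A hedged Monte Carlo sketch follows (our simplification; the operational method may integrate numerically, and all figures are invented):

    ```python
    import numpy as np

    def strike_probability(mu, cov, center, radius, n=1_000_000, seed=0):
        # P(stroke lands within `radius` of `center`) when the stroke
        # location is bivariate Gaussian with mean mu and covariance cov
        rng = np.random.default_rng(seed)
        pts = rng.multivariate_normal(mu, cov, size=n)
        d2 = np.sum((pts - center) ** 2, axis=1)
        return float(np.mean(d2 <= radius ** 2))

    # Error ellipse centered 300 m east of the facility, elongated east-west
    mu = np.array([300.0, 0.0])
    cov = np.array([[200.0**2, 0.0],
                    [0.0, 100.0**2]])
    print(strike_probability(mu, cov, center=np.zeros(2), radius=150.0))
    ```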

  5. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then analyzed quantitatively and probabilistically in terms of their failure mechanisms, including common cause/common mode effects and time-dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a set of first-order, linear, inhomogeneous differential equations in the state vector, describing the time-dependent probabilities of the failures described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems.
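    In the simplest case the state-variable formulation reduces to a small system of linear ODEs. The toy two-state (up/down) Markov availability model below, with invented rates, sketches the idea; the FMSV model couples many such states according to the fault tree logic.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    lam, mu = 1e-3, 1e-1   # failure and repair rates per hour (illustrative)

    def odes(t, p):
        p_up, p_down = p
        return [-lam * p_up + mu * p_down,
                 lam * p_up - mu * p_down]

    sol = solve_ivp(odes, (0.0, 5000.0), [1.0, 0.0],
                    t_eval=np.linspace(0.0, 5000.0, 6))
    print(sol.y[1])   # time-dependent probability of the failed state
    ```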

  6. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  7. CANDU type fuel behavior evaluation - a probabilistic approach

    International Nuclear Information System (INIS)

    Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.

    1995-01-01

    In order to realistically assess the behavior of fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior in steady-state conditions. The Response Surface Method (RSM) has been selected for the investigation of the effects of the variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A newly developed version of the probabilistic code APMESRA based on RSM is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)

  8. Probabilistic structural integrity of reactor vessel under pressurized thermal shock

    International Nuclear Information System (INIS)

    Myung Jo Hhung; Young Hwan Choi; Hho Jung Kim; Changheui Jang

    2005-01-01

    Performed here is a comparative assessment study of the probabilistic fracture mechanics approach to the pressurized thermal shock of the reactor pressure vessel. A round robin consisting of 1 prerequisite study and 5 cases for probabilistic approaches is proposed, and all interested organizations are invited. The problems are solved and their results are compared in order to issue recommendations on best practices in this area and to assure an understanding of the key parameters of this type of approach, which will be useful in the justification, through a probabilistic approach, of the case of a plant exceeding the screening criteria. Six participants from 3 organizations in Korea responded to the problem and their results are compiled in this study. (authors)

  9. Poisson brackets of orthogonal polynomials

    OpenAIRE

    Cantero, María José; Simon, Barry

    2009-01-01

    For the standard symplectic forms on Jacobi and CMV matrices, we compute Poisson brackets of OPRL and OPUC, and relate these to other basic Poisson brackets and to Jacobians of basic changes of variable.

  10. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.

  11. Constructions and classifications of projective Poisson varieties.

    Science.gov (United States)

    Pym, Brent

    2018-01-01

    This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.

  13. Probabilistic approach to requalification of existing NPPs under aircraft crash loading

    International Nuclear Information System (INIS)

    Birbraer, A.N.; Roleder, A.J.; Shulman, G.S.

    1993-01-01

    A probabilistic approach to the analysis of NPP safety under aircraft impact is discussed. It may be used both for the requalification of existing NPPs and in the process of NPP design. The NPP is considered as a system of components: structures, pipes, different kinds of equipment, soil and foundation. Exceeding the limit probability of release of radioactive products from the containment (i.e. non-fulfilment of the NPP safety requirements) is taken as the system failure criterion. An example of an event tree representing the sequence of events causing the failure is given. Methods for estimating the probabilities of the elementary events, from which the composite probability of the failure is evaluated, are described. (author)

  14. Future trends in flood risk in Indonesia - A probabilistic approach

    Science.gov (United States)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to

  15. Application of Poisson random effect models for highway network screening.

    Science.gov (United States)

    Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

    2014-02-01

    In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data became popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. The Potential for Safety Improvement (PSI) was adopted as a measure of crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and the modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in the fitting of crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Nonlocal Poisson-Fermi model for ionic solvent.

    Science.gov (United States)

    Xie, Dexuan; Liu, Jinn-Liang; Eisenberg, Bob

    2016-07-01

    We propose a nonlocal Poisson-Fermi model for ionic solvent that includes ion size effects and polarization correlations among water molecules in the calculation of electrostatic potential. It includes the previous Poisson-Fermi models as special cases, and its solution is the convolution of a solution of the corresponding nonlocal Poisson dielectric model with a Yukawa-like kernel function. The Fermi distribution is shown to be a set of optimal ionic concentration functions in the sense of minimizing an electrostatic potential free energy. Numerical results are reported to show the difference between a Poisson-Fermi solution and a corresponding Poisson solution.

  17. Differential expression analysis for RNAseq using Poisson mixed models.

    Science.gov (United States)

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang

    2017-06-20

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n < 15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three datasets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
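    The over-dispersion that motivates the hierarchical treatment is easy to reproduce: a log-normal random effect on the Poisson rate pushes the variance-to-mean ratio (Fano factor) above 1. A small simulation sketch with invented parameters (this is not MACAU):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n, sigma2, base_rate = 100_000, 0.5, 10.0

    eps = rng.normal(0.0, np.sqrt(sigma2), n)      # per-sample random effect
    counts = rng.poisson(base_rate * np.exp(eps))  # over-dispersed counts

    print(counts.var() / counts.mean())  # well above 1, unlike pure Poisson
    ```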

  18. Tomography of images with Poisson noise: pre-processing of projections

    International Nuclear Information System (INIS)

    Furuie, S.S.

    1989-01-01

    This work presents an alternative approach to reconstructing images with a low signal-to-noise ratio. Basically, it consists of smoothing the projections, taking into account that the noise is Poisson. These filtered projections are then used to reconstruct the original image by applying the direct Fourier method. This approach is compared with convolution back-projection and EM (Expectation-Maximization). (author)

  19. A probabilistic approach to rock mechanical property characterization for nuclear waste repository design

    International Nuclear Information System (INIS)

    Kim, Kunsoo; Gao, Hang

    1996-01-01

    A probabilistic approach is proposed for the characterization of host rock mechanical properties at the Yucca Mountain site. This approach helps define the probability distribution of rock properties by utilizing extreme value statistics and Monte Carlo simulation. We analyze mechanical property data of tuff obtained by the NNWSI Project to assess the utility of the methodology. The analysis indicates that laboratory-measured strength and deformation data of Calico Hills and Bullfrog tuffs follow an extremal probability distribution (the third-type asymptotic distribution of smallest values). Monte Carlo simulation is carried out to estimate rock mass deformation moduli using a one-dimensional tuff model proposed by Zimmermann and Finley. We suggest that the results of these analyses be incorporated into the repository design.

  20. Collision prediction models using multivariate Poisson-lognormal regression.

    Science.gov (United States)

    El-Basyouny, Karim; Sayed, Tarek

    2009-07-01

    This paper advocates the use of multivariate Poisson-lognormal (MVPLN) regression to develop models for collision count data. The MVPLN approach presents an opportunity to incorporate the correlations across collision severity levels and their influence on safety analyses. The paper introduces a new multivariate hazardous location identification technique, which generalizes the univariate posterior probability of excess that has been commonly proposed and applied in the literature. In addition, the paper presents an alternative approach for quantifying the effect of the multivariate structure on the precision of expected collision frequency. The MVPLN approach is compared with the independent (separate) univariate Poisson-lognormal (PLN) models with respect to model inference, goodness-of-fit, identification of hot spots and precision of expected collision frequency. The MVPLN is modeled using the WinBUGS platform which facilitates computation of posterior distributions as well as providing a goodness-of-fit measure for model comparisons. The results indicate that the estimates of the extra Poisson variation parameters were considerably smaller under MVPLN leading to higher precision. The improvement in precision is due mainly to the fact that MVPLN accounts for the correlation between the latent variables representing property damage only (PDO) and injuries plus fatalities (I+F). This correlation was estimated at 0.758, which is highly significant, suggesting that higher PDO rates are associated with higher I+F rates, as the collision likelihood for both types is likely to rise due to similar deficiencies in roadway design and/or other unobserved factors. In terms of goodness-of-fit, the MVPLN model provided a superior fit than the independent univariate models. The multivariate hazardous location identification results demonstrated that some hazardous locations could be overlooked if the analysis was restricted to the univariate models.
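    The mechanism by which the MVPLN structure induces correlated counts can be sketched by simulation: draw correlated log-normal rates, then Poisson counts. The 0.758 latent correlation is taken from the abstract; all other values below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000
    mu = np.array([1.0, 0.2])          # latent log-rates: PDO and I+F
    rho, sig = 0.758, np.array([0.6, 0.6])
    cov = np.outer(sig, sig) * np.array([[1.0, rho],
                                         [rho, 1.0]])

    latent = rng.multivariate_normal(mu, cov, size=n)
    counts = rng.poisson(np.exp(latent))   # correlated severity counts
    print(np.corrcoef(counts.T)[0, 1])     # clearly positive
    ```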

  1. Hyperbolically Patterned 3D Graphene Metamaterial with Negative Poisson's Ratio and Superelasticity.

    Science.gov (United States)

    Zhang, Qiangqiang; Xu, Xiang; Lin, Dong; Chen, Wenli; Xiong, Guoping; Yu, Yikang; Fisher, Timothy S; Li, Hui

    2016-03-16

    A hyperbolically patterned 3D graphene metamaterial (GM) with negative Poisson's ratio and superelasticity is highlighted. It is synthesized by a modified hydrothermal approach and subsequent oriented freeze-casting strategy. GM presents a tunable Poisson's ratio by adjusting the structural porosity, macroscopic aspect ratio (L/D), and freeze-casting conditions. Such a GM suggests promising applications as soft actuators, sensors, robust shock absorbers, and environmental remediation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  3. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    Full Text Available The extraction of the breast boundary is crucial to perform further analysis of mammogram. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to automatically find the optimal threshold value by using histogram information. On the other hand, active contour models require defining a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method reaches 38% and 50% improvement with respect to the results obtained by the active contour model and threshold-based methods respectively, and it increases the stability of the boundary extraction process by up to 86%.

  4. Optimality of Poisson Processes Intensity Learning with Gaussian Processes

    NARCIS (Netherlands)

    Kirichenko, A.; van Zanten, H.

    2015-01-01

    In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational

  5. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  6. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood both by statisticians and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.

  7. Systematic design of 3D auxetic lattice materials with programmable Poisson's ratio for finite strains

    Science.gov (United States)

    Wang, Fengwen

    2018-05-01

    This paper presents a systematic approach for designing 3D auxetic lattice materials, which exhibit constant negative Poisson's ratios over large strain intervals. A unit cell model mimicking tensile tests is established and based on the proposed model, the secant Poisson's ratio is defined as the negative ratio between the lateral and the longitudinal engineering strains. The optimization problem for designing a material unit cell with a target Poisson's ratio is formulated to minimize the average lateral engineering stresses under the prescribed deformations. Numerical results demonstrate that 3D auxetic lattice materials with constant Poisson's ratios can be achieved by the proposed optimization formulation and that two sets of material architectures are obtained by imposing different symmetry on the unit cell. Moreover, inspired by the topology-optimized material architecture, a subsequent shape optimization is proposed by parametrizing material architectures using super-ellipsoids. By designing two geometrical parameters, simple optimized material microstructures with different target Poisson's ratios are obtained. By interpolating these two parameters as polynomial functions of Poisson's ratios, material architectures for any Poisson's ratio in the interval of ν ∈ [−0.78, 0.00] are explicitly presented. Numerical evaluations show that interpolated auxetic lattice materials exhibit constant Poisson's ratios in the target strain interval of [0.00, 0.20] and that 3D auxetic lattice material architectures with programmable Poisson's ratio are achievable.

  8. Validation of the probabilistic approach for the analysis of PWR transients

    International Nuclear Information System (INIS)

    Amesz, J.; Francocci, G.F.; Clarotti, C.

    1978-01-01

    This paper reviews the pilot study currently being carried out on the validation of probabilistic methodology with real data coming from the operational records of the PWR power station at Obrigheim (KWO, Germany), operating since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology with the a posteriori analysis of transients that actually occurred at the power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above-mentioned initiating events. The paper describes the a priori calculations performed using fault-tree analysis by means of a probabilistic code (SALP 3) and event trees coupled with a PWR system deterministic computer code (LOOP 7). Finally, the principal results of these analyses are presented and critically reviewed.

  9. Singular reduction of Nambu-Poisson manifolds

    Science.gov (United States)

    Das, Apurba

    The version of the Marsden–Ratiu Poisson reduction theorem for Nambu–Poisson manifolds by a regular foliation has been studied by Ibáñez et al. In this paper, we show that this reduction procedure can be extended to the singular case. Under a suitable notion of Hamiltonian flow on the reduced space, we show that a set of Hamiltonians on a Nambu–Poisson manifold can also be reduced.

  10. Infinitesimal deformations of Poisson bi-vectors using the Kontsevich graph calculus

    Science.gov (United States)

    Buring, Ricardo; Kiselev, Arthemy V.; Rutten, Nina

    2018-02-01

    Let \\mathscr{P} be a Poisson structure on a finite-dimensional affine real manifold. Can \\mathscr{P} be deformed in such a way that it stays Poisson? The language of Kontsevich graphs provides a universal approach - with respect to all affine Poisson manifolds - to finding a class of solutions to this deformation problem. For that reasoning, several types of graphs are needed. In this paper we outline the algorithms to generate those graphs. The graphs that encode deformations are classified by the number of internal vertices k; for k ≤ 4 we present all solutions of the deformation problem. For k ≥ 5, first reproducing the pentagon-wheel picture suggested at k = 6 by Kontsevich and Willwacher, we construct the heptagon-wheel cocycle that yields a new unique solution without 2-loops and tadpoles at k = 8.

  11. A probabilistic approach to the management of multi-stage multicriteria process

    Directory of Open Access Journals (Sweden)

    Yu. V. Bugaev

    2017-01-01

    Full Text Available Currently, any production process is viewed as the primary means of profit and competitiveness; in other words, the process approach has become dominant. In this approach, the manufacturing of the final product appears as a network of interconnected processing steps during which inputs are converted into outputs, and a stable, accurately executed process provides the planned quality most efficiently and cost-effectively. An example is the organization of bread production. The modern period is characterized by classical recovery technologies that improve the palatability of bread, enhance its flavor, and keep it fresh longer. Baking is a process to be controlled in order to obtain the required quality parameters of the final product. One of the new and promising methods of quality management for processes is a probabilistic method that determines the increase in the probability of releasing quality products within the resources allocated for measures to improve the quality level. In this paper, a quality management concept based on a probabilistic approach was applied to a multi-step process; one of the main criteria adopted is the probability of releasing high-quality products. However, it is obvious that implementing measures for its improvement requires committing certain resources, which is inevitably associated with certain cash costs. Thus, we arrive at an optimal control problem with at least two criteria: the probability of qualitatively completing the process, which should be maximized, and the total cost of the corrective measures, which should be minimized. The authors developed an idealized model of optimal control for the case when a single measure affects only a single step. A special case of the vector Warshall–Floyd algorithm was used to optimize the structure of the multi-step process. The use of vector optimization on graphs allowed the authors to

  12. On the fractal characterization of Paretian Poisson processes

    Science.gov (United States)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes-with respect to physical randomness-based measures of statistical heterogeneity-is characterized by exponential Poissonian intensities.

  13. An integrated approach to the probabilistic assessments of aircraft strikes and structural mode of damages to nuclear power plants

    International Nuclear Information System (INIS)

    Godbout, P.; Brais, A.

    1975-01-01

    The possibilities of an aircraft striking a Canadian nuclear power plant in the vicinity of an airport and of inducing structural failure modes have been evaluated. This evaluation, together with other studies, may enhance decisions in the development of general criteria for the siting of reactors near airports. The study used the probabilistic approach for assessment and made judicious application of the finite Canadian, French, German, American and English resources that were available. The tools, techniques and methods used for achieving the above form what may be called an integrated approach. This method of approach requires that the study be made in six consecutive steps, as follows: the qualitative evaluation of an aircraft strike on a site situated near an airport, with the use of the logic model technique; the statistical data gathering on aircraft movements and accidents; evaluating the probability distribution and calculating the basic event probabilities; evaluating the probability of an aircraft strike and the application of the sensitivity approach; generating the probability density distribution versus strike impact energy, that is, the evaluation of the energy envelope; and the probabilistic evaluation of structural failure mode inducements.

  14. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    Science.gov (United States)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes caused many fatalities and damages to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil Earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from the Newmark's model. First, by adopting Arias Intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to the Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate

  15. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    The cumulative Poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is, the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n) it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
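    The Newton step is convenient here because the derivative of the cumulative Poisson probability with respect to lambda telescopes to a single closed-form term. The Python sketch below (our illustration, not the original C code, and started from lambda = n + 1 rather than the documented n = 0 condition, which keeps the plain Newton step well behaved) solves P(X <= n) = p for lambda.

    ```python
    import math

    def poisson_cdf(n, lam):
        # P(X <= n) for X ~ Poisson(lam), by direct summation
        term = total = math.exp(-lam)
        for k in range(1, n + 1):
            term *= lam / k
            total += term
        return total

    def newtpois(n, p, eps=1e-12):
        # d/dlam P(X <= n) = -exp(-lam) * lam**n / n!  (the sum telescopes)
        lam = n + 1.0
        for _ in range(100):
            f = poisson_cdf(n, lam) - p
            dfdlam = -math.exp(-lam) * lam**n / math.factorial(n)
            lam -= f / dfdlam
            if abs(f) < eps:
                break
        return lam

    print(newtpois(5, 0.95))   # lambda such that P(X <= 5) = 0.95, about 2.61
    ```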

  16. A probabilistic approach to the computation of the levelized cost of electricity

    International Nuclear Information System (INIS)

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
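    A bare-bones Monte Carlo LCOE sketch shows the effect the paper quantifies: sampling correlated inputs rather than independent ones changes the spread (and, for a nonlinear output, the mean) of the LCOE distribution. All cost figures, the lifetime and the correlation below are invented, and external costs are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n, r = 100_000, 0.05
    disc = (1.0 + r) ** -np.arange(1, 41)      # 40-year discount factors

    # Correlated uncertain inputs: capex [$/kW], annual O&M [$/kW-yr]
    mean = np.array([5000.0, 100.0])
    cov = np.array([[800.0**2, 0.5 * 800.0 * 20.0],
                    [0.5 * 800.0 * 20.0, 20.0**2]])
    capex, om = rng.multivariate_normal(mean, cov, size=n).T

    energy = disc.sum() * 8.0                  # discounted MWh per kW (~91% CF)
    lcoe = (capex + om * disc.sum()) / energy  # $/MWh

    print(lcoe.mean(), np.percentile(lcoe, [5, 95]))
    ```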

  17. All-possible-couplings approach to measuring probabilistic context.

    Directory of Open Access Journals (Sweden)

    Ehtibar N Dzhafarov

    Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i a system outputs several random variables in response to several inputs, (ii for each of these responses only some of the inputs may "directly" influence them, but (iii other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible values of inputs.

  18. Avoiding negative populations in explicit Poisson tau-leaping.

    Science.gov (United States)

    Cao, Yang; Gillespie, Daniel T; Petzold, Linda R

    2005-08-01

    The explicit tau-leaping procedure attempts to speed up the stochastic simulation of a chemically reacting system by approximating the number of firings of each reaction channel during a chosen time increment tau as a Poisson random variable. Since the Poisson random variable can have arbitrarily large sample values, there is always the possibility that this procedure will cause one or more reaction channels to fire so many times during tau that the population of some reactant species will be driven negative. Two recent papers have shown how that unacceptable occurrence can be avoided by replacing the Poisson random variables with binomial random variables, whose values are naturally bounded. This paper describes a modified Poisson tau-leaping procedure that also avoids negative populations, but is easier to implement than the binomial procedure. The new Poisson procedure also introduces a second control parameter, whose value essentially dials the procedure from the original Poisson tau-leaping at one extreme to the exact stochastic simulation algorithm at the other; therefore, the modified Poisson procedure will generally be more accurate than the original Poisson procedure.
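    A much-simplified sketch of the safeguard on a birth-death process: whenever the reactant count is at or below a critical threshold n_c, the consuming channel is fired exactly (at most once per leap) instead of being Poisson-leaped, so the small-population regime cannot be driven negative. The full procedure additionally selects tau adaptively and truncates the leap at a critical firing time; those refinements are omitted here, and all rates are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    k1, k2 = 5.0, 0.1          # birth rate, per-molecule death rate
    x, t, t_end, tau, n_c = 4, 0.0, 50.0, 0.5, 10

    while t < t_end:
        a_death = k2 * x
        if x <= n_c:           # death channel is critical: fire it exactly
            t_crit = rng.exponential(1.0 / a_death) if a_death > 0 else np.inf
            deaths = 1 if t_crit < tau else 0
        else:                  # ordinary Poisson leap; with x > n_c, a draw
            deaths = rng.poisson(a_death * tau)  # exceeding x is vanishingly rare
        x += rng.poisson(k1 * tau) - deaths
        t += tau

    print(x)   # fluctuates around k1 / k2 = 50
    ```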

  19. A LATENT CLASS POISSON REGRESSION-MODEL FOR HETEROGENEOUS COUNT DATA

    NARCIS (Netherlands)

    Wedel, M; DeSarbo, WS; Bult; Ramaswamy

    1993-01-01

    In this paper an approach is developed that accommodates heterogeneity in Poisson regression models for count data. The model developed assumes that heterogeneity arises from a distribution of both the intercept and the coefficients of the explanatory variables. We assume that the mixing

  20. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the existing uncertainties in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, which are based on the two-point estimation method and the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although an analytical approach can reduce the amount of computation of probabilistic SSSA, the different degrees of approximation that are adopted lead to misleading results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)

  1. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of special interest to researchers in the field of Operational Research.

  2. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    Science.gov (United States)

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time-stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying the models to real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model, but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
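    The equivalence the authors rely on is easy to check numerically: a Poisson GLM with stratum indicator variables recovers the same exposure effect that the conditional model estimates without the nuisance parameters (the conditional fit itself is available in R's gnm package, for example). A sketch on simulated time-stratified data, with all names and values invented:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n_strata, days, beta = 60, 28, 0.01   # beta: true log-rate per unit exposure
    df = pd.DataFrame({
        "stratum": np.repeat(np.arange(n_strata), days),
        "pollution": rng.normal(50.0, 10.0, n_strata * days),
    })
    stratum_effect = rng.normal(0.0, 0.3, n_strata)[df["stratum"]]
    df["deaths"] = rng.poisson(np.exp(2.0 + stratum_effect
                                      + beta * df["pollution"]))

    # Stratum indicators absorb the confounding by time stratum
    fit = smf.glm("deaths ~ pollution + C(stratum)", data=df,
                  family=sm.families.Poisson()).fit()
    print(fit.params["pollution"])   # close to 0.01
    ```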

  3. A probabilistic approach of sum rules for heat polynomials

    International Nuclear Information System (INIS)

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)

  4. Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students

    Science.gov (United States)

    Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.

    2011-01-01

    This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…

  5. Basic Ideas to Approach Metastability in Probabilistic Cellular Automata

    NARCIS (Netherlands)

    Cirillo, Emilio N. M.; Nardi, Francesca R.; Spitoni, Cristian

    2016-01-01

    Cellular Automata are discrete-time dynamical systems on a spatially extended discrete space which provide paradigmatic examples of nonlinear phenomena. Their stochastic generalizations, i.e., Probabilistic Cellular Automata, are discrete-time Markov chains on a lattice with finite single-cell

  6. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.

  7. Structural interactions in ionic liquids linked to higher-order Poisson-Boltzmann equations

    Science.gov (United States)

    Blossey, R.; Maggs, A. C.; Podgornik, R.

    2017-06-01

    We present a derivation of generalized Poisson-Boltzmann equations starting from classical theories of binary fluid mixtures, employing an approach based on the Legendre transform as recently applied to the case of local descriptions of the fluid free energy. Under specific symmetry assumptions, and in the linearized regime, the Poisson-Boltzmann equation reduces to a phenomenological equation introduced by Bazant et al. [Phys. Rev. Lett. 106, 046102 (2011)], 10.1103/PhysRevLett.106.046102, whereby the structuring near the surface is determined by bulk coefficients.

  8. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined than the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program, with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  9. Poisson Mixture Regression Models for Heart Disease Prediction.

    Science.gov (United States)

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise for the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using a Poisson mixture regression model.
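
    As a hedged illustration of the modelling idea (not the authors' code), the standard two-component Poisson mixture can be fitted with a few lines of EM on simulated counts; the concomitant-variable and zero-inflated variants extend this scheme.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
y = np.concatenate([rng.poisson(2.0, 300),   # "low risk" component
                    rng.poisson(9.0, 120)])  # "high risk" component

pi, lam = 0.5, np.array([1.0, 5.0])          # initial guesses
for _ in range(200):
    # E-step: responsibility of component 1 for each observation
    p1 = pi * poisson.pmf(y, lam[0])
    p2 = (1.0 - pi) * poisson.pmf(y, lam[1])
    r = p1 / (p1 + p2)
    # M-step: update mixing weight and component means
    pi = r.mean()
    lam = np.array([np.average(y, weights=r),
                    np.average(y, weights=1.0 - r)])
print(pi, lam)   # recovers ~0.71 and component means near 2 and 9
```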

  11. Singularities of Poisson structures and Hamiltonian bifurcations

    NARCIS (Netherlands)

    Meer, van der J.C.

    2010-01-01

    Consider a Poisson structure on C∞(R³, R) with bracket {·, ·} and suppose that C is a Casimir function. Then {f, g} = ⟨∇C, ∇g × ∇f⟩ is a possible Poisson structure. This confirms earlier observations concerning the Poisson structure for Hamiltonian systems that are reduced to one degree of freedom

  12. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...... on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey....

  13. Intermediate probabilistic safety assessment approach for safety critical digital systems

    International Nuclear Information System (INIS)

    Taeyong, Sung; Hyun Gook, Kang

    2001-01-01

    Even though the conventional probabilistic safety assessment methods are immature for application to microprocessor-based digital systems, practical needs compel their use. In Korea, the UCN 5 and 6 units are being constructed, and the Korean Next Generation Reactor is being designed, using digital instrumentation and control equipment for the safety-related functions. The Korean regulatory body requires probabilistic safety assessment. This paper analyzes the difficulties in assessing digital systems and suggests an intermediate framework for evaluating their safety using fault tree models. The framework deals with several important characteristics of digital systems, including software modules and fault-tolerant features. We expect that the analysis result will provide valuable design feedback. (authors)

  14. Decomposition of almost-Poisson structure of generalised Chaplygin's nonholonomic systems

    International Nuclear Information System (INIS)

    Chang, Liu; Peng, Chang; Shi-Xing, Liu; Yong-Xin, Guo

    2010-01-01

    This paper constructs an almost-Poisson structure for non-self-adjoint dynamical systems, which can be decomposed into the sum of a Poisson bracket and another almost-Poisson bracket. The necessary and sufficient condition for the decomposition of the almost-Poisson bracket into two Poisson ones is obtained. As an application, the almost-Poisson structure for generalised Chaplygin's systems is discussed in the framework of the decomposition theory. It is proved that the almost-Poisson bracket for these systems can be decomposed into the sum of a canonical Poisson bracket and two other noncanonical Poisson brackets in some special cases, which is useful for integrating the equations of motion

  15. Poisson Spot with Magnetic Levitation

    Science.gov (United States)

    Hoover, Matthew; Everhart, Michael; D'Arruda, Jose

    2010-01-01

    In this paper we describe a unique method for obtaining the famous Poisson spot without adding obstacles to the light path, which could interfere with the effect. A Poisson spot is the interference effect from parallel rays of light diffracting around a solid spherical object, creating a bright spot in the center of the shadow.

  16. Periodic Poisson Solver for Particle Tracking

    International Nuclear Information System (INIS)

    Dohlus, M.; Henning, C.

    2015-05-01

    A method is described to solve the Poisson problem for a three-dimensional source distribution that is periodic in one direction. Perpendicular to the direction of periodicity, a free-space (or open) boundary is realized. In beam physics, this approach makes it possible to calculate the space charge field of a continualized charged particle distribution with a periodic pattern. The method is based on a particle-mesh approach with an equidistant grid and fast convolution with a Green's function. The periodic approach uses only one period of the source distribution, but a periodic extension of the Green's function. The approach is numerically efficient and allows the investigation of periodic and pseudo-periodic structures with period lengths that are small compared to the source dimensions, for instance laser-modulated beams or the evolution of micro-bunch structures. Applications to laser-modulated beams are given.
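
    The spectral ingredient of such a solver can be illustrated in one dimension (the paper's method is three-dimensional with open transverse boundaries, so this sketch only shows the periodic-direction idea): solve -u'' = rho with period L by FFT.

```python
import numpy as np

n, L = 256, 2 * np.pi
x = np.linspace(0.0, L, n, endpoint=False)
rho = np.sin(3 * x)                           # zero-mean periodic source

k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)    # angular wavenumbers
rho_hat = np.fft.fft(rho)
u_hat = np.zeros_like(rho_hat)
u_hat[1:] = rho_hat[1:] / k[1:] ** 2          # -u'' = rho  <=>  k^2 u_hat = rho_hat
u = np.fft.ifft(u_hat).real

print(np.max(np.abs(u - np.sin(3 * x) / 9)))  # exact solution is sin(3x)/9
```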

  17. Stationary response of multi-degree-of-freedom vibro-impact systems to Poisson white noises

    International Nuclear Information System (INIS)

    Wu, Y.; Zhu, W.Q.

    2008-01-01

    The stationary response of multi-degree-of-freedom (MDOF) vibro-impact (VI) systems to random pulse trains is studied. The system is formulated as a stochastically excited and dissipated Hamiltonian system. The constraints are modeled as non-linear springs according to the Hertz contact law. The random pulse trains are modeled as Poisson white noises. The approximate stationary probability density function (PDF) for the response of MDOF dissipated Hamiltonian systems to Poisson white noises is obtained by solving the fourth-order generalized Fokker-Planck-Kolmogorov (FPK) equation using a perturbation approach. As examples, two-degree-of-freedom (2DOF) VI systems under external and parametric Poisson white noise excitations, respectively, are investigated. The validity of the proposed approach is confirmed by the results obtained from Monte Carlo simulation. It is shown that the non-Gaussian behaviour depends on the product of the mean arrival rate of the impulses and the relaxation time of the oscillator
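
    The Monte Carlo check mentioned in the abstract can be sketched for a single-degree-of-freedom linear oscillator (a stand-in, not the paper's vibro-impact Hamiltonian model) driven by a Poisson train of Gaussian impulses; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
wn, zeta = 1.0, 0.05           # natural frequency, damping ratio (assumed)
rate, sigma = 5.0, 0.2         # impulse arrival rate, impulse magnitude std
dt, T = 1e-3, 500.0

x, v, samples = 0.0, 0.0, []
for i in range(int(T / dt)):
    if rng.random() < rate * dt:            # Poisson arrival in this step
        v += rng.normal(0.0, sigma)         # an impulse changes the velocity
    v += (-2 * zeta * wn * v - wn**2 * x) * dt
    x += v * dt
    if i * dt > 50.0 and i % 100 == 0:      # skip the transient, thin samples
        samples.append(x)
print(np.std(samples))                      # stationary displacement std
```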

  18. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  19. Newton/Poisson-Distribution Program

    Science.gov (United States)

    Bowerman, Paul N.; Scheuer, Ernest M.

    1990-01-01

    NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines the Poisson parameter for a given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for chi-square distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
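
    The core computation, finding the Poisson mean that yields a given cumulative probability, can be reproduced with a bracketed root solve; this is a sketch in Python, not the original C program.

```python
from scipy.optimize import brentq
from scipy.stats import poisson

def poisson_mean_for_cdf(k, p):
    """Return mu such that P(X <= k) = p for X ~ Poisson(mu)."""
    return brentq(lambda mu: poisson.cdf(k, mu) - p, 1e-9, 1e3)

mu = poisson_mean_for_cdf(k=5, p=0.9)
print(mu, poisson.cdf(5, mu))   # mu ~ 3.15; cdf check returns ~0.9
```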

  20. Probabilistic approach to the prediction of radioactive contamination of agricultural production

    International Nuclear Information System (INIS)

    Fesenko, S.F.; Chernyaeva, L.G.; Sanzharova, N.I.; Aleksakhin, R.M.

    1993-01-01

    The organization of agricultural production on the territory contaminated as a result of the Chernobyl reactor disaster involves prediction of the content of radionuclides in agro-industrial products. Traditional methods of predicting the contamination of the products do not give sufficient agreement with actual data, and as a result it is difficult to make the necessary decisions about eliminating the consequences of the disaster in the agro-industrial complex. In many ways this is because the available methods are based on data on the radionuclide content in soils, plants, and plant and animal products. The parameters of the models used in the prediction are also evaluated on the basis of these results. Even if obtained from a single field or herd of livestock, however, such indicators have substantial variation coefficients due to various factors such as the spatial structure of the fallout, the variability of the soil properties, the sampling error, the errors of processing and measuring the samples, as well as the data-averaging error. Consequently the parameters of radionuclide transfer along the agricultural chains are very variable, thus considerably reducing the reliability of predicted values. The reliability of the prediction of radioactive contamination of agricultural products can be increased substantially by taking a probabilistic approach involving information about the random laws of contamination of farming land and the statistical features of the parameters of radionuclide migration along food chains. Considering the above, a comparative analysis is made of the results obtained on the basis of the traditional treatment (deterministic in its simplest form) and its probabilistic analog.

  1. A Martingale Characterization of Mixed Poisson Processes.

    Science.gov (United States)

    1985-10-01

    Dietmar Pfeifer (Technical University Aachen). Mixed Poisson processes play an important role in many branches of applied probability, for instance in insurance mathematics and physics (see Albrecht

  2. A probabilistic approach for optimal sensor allocation in structural health monitoring

    International Nuclear Information System (INIS)

    Azarbayejani, M; Reda Taha, M M; El-Osery, A I; Choi, K K

    2008-01-01

    Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial to determine the optimal number and locations of sensors in SHM systems. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network trained from simulations using a priori knowledge about damage locations and damage severities to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure

  3. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. For a further improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  4. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

    Soil-structure interaction (SSI) analysis is a major step in the seismic design of massive and stiff structures typical of nuclear facilities and civil infrastructures such as tunnels, underground stations, dams and lock head structures. Currently most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the in-structure response spectra (ISRS), with significant impositions of design and equipment qualification costs, especially in the case of high-frequency-sensitive equipment at stiff soil or rock sites. The reluctance to adopt a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision-making process, whether by owners or regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  6. Reduction of Nambu-Poisson Manifolds by Regular Distributions

    Science.gov (United States)

    Das, Apurba

    2018-03-01

    The version of the Marsden-Ratiu reduction theorem for Nambu-Poisson manifolds by a regular distribution has been studied by Ibáñez et al. In this paper we show that the reduction is always ensured unless the distribution is zero. Next we extend the more general Falceto-Zambon Poisson reduction theorem to Nambu-Poisson manifolds. Finally, we define gauge transformations of Nambu-Poisson structures and show that these transformations commute with the reduction procedure.

  7. Probabilistic risk assessment on maritime spent nuclear fuel transportation (Part II: Ship collision probability)

    International Nuclear Information System (INIS)

    Christian, Robby; Kang, Hyun Gook

    2017-01-01

    This paper proposes a methodology to assess and reduce the risks of maritime spent nuclear fuel transportation with a probabilistic approach. Event trees detailing the progression of collisions leading to transport cask damage were constructed. Parallel and crossing collision probabilities were formulated based on the Poisson distribution. Automatic Identification System (AIS) data were processed with the Hough Transform algorithm to estimate possible intersections between the shipment route and the marine traffic. Monte Carlo simulations were performed to compute collision probabilities and impact energies at each intersection. Possible safety improvement measures through a proper selection of operational transport parameters were investigated. These parameters include the shipment route, the ship's cruise velocity, the number of transport casks carried in a shipment, and the casks' stowage configuration and loading order on board the ship. A shipment case study is presented. Waters with high collision probabilities were identified. An effective range of cruising velocities to reduce collision risks was discovered. The number of casks per shipment and the stowage method that gave low cask damage frequencies were obtained. The proposed methodology was successful in quantifying ship collision and cask damage frequencies, and effective in assisting decision-making processes to minimize risks in maritime spent nuclear fuel transportation. - Highlights: • Proposes a probabilistic framework for the safety of spent nuclear fuel transportation by sea. • Developed a marine traffic simulation model using the Generalized Hough Transform (GHT) algorithm. • A transportation case study on South Korean waters is presented. • Single-vessel risk reduction method is outlined by optimizing transport parameters.
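
    The Poisson collision formulation can be sketched with illustrative numbers (not the study's AIS-derived parameters): with encounter rate lam and exposure time dt at an intersection, the collision-candidate probability is 1 - exp(-lam * dt), combined here assuming independent intersections along the route.

```python
import numpy as np

lam = np.array([0.8, 0.2, 1.5])   # crossing-traffic encounter rates (1/h)
dt = np.array([0.10, 0.25, 0.05]) # exposure window at each intersection (h)
p_each = 1.0 - np.exp(-lam * dt)  # P(at least one encounter) per intersection
p_route = 1.0 - np.prod(1.0 - p_each)
print(p_each, p_route)
```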

  8. Poisson's spot and Gouy phase

    Science.gov (United States)

    da Paz, I. G.; Soldati, Rodolfo; Cabral, L. A.; de Oliveira, J. G. G.; Sampaio, Marcos

    2016-12-01

    Recently there have been experimental results on Poisson spot matter-wave interferometry followed by theoretical models describing the relative importance of the wave and particle behaviors for the phenomenon. We propose an analytical theoretical model for Poisson's spot with matter waves based on the Babinet principle, in which we use the results for free propagation and single-slit diffraction. We take into account effects of loss of coherence and finite detection area using the propagator for a quantum particle interacting with an environment. We observe that the matter-wave Gouy phase plays a role in the existence of the central peak and thus corroborates the predominantly wavelike character of the Poisson's spot. Our model shows remarkable agreement with the experimental data for deuterium (D2) molecules.

  9. Log-normal frailty models fitted as Poisson generalized linear mixed models.

    Science.gov (United States)

    Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver

    2016-12-01

    The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known for decades. As shown in recent studies, this equivalence carries over to clustered survival data: a frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model. Our new %PCFrailty macro provides proper estimates, especially in the case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data with respect to more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
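
    The piecewise-exponential trick is easy to sketch without the frailty term: each subject-piece becomes one row, the event indicator is the response, and log(time at risk) enters as an offset in a Poisson GLM. The example below uses simulated data and illustrative names, not the %PCFrailty macro.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
x = rng.binomial(1, 0.5, n)                  # a binary covariate
t = rng.exponential(1.0 / np.exp(0.7 * x))   # exponential survival times
obs = np.minimum(t, 2.0)                     # administrative censoring at 2
event = (t <= 2.0).astype(int)

cuts = [0.0, 0.5, 1.0, 2.0]                  # pieces for the baseline hazard
rows = []
for ti, ei, xi in zip(obs, event, x):
    for a, b in zip(cuts[:-1], cuts[1:]):
        if ti <= a:
            break                            # subject left risk set earlier
        at_risk = min(ti, b) - a             # exposure time in this piece
        d = int(ei and ti <= b)              # event occurs in this piece?
        rows.append((d, np.log(at_risk), f"[{a},{b})", xi))
df = pd.DataFrame(rows, columns=["d", "log_t", "piece", "x"])

fit = smf.glm("d ~ x + C(piece)", data=df, offset=df["log_t"],
              family=sm.families.Poisson()).fit()
print(fit.params["x"])                       # ~0.7, the true log hazard ratio
```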

  10. Specifying design conservatism: Worst case versus probabilistic analysis

    Science.gov (United States)

    Miles, Ralph F., Jr.

    1993-01-01

    Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.

  11. Specification of test criteria and probabilistic approach: the case of plutonium air transport

    International Nuclear Information System (INIS)

    Hubert, P.; Pages, P.; Ringot, C.; Tomachewsky, E.

    1989-03-01

    The safety of international transportation relies on compliance with IAEA regulations, which specify a series of tests that the package is supposed to withstand. For plutonium air transport, some national regulations, notably the US regulations, are more stringent than those of the IAEA. For example, the drop test is to be performed at 129 m/s instead of 13.4 m/s. The development of international plutonium exchanges has raised the question of the adequacy of both standards. The purpose of this paper is to show how a probabilistic approach helps in assessing the efficiency of a move towards more stringent tests.

  12. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K.; O'Carroll, C.; Van De Laar, J. [CEC Joint Research Centre, Karlsruhe (Germany)]

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence, indicating that models are not exact and that some of the physical processes encountered are of a stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.

  13. Unimodularity criteria for Poisson structures on foliated manifolds

    Science.gov (United States)

    Pedroza, Andrés; Velasco-Barreras, Eduardo; Vorobiev, Yury

    2018-03-01

    We study the behavior of the modular class of an orientable Poisson manifold and formulate some unimodularity criteria in the semilocal context, around a (singular) symplectic leaf. Our results generalize some known unimodularity criteria for regular Poisson manifolds related to the notion of the Reeb class. In particular, we show that the unimodularity of the transverse Poisson structure of the leaf is a necessary condition for the semilocal unimodular property. Our main tool is an explicit formula for a bigraded decomposition of modular vector fields of a coupling Poisson structure on a foliated manifold. Moreover, we also exploit the notion of the modular class of a Poisson foliation and its relationship with the Reeb class.

  14. Non-isothermal Smoluchowski-Poisson equation as a singular limit of the Navier-Stokes-Fourier-Poisson system

    Czech Academy of Sciences Publication Activity Database

    Feireisl, Eduard; Laurençot, P.

    2007-01-01

    Roč. 88, - (2007), s. 325-349 ISSN 0021-7824 R&D Projects: GA ČR GA201/05/0164 Institutional research plan: CEZ:AV0Z10190503 Keywords: Navier-Stokes-Fourier-Poisson system * Smoluchowski-Poisson system * singular limit Subject RIV: BA - General Mathematics Impact factor: 1.118, year: 2007

  15. Perturbation-induced emergence of Poisson-like behavior in non-Poisson systems

    International Nuclear Information System (INIS)

    Akin, Osman C; Grigolini, Paolo; Paradisi, Paolo

    2009-01-01

    The response of a system with ON–OFF intermittency to an external harmonic perturbation is discussed. ON–OFF intermittency is described by means of a sequence of random events, i.e., the transitions from the ON to the OFF state and vice versa. The unperturbed waiting times (WTs) between two events are assumed to satisfy a renewal condition, i.e., the WTs are statistically independent random variables. The response of a renewal model with non-Poisson ON–OFF intermittency, associated with non-exponential WT distribution, is analyzed by looking at the changes induced in the WT statistical distribution by the harmonic perturbation. The scaling properties are also studied by means of diffusion entropy analysis. It is found that, in the range of fast and relatively strong perturbation, the non-Poisson system displays a Poisson-like behavior in both WT distribution and scaling. In particular, the histogram of perturbed WTs becomes a sequence of equally spaced peaks, with intensity decaying exponentially in time. Further, the diffusion entropy detects an ordinary scaling (related to normal diffusion) instead of the expected unperturbed anomalous scaling related to the inverse power-law decay. Thus, an analysis based on the WT histogram and/or on scaling methods has to be considered with some care when dealing with perturbed intermittent systems

  16. Principles of applying Poisson units in radiology

    International Nuclear Information System (INIS)

    Benyumovich, M.S.

    2000-01-01

    The probability that radioactive particles hit particular spatial patterns (e.g. cells in the squares of a counting chamber net) or time intervals (e.g. radioactive particles hitting a given area per unit time) follows the Poisson distribution. The mean is the only parameter on which this distribution depends. A metrological basis for counting cells and radioactive particles is a property of the Poisson distribution: the equality of the standard deviation to the square root of the mean (property 1). The application of Poisson units in counting blood formed elements and cultured cells was proposed by us (Russian Federation Patent No. 2126230). Poisson units relate to the means which make property 1 valid. In the case of cell counting, the square of these units is equal to 1/10 of one square of the counting chamber net in which the cells are counted. Thus one finds the mean from the single-cell count rate divided by 10. Finding the Poisson units when counting radioactive particles assumes determination of a number of these particles sufficient to make equality 1 valid. To this end one should subdivide the time interval used in counting a single-particle count rate into different numbers of equal portions (count numbers). Next one should pick out the count number ensuring the satisfaction of equality 1. Such a portion is taken as a Poisson unit in the radioactive particle count. If the flux of particles is controllable, one should set a count rate sufficient to make equality 1 valid. Operations with means obtained with the use of Poisson units are performed on the basis of approximating the Poisson distribution by a normal one. (author)
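
    Property 1, the standard deviation of a Poisson count equalling the square root of its mean, is easy to verify by simulation (a sketch with an illustrative mean, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(4)
counts = rng.poisson(lam=25.0, size=100_000)  # e.g. particles per time unit
print(counts.mean(), counts.std())            # std ~ sqrt(mean) = 5
```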

  17. A Seemingly Unrelated Poisson Regression Model

    OpenAIRE

    King, Gary

    1989-01-01

    This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.

  18. Poisson geometry from a Dirac perspective

    Science.gov (United States)

    Meinrenken, Eckhard

    2018-03-01

    We present proofs of classical results in Poisson geometry using techniques from Dirac geometry. This article is based on mini-courses at the Poisson summer school in Geneva, June 2016, and at the workshop Quantum Groups and Gravity at the University of Waterloo, April 2016.

  19. On terminating Poisson processes in some shock models

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Maxim, E-mail: FinkelMI@ufs.ac.z [Department of Mathematical Statistics, University of the Free State, Bloemfontein (South Africa); Max Planck Institute for Demographic Research, Rostock (Germany); Marais, Francois, E-mail: fmarais@csc.co [CSC, Cape Town (South Africa)

    2010-08-15

    A system subject to a point process of shocks is considered. Shocks occur in accordance with the homogeneous Poisson process. Different criteria of system failure (termination) are discussed and the corresponding probabilities of failure (accident)-free performance are derived. The described analytical approach is based on deriving integral equations for each setting and solving these equations through the Laplace transform. Some approximations are analyzed and further generalizations and applications are discussed.
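
    One classical special case of this set-up can be sketched (not the paper's integral-equation derivation): if shocks arrive as a homogeneous Poisson process with rate lam and each shock is independently fatal with probability theta, thinning gives P(no failure by t) = exp(-lam * theta * t), which a Monte Carlo check confirms.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, theta, t, n = 2.0, 0.3, 4.0, 200_000
shocks = rng.poisson(lam * t, n)                     # shocks in [0, t]
fatal = shocks - rng.binomial(shocks, 1.0 - theta)   # number of fatal shocks
print(np.mean(fatal == 0), np.exp(-lam * theta * t)) # both ~0.0907
```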

  1. Probabilistic SSME blades structural response under random pulse loading

    Science.gov (United States)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.

  2. Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis

    2003-01-01

    Probabilistic fault diagnosis, which represents the relationship between the causes and consequences of events for troubleshooting, is developed in this research based on Bayesian networks. The contribution of on-line data from sensors, together with system/component reliability in the cause nodes, is expected to increase the belief level of the Bayesian networks. (author)

  3. Dirichlet forms methods for Poisson point measures and Lévy processes with emphasis on the creation-annihilation techniques

    CERN Document Server

    Bouleau, Nicolas

    2015-01-01

    A simplified approach to Malliavin calculus adapted to Poisson random measures is developed and applied in this book. Called the “lent particle method” it is based on perturbation of the position of particles. Poisson random measures describe phenomena involving random jumps (for instance in mathematical finance) or the random distribution of particles (as in statistical physics). Thanks to the theory of Dirichlet forms, the authors develop a mathematical tool for a quite general class of random Poisson measures and significantly simplify computations of Malliavin matrices of Poisson functionals. The method gives rise to a new explicit calculus that they illustrate on various examples: it consists in adding a particle and then removing it after computing the gradient. Using this method, one can establish absolute continuity of Poisson functionals such as Lévy areas, solutions of SDEs driven by Poisson measure and, by iteration, obtain regularity of laws. The authors also give applications to error calcul...

  4. A novel approach for voltage secure operation using Probabilistic Neural Network in transmission network

    Directory of Open Access Journals (Sweden)

    Santi Behera

    2016-05-01

    This work proposes a unique approach for improving the voltage stability limit using a Probabilistic Neural Network (PNN) classifier that gives the corrective controls available in the system under contingencies. The sensitivity of the system is analyzed to identify weak buses, with the ENVCI evaluation approaching zero. The inputs for training the classifier, termed the voltage stability enhancing neural network (VSENN) classifier, are line flows and bus voltages near the notch point of the P–V curve, and the output of the VSENN is a control variable. For various contingencies, the control action that improves the voltage profile as well as the stability index is identified and trained accordingly. The trained VSENN is finally tested for its robustness in improving the load margin as well as the ENVCI, beyond the trained set of operating conditions of the system and contingencies. The proposed approach is verified on the IEEE 39-bus test system.

  5. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables

  7. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Directory of Open Access Journals (Sweden)

    Kazimierz Banasik

    2014-04-01

    The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service (SCS)-CN method in a small agricultural, lowland watershed (23.4 km² to the gauging station) in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV), normal and general logistic (GLO) distributions for 100-CN, and the GLO, lognormal and GEV distributions for S. The characteristics estimated from the theoretical distributions (median, quantiles) were compared to the tabulated CN and to the antecedent runoff conditions (ARCs) of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median, and between the ARCs of Hawkins and Hjelmfelt, which confirmed a good calibration of the model. However, dividing the CN sample by heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis shows that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
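
    The distribution-fitting step can be sketched with scipy on simulated curve numbers (not the watershed data): fit a GEV to 100 - CN and read off the median and quantiles for comparison with the tabulated CN.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(6)
cn = np.clip(rng.normal(78.0, 6.0, 80), 40.0, 99.0)  # fake empirical CNs
params = genextreme.fit(100.0 - cn)                  # shape, loc, scale
dist = genextreme(*params)
print(100.0 - dist.ppf(0.5))             # median CN, to compare with tables
print(100.0 - dist.ppf([0.9, 0.1]))      # an 80% interval for CN
```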

  8. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to the production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy...... devices is a new and expanding technical area with no tradition for probabilistic design; in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered of great importance to develop...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy

  9. xLPR - a probabilistic approach to piping integrity analysis

    International Nuclear Information System (INIS)

    Harrington, C.; Rudland, D.; Fyfitch, S.

    2015-01-01

    The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015

  10. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....

  11. Application of the probabilistic method at the E.D.F

    International Nuclear Information System (INIS)

    Gachot, Bernard

    1976-01-01

    Having first evoked the problems arising from the definition of a so-called 'acceptable risk', the probabilistic safety study programme carried out at the E.D.F. is described. The different aspects of the probabilistic estimation of a hazard are presented, as well as the different steps, i.e. collecting the information and carrying out quantitative and qualitative analyses, which characterize the probabilistic study of safety problems. The problem of determining data on equipment reliability is considered, noting in conclusion that, in spite of the limited accuracy of present data, probabilistic methods already appear to be a highly valuable tool favouring a homogeneous and coherent approach to nuclear plant safety [fr

  12. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  13. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  14. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  15. Implementation of upper limit calculation for a poisson variable by bayesian approach

    International Nuclear Information System (INIS)

    Zhu Yongsheng

    2008-01-01

    The calculation of the Bayesian confidence upper limit for a Poisson variable including both signal and background, with and without systematic uncertainties, has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, log-Gaussian or flat probability density function (pdf). Some technical details of BPULE are discussed. (authors)
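
    The numerical core of such a calculation can be sketched (not BPULE itself): with n observed events, a known expected background b, a flat prior on the signal s and no systematic uncertainties, the 90% upper limit is the value of s at which the normalized posterior CDF reaches 0.9.

```python
import numpy as np
from scipy.stats import poisson

def upper_limit(n, b, cl=0.90, s_max=100.0, n_grid=20001):
    s = np.linspace(0.0, s_max, n_grid)
    post = poisson.pmf(n, s + b)        # flat prior: posterior ~ likelihood
    cdf = np.cumsum(post) / post.sum()  # normalized posterior CDF on the grid
    return np.interp(cl, cdf, s)

print(upper_limit(n=3, b=1.2))          # ~5.5 signal events at 90% C.L.
```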

  16. Fully probabilistic design: the way for optimizing of concrete structures

    Directory of Open Access Journals (Sweden)

    I. Laníková

    Full Text Available Some standards for the design of concrete structures (e.g., EC2 and the original ČSN 73 1201-86) allow a structure to be designed by several methods. This contribution documents the fact that even if a structure does not comply with the partial reliability factor method, according to EC2, it can satisfy the conditions during the application of the fully probabilistic approach when using the same standard. From an example of the reliability of a prestressed spun concrete pole designed by the partial factor method and fully probabilistic approach according to the Eurocode it is evident that an expert should apply a more precise (though unfortunately more complicated) method in the limiting cases. The Monte Carlo method, modified by the Latin Hypercube Sampling (LHS) method, has been used for the calculation of reliability. Ultimate and serviceability limit states were checked for the partial factor method and fully probabilistic design. As a result of fully probabilistic design it is possible to obtain a more efficient design for a structure.
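
    The LHS-accelerated Monte Carlo at the heart of the calculation can be sketched in a few lines: stratified uniforms are mapped through the input distributions and a limit state is checked. The resistance and load-effect distributions below are hypothetical placeholders, not the spun pole's actual basic variables.

    ```python
    import numpy as np
    from scipy.stats import qmc, norm

    sampler = qmc.LatinHypercube(d=2, seed=1)
    u = sampler.random(n=100_000)                 # stratified uniforms in [0, 1)^2
    R = norm.ppf(u[:, 0], loc=30.0, scale=3.0)    # resistance (illustrative)
    S = norm.ppf(u[:, 1], loc=20.0, scale=4.0)    # load effect (illustrative)
    pf = np.mean(R - S < 0.0)                     # failure probability, g = R - S
    print(f"pf = {pf:.2e}, reliability index beta = {-norm.ppf(pf):.2f}")
    ```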

  17. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    Science.gov (United States)

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
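
    The derivation can be pictured as Monte Carlo propagation of lognormal adjustment factors applied to a point of departure. The sketch below only illustrates that general idea; the factor names, geometric means and geometric standard deviations are invented, not the framework's prescribed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    pod = 10.0   # hypothetical point of departure from an animal study (mg/kg-d)

    def logn(gm, gsd, size):   # lognormal sampler parameterized by GM and GSD
        return rng.lognormal(np.log(gm), np.log(gsd), size)

    hd = pod / (logn(4.0, 2.0, n) *    # animal-to-human adjustment (made up)
                logn(3.0, 2.0, n))     # human variability at incidence I (made up)
    print("median target human dose:", np.median(hd))
    print("90% one-sided lower bound:", np.percentile(hd, 10))
    ```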

  18. Tetrahedral meshing via maximal Poisson-disk sampling

    KAUST Repository

    Guo, Jianwei

    2016-02-15

    In this paper, we propose a simple yet effective method to generate 3D-conforming tetrahedral meshes from closed 2-manifold surfaces. Our approach is inspired by recent work on maximal Poisson-disk sampling (MPS), which can generate well-distributed point sets in arbitrary domains. We first perform MPS on the boundary of the input domain, we then sample the interior of the domain, and we finally extract the tetrahedral mesh from the samples by using 3D Delaunay or regular triangulation for uniform or adaptive sampling, respectively. We also propose an efficient optimization strategy to protect the domain boundaries and to remove slivers to improve the meshing quality. We present various experimental results to illustrate the efficiency and the robustness of our proposed approach. We demonstrate that the performance and quality (e.g., minimal dihedral angle) of our approach are superior to current state-of-the-art optimization-based approaches.
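
    The defining property of Poisson-disk sampling is that every accepted sample keeps a minimum separation r from all others. A naive dart-throwing sketch in the unit square conveys the idea; the paper's MPS machinery achieves maximal coverage in arbitrary 3D domains far more efficiently (e.g., with spatial acceleration grids).

    ```python
    import numpy as np
    from math import hypot

    def dart_throwing(r, n_trials=5000, seed=0):
        rng = np.random.default_rng(seed)
        pts = []
        for _ in range(n_trials):
            x, y = rng.random(2)
            if all(hypot(x - px, y - py) >= r for px, py in pts):
                pts.append((x, y))       # accept: no conflict with prior samples
        return pts

    print(len(dart_throwing(r=0.05)), "well-separated samples in the unit square")
    ```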

  19. A Heuristic Probabilistic Approach to Estimating Size-Dependent Mobility of Nonuniform Sediment

    Science.gov (United States)

    Woldegiorgis, B. T.; Wu, F. C.; van Griensven, A.; Bauwens, W.

    2017-12-01

    Simulating the mechanism of bed sediment mobility is essential for modelling sediment dynamics. Despite the fact that many studies are carried out on this subject, they use complex mathematical formulations that are computationally expensive and often not easy to implement. In order to present a simple and computationally efficient complement to detailed sediment mobility models, we developed a heuristic probabilistic approach to estimating the size-dependent mobilities of nonuniform sediment based on the pre- and post-entrainment particle size distributions (PSDs), assuming that the PSDs are lognormally distributed. The approach fits a lognormal probability density function (PDF) to the pre-entrainment PSD of bed sediment and uses the threshold particle size of incipient motion and the concept of sediment mixture to estimate the PSDs of the entrained sediment and post-entrainment bed sediment. The new approach is simple in a physical sense and significantly reduces the complexity, computation time and resources required by detailed sediment mobility models. It is calibrated and validated with laboratory and field data by comparing with the size-dependent mobilities predicted by the existing empirical lognormal cumulative distribution function (CDF) approach. The novel features of the current approach are: (1) separating the entrained and non-entrained sediments by a threshold particle size, a critical particle size of incipient motion modified to account for mixed-size effects, and (2) using the mixture-based pre- and post-entrainment PSDs to provide a continuous estimate of the size-dependent sediment mobility.
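
    The mixture idea is easy to sketch: fit a lognormal PDF to the pre-entrainment PSD, split it at the threshold particle size, and renormalize the two pieces into entrained and residual-bed PSDs. Parameter values below are illustrative, not calibrated.

    ```python
    import numpy as np
    from scipy.stats import lognorm

    gm, gsd = 2.0, 1.8       # geometric mean (mm) and geometric std of bed PSD
    d_c = 1.5                # threshold particle size of incipient motion (mm)
    psd = lognorm(s=np.log(gsd), scale=gm)

    frac_entrained = psd.cdf(d_c)      # mass fraction finer than the threshold
    d = np.linspace(0.1, 10.0, 200)
    pdf = psd.pdf(d)
    entrained_pdf = np.where(d <= d_c, pdf, 0.0) / psd.cdf(d_c)  # entrained PSD
    bed_pdf = np.where(d > d_c, pdf, 0.0) / psd.sf(d_c)          # residual bed PSD
    print(f"entrained mass fraction: {frac_entrained:.2f}")
    ```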

  20. A probabilistic approach for estimating the spatial extent of pesticide agricultural use sites and potential co-occurrence with listed species for use in ecological risk assessments.

    Science.gov (United States)

    Budreski, Katherine; Winchell, Michael; Padilla, Lauren; Bang, JiSu; Brain, Richard A

    2016-04-01

    A crop footprint refers to the estimated spatial extent of growing areas for a specific crop, and is commonly used to represent the potential "use site" footprint for a pesticide labeled for use on that crop. A methodology for developing probabilistic crop footprints to estimate the likelihood of pesticide use and the potential co-occurrence of pesticide use and listed species locations was tested at the national scale and compared to alternative methods. The probabilistic aspect of the approach accounts for annual crop rotations and the uncertainty in remotely sensed crop and land cover data sets. The crop footprints used historically are derived exclusively from the National Land Cover Database (NLCD) Cultivated Crops and/or Pasture/Hay classes. This approach broadly aggregates agriculture into 2 classes, which grossly overestimates the spatial extent of individual crops that are labeled for pesticide use. The approach also does not use all the available crop data, represents a single point in time, and does not account for the uncertainty in land cover data set classifications. The probabilistic crop footprint approach described herein incorporates best available information at the time of analysis from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) for 5 y (2008-2012 at the time of analysis), the 2006 NLCD, the 2007 NASS Census of Agriculture, and 5 y of NASS Quick Stats (2008-2012). The approach accounts for misclassification of crop classes in the CDL by incorporating accuracy assessment information by state, year, and crop. The NLCD provides additional information to improve the CDL crop probability through an adjustment based on the NLCD accuracy assessment data using the principles of Bayes' Theorem. Finally, crop probabilities are scaled at the state level by comparing against NASS surveys (Census of Agriculture and Quick Stats) of reported planted acres by crop. In an example application of the new method, the probabilistic...
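
    The Bayes' Theorem adjustment mentioned above can be illustrated with a single pixel: combine the prior crop prevalence with the classifier's accuracy to get the posterior probability that the labeled crop is actually present. All numbers below are hypothetical, not values from the NASS or NLCD accuracy assessments.

    ```python
    p_corn = 0.10                  # prior probability a pixel is corn (illustrative)
    p_label_given_corn = 0.85      # P(CDL labels corn | pixel is corn)
    p_label_given_other = 0.03     # P(CDL labels corn | pixel is not corn)

    p_label = (p_label_given_corn * p_corn
               + p_label_given_other * (1.0 - p_corn))
    p_corn_given_label = p_label_given_corn * p_corn / p_label
    print(f"P(corn | labeled corn) = {p_corn_given_label:.2f}")   # ~0.76 here
    ```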

  1. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  2. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    Full Text Available As an effective aggregation tool, power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we deeply investigate the new power aggregation (PA) operators for fusing the probabilistic linguistic term sets (PLTSs). In this paper, we firstly develop the probabilistic linguistic power average (PLPA), the weighted probabilistic linguistic power average (WPLPA) operators, the probabilistic linguistic power geometric (PLPG) and the weighted probabilistic linguistic power geometric (WPLPG) operators. At the same time, we carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we further design the approaches for the application of multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrated example to expound our proposed methods and verify their performances.
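
    The mechanism underlying all of these operators is the classical power average, in which arguments supported by (i.e., close to) the others receive more weight. The crisp-number sketch below shows only that mechanism, with Sup(ai, aj) = 1 - |ai - aj| as an assumed support function; the paper's operators lift it to probabilistic linguistic term sets, which the sketch does not attempt.

    ```python
    import numpy as np

    def power_average(a):
        a = np.asarray(a, dtype=float)                # crisp values in [0, 1]
        sup = 1.0 - np.abs(a[:, None] - a[None, :])   # pairwise support degrees
        np.fill_diagonal(sup, 0.0)
        t = sup.sum(axis=1)                           # T(ai): total support for ai
        w = (1.0 + t) / (1.0 + t).sum()               # PA weights
        return float(w @ a)

    print(power_average([0.60, 0.65, 0.62, 0.10]))    # the outlier is down-weighted
    ```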

  3. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach.

    Directory of Open Access Journals (Sweden)

    Goker Erdogan

    2015-11-01

    Full Text Available People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models-that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model's percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects' ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.

  4. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews the decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility-analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems if optimization of safety is to be successful

  5. Evaluating the double Poisson generalized linear model.

    Science.gov (United States)

    Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique

    2013-10-01

    The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant), which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although the COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similarly to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that its coefficients are simpler to interpret, it offers a flexible and efficient alternative for researchers to model count data. Copyright © 2013 Elsevier Ltd. All rights reserved.
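
    The normalizing constant in question can always be obtained by brute-force summation, as the sketch below does for Efron's (1986) double Poisson density, the form this literature builds on; the paper's contribution is a more accurate and cheaper approximation of this sum, which the sketch does not reproduce.

    ```python
    import numpy as np
    from scipy.special import gammaln

    def dp_unnorm_logpmf(y, mu, theta):
        """Unnormalized double Poisson log-pmf:
        f(y) ~ sqrt(theta) e^{-theta*mu} (e^{-y} y^y / y!) (e*mu/y)^{theta*y};
        theta > 1 gives under-dispersion, theta < 1 over-dispersion."""
        y = np.asarray(y, dtype=float)
        safe_y = np.maximum(y, 1.0)          # guard log(0); the y = 0 terms vanish
        return (0.5 * np.log(theta) - theta * mu - y
                + y * np.log(safe_y) - gammaln(y + 1)
                + theta * y * (1.0 + np.log(mu) - np.log(safe_y)))

    def dp_normalizer(mu, theta, ymax=1000):
        y = np.arange(ymax + 1)              # truncated summation over the support
        return 1.0 / np.exp(dp_unnorm_logpmf(y, mu, theta)).sum()

    print(dp_normalizer(mu=5.0, theta=0.7))  # close to, but not exactly, 1
    ```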

  6. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    Science.gov (United States)

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  7. Relaxed Poisson cure rate models.

    Science.gov (United States)

    Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N

    2016-03-01

    The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models allowing for superdispersion. So, the relaxed cure rate model developed here can be considered as a natural and less restrictive extension of the popular Poisson cure rate model at the cost of an additional parameter, but a competitor to negative-binomial cure rate models (Rodrigues et al.). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Poisson denoising on the sphere

    Science.gov (United States)

    Schmitt, J.; Starck, J. L.; Fadili, J.; Grenier, I.; Casandjian, J. M.

    2009-08-01

    In the scope of the Fermi mission, Poisson noise removal should improve data quality and make source detection easier. This paper presents a method for Poisson data denoising on the sphere, called Multi-Scale Variance Stabilizing Transform on Sphere (MS-VSTS). This method is based on a Variance Stabilizing Transform (VST), a transform which aims to stabilize a Poisson data set such that each stabilized sample has an (asymptotically) constant variance. In addition, for the VST used in the method, the transformed data are asymptotically Gaussian. Thus, MS-VSTS consists of decomposing the data into a sparse multi-scale dictionary (wavelets, curvelets, ridgelets...) and then applying a VST to the coefficients in order to get quasi-Gaussian stabilized coefficients. In the present article, the multi-scale transform used is the Isotropic Undecimated Wavelet Transform. Hypothesis tests are then made to detect significant coefficients, and the denoised image is reconstructed with an iterative method based on Hybrid Steepest Descent (HSD). The method is tested on simulated Fermi data.
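
    The variance stabilization idea is easiest to see with the classical Anscombe transform, the simplest Poisson VST: t(x) = 2*sqrt(x + 3/8) has approximately unit variance whatever the Poisson mean. MS-VSTS applies transforms of this family to multiscale coefficients on the sphere, which this one-liner does not attempt.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    for lam in (2, 10, 50):                            # Poisson intensities to test
        x = rng.poisson(lam, 200_000)
        print(lam, np.var(2.0 * np.sqrt(x + 0.375)))   # variance -> 1 as lam grows
    ```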

  9. A comparison of multiple indicator kriging and area-to-point Poisson kriging for mapping patterns of herbivore species abundance in Kruger National Park, South Africa.

    Science.gov (United States)

    Kerry, Ruth; Goovaerts, Pierre; Smit, Izak P J; Ingram, Ben R

    Kruger National Park (KNP), South Africa, provides protected habitats for the unique animals of the African savannah. For the past 40 years, annual aerial surveys of herbivores have been conducted to aid management decisions based on (1) the spatial distribution of species throughout the park and (2) total species populations in a year. The surveys are extremely time consuming and costly. For many years, the whole park was surveyed, but in 1998 a transect survey approach was adopted. This is cheaper and less time consuming but leaves gaps in the data spatially. Also the distance method currently employed by the park only gives estimates of total species populations but not their spatial distribution. We compare the ability of multiple indicator kriging and area-to-point Poisson kriging to accurately map species distribution in the park. A leave-one-out cross-validation approach indicates that multiple indicator kriging makes poor estimates of the number of animals, particularly the few large counts, as the indicator variograms for such high thresholds are pure nugget. Poisson kriging was applied to the prediction of two types of abundance data: spatial density and proportion of a given species. Both Poisson approaches had standardized mean absolute errors (St. MAEs) of animal counts at least an order of magnitude lower than multiple indicator kriging. The spatial density, Poisson approach (1), gave the lowest St. MAEs for the most abundant species and the proportion, Poisson approach (2), did for the least abundant species. Incorporating environmental data into Poisson approach (2) further reduced St. MAEs.

  10. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke

    2016-04-11

    Nowadays, an increasing number of parts and sub-assemblies are publicly available and can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely used parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  11. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke; Hu, Kai-Mo; Yin, Li-Cheng; Yan, Dongming; Wang, Bin

    2016-01-01

    Nowadays, an increasing number of parts and sub-assemblies are publicly available and can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely used parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.
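
    The suggestion step described in the two records above can be pictured with a toy corpus: estimate how often a candidate part co-occurs with the parts already placed, and propose the best-scoring one. The paper learns a full probabilistic graphical model; the naive co-occurrence scoring and the miniature corpus below are only illustrative stand-ins.

    ```python
    from collections import Counter
    from itertools import combinations

    assemblies = [                       # hypothetical training assemblies
        {"bolt", "nut", "washer"},
        {"bolt", "nut", "bracket"},
        {"shaft", "bearing", "housing"},
        {"shaft", "bearing", "seal"},
    ]
    pair_counts = Counter()
    for a in assemblies:
        pair_counts.update(frozenset(p) for p in combinations(sorted(a), 2))

    def suggest(current, vocabulary):
        score = lambda c: sum(pair_counts[frozenset((c, p))] for p in current)
        return max((c for c in vocabulary if c not in current), key=score)

    parts = {p for a in assemblies for p in a}
    print(suggest({"shaft", "bearing"}, parts))   # -> 'housing' or 'seal'
    ```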

  12. On the evaluation of the efficacy of a smart damper: a new equivalent energy-based probabilistic approach

    International Nuclear Information System (INIS)

    Aly, A M; Christenson, R E

    2008-01-01

    Smart damping technology has been proposed to protect civil structures from dynamic loads. Each application of smart damping control provides varying levels of performance relative to active and passive control strategies. Currently, researchers compare the relative efficacy of smart damping control to active and passive strategies by running numerous simulations. These simulations can require significant computation time and resources. Because of this, it is desirable to develop an approach to assess the applicability of smart damping technology which requires less computation time. This paper discusses and verifies a probabilistic approach to determine the efficacy of smart damping technology based on clipped optimal state feedback control theory

  13. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    Science.gov (United States)

    David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera

    2017-04-01

    This study provides guidance to hydrological researchers that enables them to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
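
    The Box-Cox schemes at the core of the Pareto optimal set can be sketched directly: transform flows, treat residuals as Gaussian in transformed space, and back-transform samples to get skewed, heteroscedastic prediction bands in flow space. The flow arrays below are placeholders, and lambda = 0.2 is one of the Pareto optimal values reported.

    ```python
    import numpy as np

    def boxcox(q, lam):                  # lam = 0 would be the log scheme
        return np.log(q) if lam == 0.0 else (q ** lam - 1.0) / lam

    lam = 0.2
    q_sim = np.array([0.5, 2.0, 10.0, 80.0])   # simulated daily flows (placeholder)
    q_obs = np.array([0.7, 1.6, 12.0, 60.0])   # observed daily flows (placeholder)
    eta = boxcox(q_obs, lam) - boxcox(q_sim, lam)   # residuals in transformed space
    sigma = eta.std(ddof=1)

    rng = np.random.default_rng(1)
    z = boxcox(q_sim, lam) + rng.normal(0.0, sigma, (1000, q_sim.size))
    pred = np.maximum(z * lam + 1.0, 0.0) ** (1.0 / lam)    # inverse Box-Cox
    print(np.percentile(pred, [5, 50, 95], axis=0))         # predictive bands
    ```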

  14. Application of Negative Binomial Regression to Overcome Overdispersion in Poisson Regression

    Directory of Open Access Journals (Sweden)

    PUTU SUSAN PRADAWATI

    2013-09-01

    Full Text Available Poisson regression is used to analyze count data that are Poisson distributed. Poisson regression analysis requires equidispersion, in which the mean value of the response variable is equal to the value of the variance. However, deviations occur in which the variance of the response variable is greater than the mean; this is called overdispersion. If overdispersion is present and Poisson regression is used anyway, the standard errors obtained will be underestimated. Negative binomial regression can handle overdispersion because it contains a dispersion parameter. From simulated data exhibiting overdispersion in the Poisson regression model, it was found that negative binomial regression performed better than the Poisson regression model.
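
    A quick simulation shows the underestimated standard errors the abstract warns about. The sketch assumes the Python statsmodels package; the covariate and parameter values are invented.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n, alpha = 2000, 1.0                        # alpha: overdispersion parameter
    x = rng.normal(size=n)
    mu = np.exp(0.5 + 0.8 * x)
    y = rng.poisson(mu * rng.gamma(1.0 / alpha, alpha, n))   # NB-distributed counts
    X = sm.add_constant(x)

    pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()
    print("Poisson SEs:  ", pois.bse)    # too small under overdispersion
    print("Neg. bin. SEs:", nb.bse)      # larger, hence more honest
    ```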

  15. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for the analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy level unfolding procedure we establish that the nearest spacing distribution of PageRank probabilities is described by the Poisson law typical of integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to the absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.

  16. The quantum poisson-Lie T-duality and mirror symmetry

    International Nuclear Information System (INIS)

    Parkhomenko, S.E.

    1999-01-01

    Poisson-Lie T-duality in quantum N=2 superconformal Wess-Zumino-Novikov-Witten models is considered. The Poisson-Lie T-duality transformation rules of the super-Kac-Moody algebra currents are found from the conjecture that, as in the classical case, the quantum Poisson-Lie T-duality transformation is given by an automorphism which interchanges the isotropic subalgebras of the underlying Manin triple in one of the chirality sectors of the model. It is shown that quantum Poisson-Lie T-duality acts on the N=2 super-Virasoro algebra generators of the quantum models as a mirror symmetry acts: in one of the chirality sectors it is a trivial transformation while in another chirality sector it changes the sign of the U(1) current and interchanges the spin-3/2 currents. A generalization of Poisson-Lie T-duality for the quantum Kazama-Suzuki models is proposed. It is shown that quantum Poisson-Lie T-duality acts in these models as a mirror symmetry also

  17. Statistics of weighted Poisson events and its applications

    International Nuclear Information System (INIS)

    Bohm, G.; Zech, G.

    2014-01-01

    The statistics of the sum of random weights where the number of weights is Poisson distributed has important applications in nuclear physics, particle physics and astrophysics. Events are frequently weighted according to their acceptance or relevance to a certain type of reaction. The sum is described by the compound Poisson distribution (CPD), which is briefly reviewed. It is shown that the CPD can be approximated by a scaled Poisson distribution (SPD). The SPD is applied to parameter estimation in situations where the data are distorted by resolution effects. It performs considerably better than the normal approximation that is usually used. A special Poisson bootstrap technique is presented which permits the derivation of confidence limits for observations following the CPD
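
    The scaled Poisson approximation matches the first two moments of the compound Poisson sum: with weights of mean E[w] and second moment E[w^2], the scale factor is E[w^2]/E[w] and the equivalent number of events is lambda E[w]^2 / E[w^2]. A Monte Carlo check with uniform weights (an illustrative choice) follows.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    lam, n_mc = 20.0, 50_000
    w1, w2 = 0.5, 1.0 / 3.0                    # E[w], E[w^2] for w ~ U(0, 1)

    n = rng.poisson(lam, n_mc)
    cpd = np.array([rng.uniform(0.0, 1.0, k).sum() for k in n])   # compound Poisson

    scale, nu = w2 / w1, lam * w1**2 / w2      # SPD scale and equivalent count
    spd = scale * rng.poisson(nu, n_mc)        # scaled Poisson approximation
    print("CPD mean/var:", cpd.mean(), cpd.var())   # ~10.0 / ~6.67
    print("SPD mean/var:", spd.mean(), spd.var())   # matched by construction
    ```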

  18. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    Science.gov (United States)

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk

  19. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules.Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.

  20. Zero-inflated Poisson regression models for QTL mapping applied to tick-resistance in a Gyr × Holstein F2 population

    Science.gov (United States)

    Silva, Fabyano Fonseca; Tunin, Karen P.; Rosa, Guilherme J.M.; da Silva, Marcos V.B.; Azevedo, Ana Luisa Souza; da Silva Verneque, Rui; Machado, Marco Antonio; Packer, Irineu Umberto

    2011-01-01

    Nowadays, an important and interesting alternative in the control of tick-infestation in cattle is to select resistant animals, and identify the respective quantitative trait loci (QTLs) and DNA markers, for posterior use in breeding programs. The number of ticks/animal is characterized as a discrete-counting trait, which could potentially follow a Poisson distribution. However, in the case of an excess of zeros, due to the occurrence of several noninfected animals, the zero-inflated Poisson (ZIP) and generalized zero-inflated Poisson (GZIP) distributions may provide a better description of the data. Thus, the objective here was to compare through simulation, Poisson and ZIP models (simple and generalized) with classical approaches, for QTL mapping with counting phenotypes under different scenarios, and to apply these approaches to a QTL study of tick resistance in an F2 cattle (Gyr × Holstein) population. It was concluded that, when working with zero-inflated data, it is recommendable to use the generalized and simple ZIP model for analysis. On the other hand, when working with data with zeros, but not zero-inflated, the Poisson model or a data-transformation approach, such as the square-root or Box-Cox transformation, is applicable. PMID:22215960

  1. Zero-inflated Poisson regression models for QTL mapping applied to tick-resistance in a Gyr x Holstein F2 population

    Directory of Open Access Journals (Sweden)

    Fabyano Fonseca Silva

    2011-01-01

    Full Text Available Nowadays, an important and interesting alternative in the control of tick-infestation in cattle is to select resistant animals, and identify the respective quantitative trait loci (QTLs) and DNA markers, for posterior use in breeding programs. The number of ticks/animal is characterized as a discrete-counting trait, which could potentially follow a Poisson distribution. However, in the case of an excess of zeros, due to the occurrence of several noninfected animals, the zero-inflated Poisson (ZIP) and generalized zero-inflated Poisson (GZIP) distributions may provide a better description of the data. Thus, the objective here was to compare through simulation, Poisson and ZIP models (simple and generalized) with classical approaches, for QTL mapping with counting phenotypes under different scenarios, and to apply these approaches to a QTL study of tick resistance in an F2 cattle (Gyr x Holstein) population. It was concluded that, when working with zero-inflated data, it is recommendable to use the generalized and simple ZIP model for analysis. On the other hand, when working with data with zeros, but not zero-inflated, the Poisson model or a data-transformation approach, such as the square-root or Box-Cox transformation, is applicable.
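
    A small simulation shows what the ZIP model recovers: a logistic part for the structural zeros (uninfected animals) and a Poisson part for counts among the infected. The sketch assumes statsmodels' ZeroInflatedPoisson class (available in recent versions); the genotype covariate and all parameter values are invented, not taken from the QTL study.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(7)
    n = 1000
    x = rng.normal(size=n)                    # stand-in for a QTL genotype score
    uninfected = rng.random(n) < 0.3          # 30% structural zeros
    counts = np.where(uninfected, 0, rng.poisson(np.exp(1.0 + 0.5 * x)))

    X = sm.add_constant(x)
    fit = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1))).fit(disp=0)
    print(fit.params)    # inflation intercept plus Poisson intercept and slope
    ```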

  2. A probabilistic approach for validating protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Wang Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV), which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.

  3. Noncommutative gauge theory for Poisson manifolds

    Energy Technology Data Exchange (ETDEWEB)

    Jurco, Branislav E-mail: jurco@mpim-bonn.mpg.de; Schupp, Peter E-mail: schupp@theorie.physik.uni-muenchen.de; Wess, Julius E-mail: wess@theorie.physik.uni-muenchen.de

    2000-09-25

    A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem.

  4. Noncommutative gauge theory for Poisson manifolds

    International Nuclear Information System (INIS)

    Jurco, Branislav; Schupp, Peter; Wess, Julius

    2000-01-01

    A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem

  5. Casimir meets Poisson: improved quark/gluon discrimination with counting observables

    Science.gov (United States)

    Frye, Christopher; Larkoski, Andrew J.; Thaler, Jesse; Zhou, Kevin

    2017-09-01

    Charged track multiplicity is among the most powerful observables for discriminating quark- from gluon-initiated jets. Despite its utility, it is not infrared and collinear (IRC) safe, so perturbative calculations are limited to studying the energy evolution of multiplicity moments. While IRC-safe observables, like jet mass, are perturbatively calculable, their distributions often exhibit Casimir scaling, such that their quark/gluon discrimination power is limited by the ratio of quark to gluon color factors. In this paper, we introduce new IRC-safe counting observables whose discrimination performance exceeds that of jet mass and approaches that of track multiplicity. The key observation is that track multiplicity is approximately Poisson distributed, with more suppressed tails than the Sudakov peak structure from jet mass. By using an iterated version of the soft drop jet grooming algorithm, we can define a "soft drop multiplicity" which is Poisson distributed at leading-logarithmic accuracy. In addition, we calculate the next-to-leading-logarithmic corrections to this Poisson structure. If we allow the soft drop groomer to proceed to the end of the jet branching history, we can define a collinear-unsafe (but still infrared-safe) counting observable. Exploiting the universality of the collinear limit, we define generalized fragmentation functions to study the perturbative energy evolution of collinear-unsafe multiplicity.

  6. Image deblurring with Poisson data: from cells to galaxies

    International Nuclear Information System (INIS)

    Bertero, M; Boccacci, P; Desiderà, G; Vicidomini, G

    2009-01-01

    Image deblurring is an important topic in imaging science. In this review, we consider together fluorescence microscopy and optical/infrared astronomy because of two common features: in both cases the imaging system can be described, with a sufficiently good approximation, by a convolution operator, whose kernel is the so-called point-spread function (PSF); moreover, the data are affected by photon noise, described by a Poisson process. This statistical property of the noise, which is common also to emission tomography, is the basis of the maximum likelihood and Bayesian approaches introduced in the mid-eighties. From then on, a huge amount of literature has been produced on these topics. This paper is both a tutorial and a review of a relevant part of this literature, including some of our previous contributions. We discuss the mathematical modeling of the process of image formation and detection, and we introduce the so-called Bayesian paradigm that provides the basis of the statistical treatment of the problem. Next, we describe and discuss the most frequently used algorithms as well as other approaches based on a different description of the Poisson noise. We conclude with a review of other topics related to image deblurring such as boundary effect correction, space-variant PSFs, super-resolution, blind deconvolution and multiple-image deconvolution. (topical review)
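
    The best known of the maximum likelihood algorithms reviewed here is the Richardson-Lucy iteration. A self-contained numpy/scipy sketch with a toy point-source test follows; real deconvolution codes add regularization, stopping rules and the boundary corrections discussed in the review.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def richardson_lucy(image, psf, n_iter=30):
        """Maximum likelihood deconvolution for Poisson noise."""
        estimate = np.full_like(image, image.mean(), dtype=float)
        psf_flip = psf[::-1, ::-1]                       # adjoint of the blur
        for _ in range(n_iter):
            blurred = convolve(estimate, psf, mode="reflect")
            ratio = image / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
            estimate *= convolve(ratio, psf_flip, mode="reflect")
        return estimate

    rng = np.random.default_rng(0)
    truth = np.zeros((32, 32)); truth[16, 16] = 500.0    # point source
    psf = np.ones((5, 5)) / 25.0                         # boxcar blur
    data = rng.poisson(convolve(truth, psf, mode="reflect")).astype(float)
    print(richardson_lucy(data, psf).max(), "vs blurred peak", data.max())
    ```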

  7. Numerical probabilistic analysis for slope stability in fractured rock masses using DFN-DEM approach

    Directory of Open Access Journals (Sweden)

    Alireza Baghbanan

    2017-06-01

    Full Text Available Due to the existence of uncertainties in the input geometrical properties of fractures, there is no unique solution for assessing the stability of slopes in jointed rock masses. Therefore, the necessity of applying probabilistic analysis in these cases is inevitable. In this study a probabilistic analysis procedure, together with the relevant algorithms, is developed using the Discrete Fracture Network-Distinct Element Method (DFN-DEM) approach. In the right abutment of Karun 4 dam and downstream of the dam body, five joint sets and one major joint have been identified. According to the geometrical properties of fractures in the Karun river valley, instability situations are probable in this abutment. In order to evaluate the stability of the rock slope, different combinations of joint set geometrical parameters are selected, and a series of numerical DEM simulations are performed on generated and validated DFN models in the DFN-DEM approach to measure minimum required support patterns in dry and saturated conditions. Results indicate that the distribution of required bolt length is well fitted by a lognormal distribution in both conditions. In dry conditions, the calculated mean value is 1125.3 m, and more than 80 percent of models need only 1614.99 m of bolts, which corresponds to a bolt pattern with 2 m spacing and 12 m length. For slopes in the saturated condition, the calculated mean value is 1821.8 m, and more than 80 percent of models need only 2653.49 m of bolts, which is equivalent to a bolt pattern with 15 m length and 1.5 m spacing. Comparison of the numerical results with empirical methods shows that investigating slope stability with different DFN realizations, constructed with different block patterns, is more effective than the empirical methods.

  8. Almost Poisson integration of rigid body systems

    International Nuclear Information System (INIS)

    Austin, M.A.; Krishnaprasad, P.S.; Li-Sheng Wang

    1993-01-01

    In this paper we discuss the numerical integration of Lie-Poisson systems using the mid-point rule. Since such systems result from the reduction of Hamiltonian systems with symmetry by Lie group actions, we also present examples of reconstruction rules for the full dynamics. A primary motivation is to preserve in the integration process various conserved quantities of the original dynamics. A main result of this paper is an O(h^3) error estimate for the Lie-Poisson structure, where h is the integration step-size. We note that Lie-Poisson systems appear naturally in many areas of physical science and engineering, including theoretical mechanics of fluids and plasmas, satellite dynamics, and polarization dynamics. In the present paper we consider a series of progressively complicated examples related to rigid body systems. We also consider a dissipative example associated with a Lie-Poisson system. The behavior of the mid-point rule and an associated reconstruction rule is numerically explored. 24 refs., 9 figs
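
    The simplest of the rigid body examples is the free rigid body, dm/dt = m x Omega with Omega = I^{-1} m. The sketch below applies the mid-point rule to it, solving the implicit step by fixed-point iteration; the inertia values and initial momentum are illustrative. As expected for the mid-point rule, the quadratic Casimir |m|^2 is conserved to round-off.

    ```python
    import numpy as np

    I_diag = np.array([1.0, 2.0, 3.0])             # principal moments of inertia

    def f(m):                                      # Euler equations: dm/dt = m x Omega
        return np.cross(m, m / I_diag)

    def midpoint_step(m, h, n_fp=20):
        m_new = m + h * f(m)                       # explicit predictor
        for _ in range(n_fp):                      # fixed-point solve of implicit step
            m_new = m + h * f(0.5 * (m + m_new))
        return m_new

    m, h = np.array([1.0, 0.2, -0.5]), 0.01
    c0 = m @ m                                     # Casimir of the Lie-Poisson bracket
    for _ in range(5000):
        m = midpoint_step(m, h)
    print("Casimir drift:", m @ m - c0)            # stays at round-off level
    ```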

  9. Self-energy-modified Poisson-Nernst-Planck equations: WKB approximation and finite-difference approaches.

    Science.gov (United States)

    Xu, Zhenli; Ma, Manman; Liu, Pei

    2014-07-01

    We propose a modified Poisson-Nernst-Planck (PNP) model to investigate charge transport in electrolytes in an inhomogeneous dielectric environment. The model includes the ionic polarization due to the dielectric inhomogeneity and the ion-ion correlation. This is achieved by the self energy of test ions through solving a generalized Debye-Hückel (DH) equation. We develop numerical methods for the system composed of the PNP and DH equations. In particular, to address the numerical challenge of solving the high-dimensional DH equation, we develop an analytical WKB approximation and a numerical approach based on the selective inversion of sparse matrices. The model and numerical methods are validated by simulating the charge diffusion in electrolytes between two electrodes, for which the effects of dielectrics and correlation are investigated by comparing the results with the prediction of the classical PNP theory. We find that, at a length scale of the interface separation comparable to the Bjerrum length, the results of the modified equations are significantly different from the classical PNP predictions, mostly due to the dielectric effect. It is also shown that when the ion self energy is of weak or moderate strength, the WKB approximation presents high accuracy, compared to precise finite-difference results.

  10. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fit to the fatigue life predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage could be best described using the Weibull distribution when compared to the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
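
    The distribution fitting itself is routine; the sketch below fits 2-parameter and 3-parameter Weibull and lognormal distributions to a synthetic fatigue-life sample with scipy and compares log-likelihoods, mirroring the paper's model comparison. The sample is generated, not the paper's data.

    ```python
    import numpy as np
    from scipy.stats import weibull_min, lognorm

    rng = np.random.default_rng(5)
    lives = weibull_min.rvs(c=2.2, loc=5e3, scale=4e4, size=60, random_state=rng)

    fits = {
        "Weibull 2-par": (weibull_min, weibull_min.fit(lives, floc=0)),
        "Weibull 3-par": (weibull_min, weibull_min.fit(lives)),
        "lognormal": (lognorm, lognorm.fit(lives, floc=0)),
    }
    for name, (dist, params) in fits.items():
        ll = np.sum(dist.logpdf(lives, *params))
        print(f"{name}: log-likelihood {ll:.1f}")    # higher is a better fit
    ```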

  11. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for the probabilistic durability assessment of load-carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and of some kinds of equipment. The analysis models can be applied in other engineering fields.

  12. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in probabilistic data management. The objective of the book is to provide the state of the art information to researchers, practitioners, and graduate students of information technology of intelligent information processing, and at the same time serving the information technology professional faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  13. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    Full Text Available A new type of Fuzzy Inference System is proposed: a Probabilistic Fuzzy Inference System which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out both in Fuzzy sets and Fuzzy rules, which gives rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for all of the uncertainty present in the system, such as rule uncertainties and measurement uncertainties, and has led to a design which performs optimally after training. In this paper a Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.

  14. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  15. Probabilistic seismic hazard assessment of southern part of Ghana

    Science.gov (United States)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2018-05-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.

  16. Probabilistic seismic hazard assessment of southern part of Ghana

    Science.gov (United States)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2017-12-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
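
    Under the Poisson occurrence model used in both records above, a design level with exceedance probability P in T years corresponds to an annual rate lam with 1 - exp(-lam*T) = P; the 10%-in-50-years level is the familiar 475-year return period.

    ```python
    import numpy as np

    P, T = 0.10, 50.0                   # exceedance probability and exposure time
    lam = -np.log(1.0 - P) / T          # annual exceedance rate of the PGA level
    print("return period:", 1.0 / lam, "years")   # about 475
    ```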

  17. Probabilistic simulation applications to reliability assessments

    International Nuclear Information System (INIS)

    Miller, Ian; Nutt, Mark W.; Hill, Ralph S. III

    2003-01-01

    Probabilistic risk/reliability (PRA) analyses for engineered systems are conventionally based on fault-tree methods. These methods are mature and efficient, and are well suited to systems consisting of interacting components with known, low probabilities of failure. Even complex systems, such as nuclear power plants or aircraft, are modeled by the careful application of these approaches. However, for systems that may evolve in complex and nonlinear ways, and where the performance of components may be a sensitive function of the history of their working environments, fault-tree methods can be very demanding. This paper proposes an alternative method of evaluating such systems, based on probabilistic simulation using intelligent software objects to represent the components of such systems. Using a Monte Carlo approach, simulation models can be constructed from relatively simple interacting objects that capture the essential behavior of the components that they represent. Such models are capable of reflecting the complex behaviors of the systems that they represent in a natural and realistic way. (author)
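
    A minimal sketch of the object-based Monte Carlo idea the paper describes: component objects with their own failure behaviour are composed into a system and sampled repeatedly. The topology and probabilities below are hypothetical, not taken from the paper.

```python
import random

class Component:
    """Hypothetical component object: fails independently with probability p_fail."""
    def __init__(self, name, p_fail):
        self.name, self.p_fail = name, p_fail
    def sample(self):
        return random.random() < self.p_fail   # True = failed in this trial

def system_fails(pump_a, pump_b, valve):
    # Assumed topology: two redundant pumps in parallel, one valve in series.
    return (pump_a and pump_b) or valve

random.seed(1)
pa, pb, v = Component("pumpA", 0.01), Component("pumpB", 0.01), Component("valve", 0.001)
n = 1_000_000
fails = sum(system_fails(pa.sample(), pb.sample(), v.sample()) for _ in range(n))
print(f"estimated system failure probability ~ {fails / n:.2e}")
# Analytic check: 0.01*0.01 + 0.001 - 0.01*0.01*0.001 ~ 1.1e-3
```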

  18. Probabilistic modeling of crack networks in thermal fatigue

    International Nuclear Information System (INIS)

    Malesys, N.

    2007-11-01

    Thermal superficial crack networks have been detected in the mixing zones of cooling systems in nuclear power plants. Numerous experimental works have already been carried out to characterize the initiation and propagation of these cracks. The random aspect of initiation led to the proposal of a probabilistic model for the formation and propagation of crack networks in thermal fatigue. In a first part, uniaxial mechanical tests were performed on smooth and slightly notched specimens in order to characterize the initiation of multiple cracks, their arrest due to obscuration and the coalescence phenomenon by recovery of stress amplification zones. In a second part, the probabilistic model was established under two assumptions: continuous crack initiation on the surface, described by a Poisson point process law with threshold, and the shielding phenomenon, which prohibits the initiation or the propagation of a crack if it lies in the stress relaxation zone of another existing crack. Crack propagation is assumed to follow a Paris law based on the computation of stress intensity factors at the top and the bottom of the crack. The evolution of multiaxial cracks on the surface can be followed through three quantities: the shielding probability, comparable to a damage variable of the structure; the initiated crack density, representing the total number of cracks per unit surface, which can be compared to experimental observations; and the propagating crack density, representing the number of active cracks in the network per unit surface. The crack size distribution is also computed by the model, allowing an easier comparison with experimental results. (author)

  19. Compound Poisson Approximations for Sums of Random Variables

    OpenAIRE

    Serfozo, Richard F.

    1986-01-01

    We show that a sum of dependent random variables is approximately compound Poisson when the variables are rarely nonzero and, given they are nonzero, their conditional distributions are nearly identical. We give several upper bounds on the total-variation distance between the distribution of such a sum and a compound Poisson distribution. Included is an example for Markovian occurrences of a rare event. Our bounds are consistent with those that are known for Poisson approximations for sums of...
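
    The abstract is truncated, but the approximation it names is easy to illustrate numerically. The sketch below uses independent, rarely nonzero terms (simpler than the dependent setting the paper treats) and compares their sum with a compound Poisson variable having matching rate and jump law; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, trials = 10_000, 0.0005, 20_000     # many terms, each rarely nonzero

# Exact sum: each term is nonzero with probability p, with a Geometric(0.5) jump
nonzero = rng.random((trials, n)) < p
jumps = rng.geometric(0.5, size=(trials, n))
s_exact = (nonzero * jumps).sum(axis=1)

# Compound Poisson approximation: N ~ Poisson(n*p) jumps with the same law
N = rng.poisson(n * p, size=trials)
s_cp = np.array([rng.geometric(0.5, size=k).sum() if k else 0 for k in N])

for s, label in [(s_exact, "exact sum"), (s_cp, "compound Poisson")]:
    print(f"{label:16s}: mean = {s.mean():.3f}, var = {s.var():.3f}")
```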

  20. Generating clustered scale-free networks using Poisson based localization of edges

    Science.gov (United States)

    Türker, İlker

    2018-05-01

    We introduce a variety of network models using a Poisson-based edge localization strategy, which result in clustered scale-free topologies. We first verify the success of our localization strategy by realizing a variant of the well-known Watts-Strogatz model with an inverse approach, implying a small-world regime of rewiring from a random network through a regular one. We then apply the rewiring strategy to a pure Barabasi-Albert model and successfully achieve a small-world regime, with a limited capacity for the scale-free property. To imitate the high clustering property of scale-free networks with higher accuracy, we adapted the Poisson-based wiring strategy to a growing network with the ingredients of both preferential attachment and local connectivity. To achieve the collocation of these properties, we used a routine of flattening the edges array, sorting it, and applying a mixing procedure to assemble both global connections with preferential attachment and local clusters. As a result, we achieved clustered scale-free networks in a computational fashion, diverging from recent studies by following a simple but efficient approach.

  1. Square root approximation to the poisson channel

    NARCIS (Netherlands)

    Tsiatmas, A.; Willems, F.M.J.; Baggen, C.P.M.J.

    2013-01-01

    Starting from the Poisson model we present a channel model for optical communications, called the Square Root (SR) Channel, in which the noise is additive Gaussian with constant variance. Initially, we prove that for large peak or average power, the transmission rate of a Poisson Channel when coding
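
    The record is cut off, but the transform in the title can be sketched: taking the square root of a Poisson variate approximately stabilizes the variance at 1/4 for large means, which is what makes an additive-Gaussian channel model with constant variance plausible. A quick numerical check (my illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
for lam in (10, 100, 1000):
    y = np.sqrt(rng.poisson(lam, size=200_000))   # variance-stabilizing transform
    print(f"lambda = {lam:4d}: var(sqrt(X)) = {y.var():.4f}  (limit 0.25)")
```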

  2. Duality and modular class of a Nambu-Poisson structure

    International Nuclear Information System (INIS)

    Ibanez, R.; Leon, M. de; Lopez, B.; Marrero, J.C.; Padron, E.

    2001-01-01

    In this paper we introduce cohomology and homology theories for Nambu-Poisson manifolds. Also we study the relation between the existence of a duality for these theories and the vanishing of a particular Nambu-Poisson cohomology class, the modular class. The case of a regular Nambu-Poisson structure and some singular examples are discussed. (author)

  3. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  4. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  5. Scaling the Poisson Distribution

    Science.gov (United States)

    Farnsworth, David L.

    2014-01-01

    We derive the additive property of Poisson random variables directly from the probability mass function. An important application of the additive property to quality testing of computer chips is presented.
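
    The derivation the abstract refers to is a one-line convolution of the probability mass functions, closed by the binomial theorem (for independent Poisson variables with means lambda_1 and lambda_2):

```latex
P(X+Y=n) \;=\; \sum_{k=0}^{n} \frac{e^{-\lambda_1}\lambda_1^{k}}{k!}\,
               \frac{e^{-\lambda_2}\lambda_2^{\,n-k}}{(n-k)!}
         \;=\; \frac{e^{-(\lambda_1+\lambda_2)}}{n!}\sum_{k=0}^{n}\binom{n}{k}
               \lambda_1^{k}\lambda_2^{\,n-k}
         \;=\; \frac{e^{-(\lambda_1+\lambda_2)}\,(\lambda_1+\lambda_2)^{n}}{n!}
```

    So X+Y is Poisson with mean lambda_1 + lambda_2, which is why pooled defect counts from several chips can be tested against a single Poisson model.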

  6. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.

  7. Poisson-Lie T-plurality

    International Nuclear Information System (INIS)

    Unge, Rikard von

    2002-01-01

    We extend the path-integral formalism for Poisson-Lie T-duality to include the case of Drinfeld doubles which can be decomposed into bi-algebras in more than one way. We give the correct shift of the dilaton, correcting a mistake in the literature. We then use the fact that the six dimensional Drinfeld doubles have been classified to write down all possible conformal Poisson-Lie T-duals of three dimensional space times and we explicitly work out two duals to the constant dilaton and zero anti-symmetric tensor Bianchi type V space time and show that they satisfy the string equations of motion. This space-time was previously thought to have no duals because of the tracefulness of the structure constants. (author)

  8. Associative and Lie deformations of Poisson algebras

    OpenAIRE

    Remm, Elisabeth

    2011-01-01

    Considering a Poisson algebra as a non associative algebra satisfying the Markl-Remm identity, we study deformations of Poisson algebras as deformations of this non associative algebra. This gives a natural interpretation of deformations which preserves the underlying associative structure and we study deformations which preserve the underlying Lie algebra.

  9. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  10. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  11. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a
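
    A minimal sketch of the kernel-smoothing step described in the abstract: discrete epicentres are turned into a continuous rate density with a fixed-bandwidth Gaussian kernel. The thesis also smooths fault slip rates and uses more refined bandwidth choices; the data and bandwidth here are synthetic placeholders.

```python
import numpy as np

def kernel_rate(epicenters, grid_xy, bandwidth_km=20.0):
    """Fixed-bandwidth 2-D Gaussian kernel estimate of seismicity rate density."""
    d2 = ((grid_xy[:, None, :] - epicenters[None, :, :]) ** 2).sum(axis=-1)
    k = np.exp(-0.5 * d2 / bandwidth_km**2) / (2 * np.pi * bandwidth_km**2)
    return k.sum(axis=1)          # events per km^2 over the catalogue duration

rng = np.random.default_rng(3)
quakes = rng.normal(0.0, 30.0, size=(200, 2))     # synthetic epicentres (km)
xs = np.linspace(-100.0, 100.0, 21)
grid = np.array([(x, y) for x in xs for y in xs])
rate = kernel_rate(quakes, grid)
print(f"peak smoothed density: {rate.max():.4f} events/km^2")
```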

  13. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    Directory of Open Access Journals (Sweden)

    Leyla V. Kaufman

    2017-07-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  14. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.

  15. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  16. Poisson-Boltzmann versus Size-Modified Poisson-Boltzmann Electrostatics Applied to Lipid Bilayers.

    Science.gov (United States)

    Wang, Nuo; Zhou, Shenggao; Kekenes-Huskey, Peter M; Li, Bo; McCammon, J Andrew

    2014-12-26

    Mean-field methods, such as the Poisson-Boltzmann equation (PBE), are often used to calculate the electrostatic properties of molecular systems. In the past two decades, an enhancement of the PBE, the size-modified Poisson-Boltzmann equation (SMPBE), has been reported. Here, the PBE and the SMPBE are reevaluated for realistic molecular systems, namely, lipid bilayers, under eight different sets of input parameters. The SMPBE appears to reproduce the molecular dynamics simulation results better than the PBE only under specific parameter sets, but in general, it performs no better than the Stern layer correction of the PBE. These results emphasize the need for careful discussions of the accuracy of mean-field calculations on realistic systems with respect to the choice of parameters and call for reconsideration of the cost-efficiency and the significance of the current SMPBE formulation.

  17. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  18. Probabilistic hypergraph based hash codes for social image search

    Institute of Scientific and Technical Information of China (English)

    Yi XIE; Hui-min YU; Roland HU

    2014-01-01

    With the rapid development of the Internet, recent years have seen the explosive growth of social media. This brings great challenges in performing efficient and accurate image retrieval on a large scale. Recent work shows that using hashing methods to embed high-dimensional image features and tag information into Hamming space provides a powerful way to index large collections of social images. By learning hash codes through a spectral graph partitioning algorithm, spectral hashing (SH) has shown promising performance among various hashing approaches. However, it is incomplete to model the relations among images only by pairwise simple graphs which ignore the relationship in a higher order. In this paper, we utilize a probabilistic hypergraph model to learn hash codes for social image retrieval. A probabilistic hypergraph model offers a higher order representation among social images by connecting more than two images in one hyperedge. Unlike a normal hypergraph model, a probabilistic hypergraph model considers not only the grouping information, but also the similarities between vertices in hyperedges. Experiments on Flickr image datasets verify the performance of our proposed approach.

  19. Toward negative Poisson's ratio composites: Investigation of the auxetic behavior of fibrous networks

    Science.gov (United States)

    Tatlier, Mehmet Seha

    Random fibrous networks can be found among natural and synthetic materials. Some of these random fibrous networks possess a negative Poisson's ratio and are widely called auxetic materials. The governing mechanisms behind this counterintuitive property in random networks are yet to be understood, and this kind of auxetic material remains widely under-explored. Most synthetic auxetic materials suffer from low strength; this shortcoming can be rectified by developing high-strength auxetic composites. The process of embedding auxetic random fibrous networks in a polymer matrix is an attractive alternative route to the manufacture of auxetic composites; however, before such an approach can be developed, a methodology for designing fibrous networks with the desired negative Poisson's ratios must first be established. This requires an understanding of the factors which bring about negative Poisson's ratios in these materials. In this study, a numerical model is presented in order to investigate the auxetic behavior in compressed random fiber networks. Finite element analyses of three-dimensional stochastic fiber networks were performed to gain insight into the effects of parameters such as network anisotropy, network density, and degree of network compression on the out-of-plane Poisson's ratio and Young's modulus. The simulation results suggest that compression is the critical parameter that gives rise to a negative Poisson's ratio, while anisotropy significantly promotes the auxetic behavior. This model can be utilized to design fibrous auxetic materials and to evaluate the feasibility of developing auxetic composites by using auxetic fibrous networks as the reinforcing layer.

  20. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  2. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.

  3. On Poisson Nonlinear Transformations

    Directory of Open Access Journals (Sweden)

    Nasir Ganikhodjaev

    2014-01-01

    We construct the family of Poisson nonlinear transformations defined on the countable sample space of nonnegative integers and investigate their trajectory behavior. We have proved that these nonlinear transformations are regular.

  4. A Poisson Cluster Stochastic Rainfall Generator That Accounts for the Interannual Variability of Rainfall Statistics: Validation at Various Geographic Locations across the United States

    Directory of Open Access Journals (Sweden)

    Dongkyun Kim

    2014-01-01

    A novel approach for a Poisson cluster stochastic rainfall generator was validated in its ability to reproduce important rainfall and watershed response characteristics at 104 locations in the United States. The suggested novel approach, the Hybrid Model (THM), has an additional capability compared to traditional Poisson cluster rainfall modeling approaches: it accounts for the interannual variability of rainfall statistics. THM and a traditional Poisson cluster rainfall model (the modified Bartlett-Lewis rectangular pulse model) were compared in their ability to reproduce the characteristics of extreme rainfall and watershed response variables such as runoff and peak flow. The results of the comparison indicate that THM generally outperforms the traditional approach in reproducing the distributions of peak rainfall, peak flow, and runoff volume. In addition, THM significantly outperformed the traditional approach in reproducing extreme rainfall by 2.3% to 66% and extreme flow values by 32% to 71%.
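
    A bare-bones sketch of the Poisson-cluster backbone shared by THM and the modified Bartlett-Lewis model: storm origins arrive as a Poisson process and each storm spawns rectangular rain-cell pulses. All parameter values are hypothetical, and THM's extra layer (resampling parameters year by year to capture interannual variability) is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_storms(hours, lam=0.02, mu_cells=4, cell_dur=3.0, cell_int=2.0):
    """Poisson-cluster rainfall: Poisson storm origins, each spawning a random
    number of rectangular cells with exponential duration and intensity."""
    rain = np.zeros(int(hours))
    t = rng.exponential(1 / lam)                   # first storm origin
    while t < hours:
        for _ in range(rng.poisson(mu_cells) + 1): # at least one cell per storm
            start = t + rng.exponential(1.0)       # cell displacement from origin
            dur = rng.exponential(cell_dur)        # cell duration (h)
            inten = rng.exponential(cell_int)      # cell intensity (mm/h)
            a, b = int(start), min(int(start + dur) + 1, int(hours))
            rain[a:b] += inten
        t += rng.exponential(1 / lam)              # next storm origin
    return rain

rain = simulate_storms(24 * 365)
print(f"annual total ~ {rain.sum():.0f} mm, wet-hour fraction {np.mean(rain > 0):.2f}")
```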

  5. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  6. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  7. Probabilistic theory of mean field games with applications

    CERN Document Server

    Carmona, René

    2018-01-01

    This two-volume book offers a comprehensive treatment of the probabilistic approach to mean field game models and their applications. The book is self-contained in nature and includes original material and applications with explicit examples throughout, including numerical solutions. Volume I of the book is entirely devoted to the theory of mean field games without a common noise. The first half of the volume provides a self-contained introduction to mean field games, starting from concrete illustrations of games with a finite number of players, and ending with ready-for-use solvability results. Readers are provided with the tools necessary for the solution of forward-backward stochastic differential equations of the McKean-Vlasov type at the core of the probabilistic approach. The second half of this volume focuses on the main principles of analysis on the Wasserstein space. It includes Lions' approach to the Wasserstein differential calculus, and the applications of its results to the analysis of stochastic...

  8. Existence theory for a Poisson-Nernst-Planck model of electrophoresis

    OpenAIRE

    Bedin, Luciano; Thompson, Mark

    2011-01-01

    A system modeling the electrophoretic motion of a charged rigid macromolecule immersed in an incompressible ionized fluid is considered. The ionic concentration is governed by the Nernst-Planck equation coupled with the Poisson equation for the electrostatic potential, and with the Navier-Stokes and Newtonian equations for the fluid and the macromolecule dynamics, respectively. A local in time existence result for suitable weak solutions is established, following the approach of Desjardins and Esteban [Co...

  9. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  10. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  11. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Weilin Zang (Inspecta Technology AB, Stockholm (SE))

    2007-11-15

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiler Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional

  13. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
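
    The core computation behind the exemplar in the article, sketched with illustrative m- and u-probabilities (not the article's values): per-field weights are log likelihood ratios, and the summed weight converts to a posterior match probability via Bayes' theorem.

```python
import math

# m = P(field agrees | records match), u = P(field agrees | records do not match)
fields = {"surname": (0.95, 0.01), "birth_year": (0.98, 0.05), "postcode": (0.90, 0.02)}

def total_weight(agreement):
    """Sum agreement/disagreement weights over fields (base-2 log likelihood ratios)."""
    w = 0.0
    for field, agrees in agreement.items():
        m, u = fields[field]
        w += math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))
    return w

def posterior(weight, prior=1e-4):
    """Bayes' theorem in odds form: posterior odds = prior odds * 2**weight."""
    odds = (prior / (1 - prior)) * 2**weight
    return odds / (1 + odds)

w = total_weight({"surname": True, "birth_year": True, "postcode": False})
print(f"match weight = {w:.2f}, P(match) = {posterior(w):.3f}")
```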

  14. Poisson-Lie T-duality of string effective actions: A new approach to the dilaton puzzle

    Czech Academy of Sciences Publication Activity Database

    Jurčo, B.; Vysoký, Jan

    2018-01-01

    Vol. 130, August (2018), pp. 1-26. ISSN 0393-0440 Institutional support: RVO:67985840 Keywords: Poisson-Lie T-duality * string effective actions * dilaton field Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.819, year: 2016 https://www.sciencedirect.com/science/article/pii/S0393044018301748?via%3Dihub

  16. Method of Poisson's ratio imaging within a material part

    Science.gov (United States)

    Roth, Don J. (Inventor)

    1996-01-01

    The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data are then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the image.

  17. A new approach for handling longitudinal count data with zero-inflation and overdispersion: poisson geometric process model.

    Science.gov (United States)

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    For time series of count data, correlated measurements, clustering, and excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might contribute to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally proposed for modelling positive continuous data and later extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients as well as by identifying useful covariates which affect the count level. The models are implemented using Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms and are assessed using the deviance information criterion (DIC).

  18. Probabilistic and Fuzzy Arithmetic Approaches for the Treatment of Uncertainties in the Installation of Torpedo Piles

    Directory of Open Access Journals (Sweden)

    Denise Margareth Kazue Nishimura Kunitaki

    2008-01-01

    The “torpedo” pile is a foundation system that has been recently considered to anchor mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of analysis methods to evaluate its final penetration depth. Therefore, this paper deals with methodologies for the assessment of the sensitivity of the response to the variation of the uncertain parameters and mainly to incorporate into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and “possibilistic” approaches are considered, involving, respectively, the Monte Carlo method (MC) and concepts of fuzzy arithmetic (FA). The results and performance of both approaches are compared, stressing the ability of the latter approach to efficiently deal with the uncertainties of the model, with outstanding computational efficiency, and therefore, to comprise an effective design tool.

  19. The applicability of the Poisson distribution in radiochemical measurements

    International Nuclear Information System (INIS)

    Luthardt, M.; Proesch, U.

    1980-01-01

    The fact that, in principle, the Poisson distribution describes the statistics of nuclear decay is generally accepted. The applicability of this distribution to nuclear radiation measurements has recently been questioned. Applying the chi-squared test for goodness of fit on the analogy of the moving average, at least 3 cases may be distinguished which lead to an incorrect rejection of the Poisson distribution for measurements. Examples are given. Distributions which make allowance for special parameters should only be used after careful examination of the data with regard to other interfering effects. The Poisson distribution will remain applicable to many simple measuring operations. Some basic equations for the analysis of Poisson-distributed data are given. (author)

  20. Combination of the deterministic and probabilistic approaches for risk-informed decision-making in US NRC regulatory guides

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2001-06-01

    The report responds to the trend where probabilistic safety analyses are attached, on a voluntary basis (as yet), to the mandatory deterministic assessment of modifications of NPP systems or operating procedures, resulting in risk-informed type documents. It contains a nearly complete Czech translation of US NRC Regulatory Guide 1.177 and presents some suggestions for improving a) PSA study applications; b) the development of NPP documents for the regulatory body; and c) the interconnection between PSA and traditional deterministic analyses as contained in the risk-informed approach. (P.A.)

  1. Multivariate fractional Poisson processes and compound sums

    OpenAIRE

    Beghin, Luisa; Macci, Claudio

    2015-01-01

    In this paper we present multivariate space-time fractional Poisson processes by considering common random time-changes of a (finite-dimensional) vector of independent classical (non-fractional) Poisson processes. In some cases we also consider compound processes. We obtain some equations in terms of suitable fractional derivatives and fractional difference operators, which provide the extension of known equations for the univariate processes.

  2. Contravariant gravity on Poisson manifolds and Einstein gravity

    International Nuclear Information System (INIS)

    Kaneko, Yukio; Watamura, Satoshi; Muraki, Hisayoshi

    2017-01-01

    A relation between gravity on Poisson manifolds proposed in Asakawa et al (2015 Fortschr. Phys. 63 683–704) and Einstein gravity is investigated. The compatibility of the Poisson and Riemann structures defines a unique connection, the contravariant Levi-Civita connection, and leads to the idea of the contravariant gravity. The Einstein–Hilbert-type action yields an equation of motion which is written in terms of the analog of the Einstein tensor, and it includes couplings between the metric and the Poisson tensor. The study of the Weyl transformation reveals properties of those interactions. It is argued that this theory can have an equivalent description as a system of Einstein gravity coupled to matter. As an example, it is shown that the contravariant gravity on a two-dimensional Poisson manifold can be described by a real scalar field coupled to the metric in a specific manner. (paper)

  3. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study considers indicators of the predictive power of the Generalized Linear Model (GLM), which are widely used but often subject to some restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model. The dependent variable is distributed as Poisson. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables with multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and the root mean square error (RMSE).
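
    For reference, the traditional quantity being modified, corr(Y, E[Y|X]) from a fitted Poisson regression, can be computed as below. This is a sketch using statsmodels on synthetic data; the paper's modified coefficient itself is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))
mu = np.exp(0.3 + 0.5 * X[:, 0] - 0.4 * X[:, 1])   # true Poisson mean
y = rng.poisson(mu)

# Fit a Poisson GLM and recover the fitted values E[Y|X]
model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
fitted = model.fittedvalues

# Traditional regression correlation coefficient: corr(Y, E[Y|X])
r = np.corrcoef(y, fitted)[0, 1]
print(f"regression correlation coefficient r = {r:.3f}")
```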

  4. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of probabilistic methodology, in order to see how this approach may affect the risk management of industrial hazardous facilities. To this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though the public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide sound support to the decision making and authorization process for any industrial activity implying risk for the environment and the public health. It is the opinion of the authors that the probabilistic techniques have been developed to a great level of sophistication in the nuclear industry, which has provided much more experience in this field than others. For some particular areas of nuclear applications, such as plant reliability and the plant response to accidents, these techniques have reached a sufficient level of maturity, and some results have been usefully taken as a measure of the safety level of the plant itself. The use of some limited safety goals is regarded as a relevant item of the nuclear licensing process. The paper claims that it is time that these methods were applied with equal success to other hazardous facilities, and makes some comparative considerations on the differences between these plants and nuclear power plants, in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  5. A robust probabilistic approach for variational inversion in shallow water acoustic tomography

    International Nuclear Information System (INIS)

    Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P

    2009-01-01

    This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance–covariance matrix associated with the above-weighted quadratic background, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimum reduced number of non-correlated latent variables η, which determine a new control vector and a new regularization term, expressed as η T η. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method

  6. A comparison of Poisson-one-inflated power series distributions for ...

    African Journals Online (AJOL)

    A class of Poisson-one-inflated power series distributions (the binomial, the Poisson, the negative binomial, the geometric, the log-series and the misrecorded Poisson) is proposed for modeling rural out-migration at the household level. The probability mass functions of the mixture distributions are derived and fitted to the ...

  7. Monitoring Poisson observations using combined applications of Shewhart and EWMA charts

    Science.gov (United States)

    Abujiya, Mu'azu Ramat

    2017-11-01

    The Shewhart and exponentially weighted moving average (EWMA) charts for nonconformities are the most widely used procedures of choice for monitoring Poisson observations in modern industries. Individually, the Shewhart and EWMA charts are sensitive only to large and small shifts, respectively. To enhance the detection abilities of the two schemes in monitoring all kinds of shifts in Poisson count data, this study examines the performance of combined applications of the Shewhart and EWMA Poisson control charts. Furthermore, the study proposes modifications based on a well-structured statistical data collection technique, ranked set sampling (RSS), to detect shifts in the mean of a Poisson process more quickly. The relative performance of the proposed Shewhart-EWMA Poisson location charts is evaluated in terms of the average run length (ARL), standard deviation of the run length (SDRL), median run length (MRL), average ratio ARL (ARARL), average extra quadratic loss (AEQL) and performance comparison index (PCI). Consequently, all the new Poisson control charts based on the RSS method are generally superior to most of the existing schemes for monitoring Poisson processes. The use of these combined Shewhart-EWMA Poisson charts is illustrated with an example to demonstrate the practical implementation of the design procedure.
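
    A stripped-down version of the combined scheme: a Shewhart c-chart catches large shifts while an EWMA of the same counts catches small ones. The limits are standard textbook forms, the shift size is illustrative, and the study's RSS-based enhancements are not included.

```python
import numpy as np

rng = np.random.default_rng(2)
lam0, lam1 = 4.0, 6.5                      # in-control and shifted Poisson means
counts = np.concatenate([rng.poisson(lam0, 50), rng.poisson(lam1, 50)])

ucl_shewhart = lam0 + 3 * np.sqrt(lam0)    # 3-sigma c-chart limit
w = 0.2                                    # EWMA smoothing constant
ucl_ewma = lam0 + 3 * np.sqrt(lam0 * w / (2 - w))   # asymptotic EWMA limit

z = lam0
for t, x in enumerate(counts, start=1):
    z = w * x + (1 - w) * z                # EWMA update
    if x > ucl_shewhart:
        print(f"t={t}: Shewhart signal (x={x})"); break
    if z > ucl_ewma:
        print(f"t={t}: EWMA signal (z={z:.2f})"); break
```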

  8. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

    We present an asymptotic method giving a probability of presence of the iterated spots of R^d by a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems interesting for computing situations where deterministic methods fail. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes or Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.

  9. Poisson-Jacobi reduction of homogeneous tensors

    International Nuclear Information System (INIS)

    Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P

    2004-01-01

    The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case, a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N.

  10. Limitations of Poisson statistics in describing radioactive decay.

    Science.gov (United States)

    Sitek, Arkadiusz; Celler, Anna M

    2015-12-01

    The assumption that nuclear decays are governed by Poisson statistics is an approximation. This approximation becomes unjustified when data acquisition times longer than or even comparable with the half-lives of the radioisotope in the sample are considered. In this work, the limits of the Poisson-statistics approximation are investigated. The formalism for the statistics of radioactive decay based on the binomial distribution is derived. The theoretical factor describing the deviation of the variance of the number of decays predicted by the Poisson distribution from the true variance is defined and investigated for several commonly used radiotracers such as (18)F, (15)O, (82)Rb, (13)N, (99m)Tc, (123)I, and (201)Tl. The variance of the number of decays estimated using the Poisson distribution is significantly different from the true variance for a 5-minute observation time of (11)C, (15)O, (13)N, and (82)Rb. Durations of nuclear medicine studies are often relatively long; they may even be a few times longer than the half-lives of some short-lived radiotracers. Our study shows that in such situations Poisson statistics are unsuitable and should not be applied to describe the statistics of the number of decays in radioactive samples. However, the above statement does not directly apply to counting statistics at the level of event detection. The low sensitivities of the detectors used in imaging studies make the Poisson approximation near perfect. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
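
    The deviation factor discussed above follows directly from the binomial formalism: decays in time t among N0 nuclei are Binomial(N0, p) with p = 1 - exp(-lambda*t), so the true variance N0*p*(1-p) falls short of the Poisson value N0*p by the factor (1 - p). Evaluated for a 5-minute acquisition, with half-lives taken from standard tables:

```python
import math

half_life_s = {"F-18": 6586, "C-11": 1221, "O-15": 122, "N-13": 598, "Rb-82": 75}
t = 300.0                                    # 5-minute acquisition

for iso, th in half_life_s.items():
    p = 1 - math.exp(-math.log(2) / th * t)  # decay probability per nucleus
    # true (binomial) variance = N0*p*(1-p); Poisson approximation gives N0*p
    print(f"{iso:6s}: var_binomial / var_poisson = {1 - p:.3f}")
# Short-lived tracers such as Rb-82 give factors far below 1, matching the abstract.
```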

  11. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    Science.gov (United States)

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    The definition of plausible worst-case tsunamigenic scenarios plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17ºS to 24ºS. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model and published interseismic coupling (ISC) distributions. As a result, we find four zones of higher slip deficit, interpreted as major seismic asperities of the gap, which are used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes fluctuating between Mw 8.4 and Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All scenarios are simulated with the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five major coastal cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high resolution data of inundation depth, runup, coastal currents and sea level elevation.
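    A minimal sketch of the Karhunen-Loève idea used for the stochastic scenarios, assuming a one-dimensional along-strike slip profile and an invented exponential covariance (the study's fault geometry, covariance and mean slip are richer than this):

    ```python
    import numpy as np

    # Generate one random slip realisation via a truncated KL expansion of an
    # assumed exponential covariance. Grid, correlation length and mean slip
    # are illustrative stand-ins, not values from the paper.
    n, L, corr_len = 200, 600.0, 80.0        # points, fault length (km), corr. length
    x = np.linspace(0.0, L, n)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # covariance matrix
    eigval, eigvec = np.linalg.eigh(C)
    eigval = np.clip(eigval, 0.0, None)

    rng = np.random.default_rng(42)
    k = 50                                   # number of retained KL modes
    xi = rng.standard_normal(k)              # independent standard normal weights
    modes = eigvec[:, -k:] * np.sqrt(eigval[-k:])
    slip = 1.0 + modes @ xi                  # mean slip 1 plus KL fluctuation
    print(slip.min(), slip.max())
    ```

    Drawing new weight vectors xi yields an ensemble of correlated slip scenarios, which is how a few hundred stochastic sources can be produced cheaply.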

  12. Nonhomogeneous Poisson process with nonparametric frailty

    International Nuclear Information System (INIS)

    Slimacek, Vaclav; Lindqvist, Bo Henry

    2016-01-01

    The failure processes of heterogeneous repairable systems are often modeled by non-homogeneous Poisson processes. The common way to describe an unobserved heterogeneity between systems is to multiply the basic rate of occurrence of failures by a random variable (a so-called frailty) having a specified parametric distribution. Since the frailty is unobservable, the choice of its distribution is a problematic part of using these models, as are often the numerical computations needed in the estimation of these models. The main purpose of this paper is to develop a method for estimation of the parameters of a nonhomogeneous Poisson process with unobserved heterogeneity which does not require parametric assumptions about the heterogeneity and which avoids the frequently encountered numerical problems associated with the standard models for unobserved heterogeneity. The introduced method is illustrated on an example involving the power law process and on a real-life example, and is compared to the standard gamma frailty model and to the classical model without unobserved heterogeneity. The derived results are confirmed in a simulation study, which also reveals several not commonly known properties of the gamma frailty model and the classical model. - Highlights: • A new method for estimation of an NHPP with frailty is introduced. • The introduced method does not require parametric assumptions about the frailty. • The approach is illustrated on an example with the power law process. • The method is compared to the gamma frailty model and to the model without frailty.
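    A hedged sketch of the setting (not the estimation method itself): failure times of a power law process whose rate is multiplied by a gamma frailty draw can be simulated by inverting the cumulative intensity. All parameter values are invented:

    ```python
    import numpy as np

    # Simulate one system's failure times for a power law process with
    # cumulative intensity Lambda(t) = z * (t/theta)**beta, where z is an
    # unobserved frailty multiplying the basic rate of occurrence of failures.
    rng = np.random.default_rng(1)
    beta, theta, horizon = 1.8, 10.0, 50.0
    z = rng.gamma(shape=2.0, scale=0.5)        # e.g. one gamma frailty draw

    # Inversion: with S_1 < S_2 < ... unit-rate Poisson arrival times,
    # the failure times are T_i = Lambda^{-1}(S_i) = theta * (S_i / z)**(1/beta).
    times, s = [], 0.0
    while True:
        s += rng.exponential(1.0)
        t = theta * (s / z) ** (1.0 / beta)
        if t > horizon:
            break
        times.append(t)
    print(f"frailty z = {z:.2f}, {len(times)} failures before t = {horizon}")
    ```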

  13. Probabilistic vs linear blending approaches to shared control for wheelchair driving.

    Science.gov (United States)

    Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom

    2017-07-01

    Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.

  14. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space charge effects in high-intensity beams. The Poisson equation has to be solved at each time-step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of these numerical methods is O(N log N) or O(N) instead of O(N^2), where N is the total number of grid points used to solve the Poisson equation.
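    A minimal example of one such O(N log N) method, a spectral solve on a periodic grid via the FFT (the paper reviews several methods; this is only the simplest representative, with an invented source term):

    ```python
    import numpy as np

    # Spectral Poisson solve on a 2-D periodic grid: laplacian(phi) = -rho.
    # In Fourier space, -k^2 * phi_hat = -rho_hat, so phi_hat = rho_hat / k^2.
    n, L = 128, 1.0
    x = np.linspace(0.0, L, n, endpoint=False)
    X, Y = np.meshgrid(x, x, indexing="ij")
    rho = np.sin(2 * np.pi * X) * np.cos(4 * np.pi * Y)   # sample charge density

    kx = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    KX, KY = np.meshgrid(kx, kx, indexing="ij")
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                    # avoid division by zero for the mean mode

    phi_hat = np.fft.fft2(rho) / k2
    phi_hat[0, 0] = 0.0               # fix the arbitrary additive constant
    phi = np.real(np.fft.ifft2(phi_hat))
    print(phi.shape)
    ```

    The cost is dominated by two FFTs, hence O(N log N) in the total number of grid points, versus O(N^2) for a direct Green's function summation.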

  15. Decision making by hybrid probabilistic-possibilistic utility theory

    Directory of Open Access Journals (Sweden)

    Pap Endre

    2009-01-01

    An approach to decision theory based upon non-probabilistic uncertainty is presented. There is an axiomatization of the hybrid probabilistic-possibilistic mixtures based on a pair of triangular conorm and triangular norm satisfying the restricted distributivity law, and the corresponding non-additive S-measure. This is characterized by the families of operations involved in generalized mixtures, based upon a previous result on the characterization of the pair of continuous t-norm and t-conorm such that the former is restrictedly distributive over the latter. The obtained family of mixtures combines probabilistic and idempotent (possibilistic) mixtures via a threshold.

  16. Seasonally adjusted birth frequencies follow the Poisson distribution.

    Science.gov (United States)

    Barra, Mathias; Lindstrøm, Jonas C; Adams, Samantha S; Augestad, Liv A

    2015-12-15

    Variations in birth frequencies have an impact on activity planning in maternity wards. Previous studies of this phenomenon have commonly included elective births. A Danish study of spontaneous births found that birth frequencies were well modelled by a Poisson process. Somewhat unexpectedly, there were also weekly variations in the frequency of spontaneous births. Another study claimed that birth frequencies follow the Benford distribution. Our objective was to test these results. We analysed 50,017 spontaneous births at Akershus University Hospital in the period 1999-2014. To investigate the Poisson distribution of these births, we plotted their variance over a sliding average. We specified various Poisson regression models, with the number of births on a given day as the outcome variable. The explanatory variables included various combinations of years, months, days of the week and the digit sum of the date. The relationship between the variance and the average fits well with an underlying Poisson process. A Benford distribution was disproved by a goodness-of-fit test. Birth frequencies follow a Poisson process when monthly and day-of-the-week variation is included. The frequency is highest in summer towards June and July, Friday and Tuesday stand out as particularly busy days, and the activity level is at its lowest during weekends.
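    A minimal sketch of the kind of Poisson regression described, fitted to synthetic daily counts (the effect sizes and baseline rate below are invented; the actual study used the hospital's 1999-2014 records):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Synthetic daily birth counts with mild June/July and Tuesday/Friday
    # effects, then a Poisson GLM with month and day-of-week dummies.
    rng = np.random.default_rng(7)
    days = pd.date_range("1999-01-01", "2014-12-31", freq="D")
    month, weekday = days.month, days.dayofweek          # Monday = 0
    log_mu = (np.log(8.5)
              + 0.05 * np.isin(month, [6, 7])            # summer bump
              + 0.04 * np.isin(weekday, [1, 4]))         # Tuesday, Friday
    y = rng.poisson(np.exp(log_mu))

    X = pd.get_dummies(pd.DataFrame({"month": month.astype(str),
                                     "weekday": weekday.astype(str)}),
                       drop_first=True).astype(float)
    X = sm.add_constant(X)
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(fit.params.filter(like="weekday").round(3))
    ```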

  17. Modeling laser velocimeter signals as triply stochastic Poisson processes

    Science.gov (United States)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
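    As a hedged illustration of simulating such a process (not the paper's simulator), a doubly stochastic Poisson process can be generated by Lewis-Shedler thinning; the Gaussian "burst" below crudely stands in for a Doppler-burst intensity, and all numbers are invented:

    ```python
    import numpy as np

    # Doubly stochastic (Cox) Poisson process via thinning: the burst height
    # and arrival time are themselves random (the slower driving process).
    rng = np.random.default_rng(3)
    base, T, width = 50.0, 1.0, 0.05
    burst_amp = rng.gamma(4.0, 50.0)          # random burst height
    t0 = rng.uniform(0.2, 0.8)                # random burst arrival time

    def rate(t):
        return base + burst_amp * np.exp(-0.5 * ((t - t0) / width) ** 2)

    lam_max = base + burst_amp                # valid upper bound on rate(t)
    t, events = 0.0, []
    while t < T:
        t += rng.exponential(1.0 / lam_max)   # candidate from homogeneous process
        if t < T and rng.uniform() < rate(t) / lam_max:  # accept w.p. rate/lam_max
            events.append(t)
    print(len(events), "photon events")
    ```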

  18. A high order solver for the unbounded Poisson equation

    DEFF Research Database (Denmark)

    Hejlesen, Mads Mølholm; Rasmussen, Johannes Tophøj; Chatelain, Philippe

    2013-01-01

    A high order converging Poisson solver is presented, based on the Green's function solution to Poisson's equation subject to free-space boundary conditions. The high order convergence is achieved by formulating regularised integration kernels, analogous to a smoothing of the solution field. The method is extended to directly solve the derivatives of the solution to Poisson's equation. In this way differential operators such as the divergence or curl of the solution field can be solved to the same high order convergence without additional computational effort. The method is applied and validated, however not restricted, to the equations of fluid mechanics, and can be used in many applications to solve Poisson's equation on a rectangular unbounded domain.

  19. Some applications of the fractional Poisson probability distribution

    International Nuclear Information System (INIS)

    Laskin, Nick

    2009-01-01

    Physical and mathematical applications of the recently invented fractional Poisson probability distribution have been presented. As a physical application, a new family of quantum coherent states has been introduced and studied. As mathematical applications, we have developed the fractional generalization of Bell polynomials, Bell numbers, and Stirling numbers of the second kind. The appearance of fractional Bell polynomials is natural if one evaluates the diagonal matrix element of the evolution operator in the basis of newly introduced quantum coherent states. Fractional Stirling numbers of the second kind have been introduced and applied to evaluate the skewness and kurtosis of the fractional Poisson probability distribution function. A representation of the Bernoulli numbers in terms of fractional Stirling numbers of the second kind has been found. In the limit case when the fractional Poisson probability distribution becomes the Poisson probability distribution, all of the above listed developments and implementations turn into the well-known results of the quantum optics and the theory of combinatorial numbers.

  20. A comparison between Poisson and zero-inflated Poisson regression models with an application to number of black spots in Corriedale sheep

    Directory of Open Access Journals (Sweden)

    Rodrigues-Motta Mariana

    2008-07-01

    Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age-related processes seem to drive the number of dark spots in this breed of sheep.
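    A frequentist sketch of the Poisson vs ZIP comparison on synthetic zero-inflated counts (the study itself used Bayesian sire models, which are richer than this):

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    # Synthetic counts with 40% structural zeros and an invented age effect.
    rng = np.random.default_rng(11)
    n = 1000
    age = rng.uniform(2, 8, n)                         # hypothetical age at shearing
    mu = np.exp(-1.0 + 0.25 * age)
    y = rng.poisson(mu) * (rng.uniform(size=n) > 0.4)  # zero out 40% of counts

    X = sm.add_constant(age)
    pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0)
    print("Poisson AIC:", round(pois.aic, 1), " ZIP AIC:", round(zip_fit.aic, 1))
    ```

    With an excess of zeros, the ZIP fit should be preferred by AIC, mirroring the kind of model comparison the abstract describes.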

  1. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on such a precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. Extending the record of earthquakes of the past back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the

  2. Probabilistic risk assessment for six vapour intrusion algorithms

    NARCIS (Netherlands)

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition, a deterministic approach with default parameter sets is evaluated against

  3. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  4. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  5. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    Science.gov (United States)

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.

  6. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...

  7. Universal Poisson Statistics of mRNAs with Complex Decay Pathways.

    Science.gov (United States)

    Thattai, Mukund

    2016-01-19

    Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
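    The queueing mapping described above can be checked numerically: births form a Poisson process and each molecule survives for an arbitrary lifetime, yet the copy number at a fixed time remains Poisson with mean equal to birth rate times mean lifetime (the M/G/infinity result). A minimal sketch with invented parameters:

    ```python
    import numpy as np

    # mRNA "births" at rate lam; each molecule lives a lognormal (i.e. clearly
    # non-exponential) lifetime. Count survivors at time T across many cells.
    rng = np.random.default_rng(5)
    lam, T, n_cells = 2.0, 100.0, 5000
    mean_life = np.exp(0.0 + 0.5**2 / 2)            # E[lognormal(0, 0.5)]

    counts = np.empty(n_cells, dtype=int)
    for c in range(n_cells):
        n_births = rng.poisson(lam * T)
        births = rng.uniform(0.0, T, n_births)      # birth times given their count
        lifetimes = rng.lognormal(0.0, 0.5, n_births)
        counts[c] = np.sum(births + lifetimes > T)  # molecules still alive at T
    print(counts.mean(), counts.var(), "Poisson target:", lam * mean_life)
    ```

    The empirical mean and variance should nearly coincide (Fano factor close to 1), which is the Poisson signature the abstract says survives complex decay pathways.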

  8. Intertime jump statistics of state-dependent Poisson processes.

    Science.gov (United States)

    Daly, Edoardo; Porporato, Amilcare

    2007-01-01

    A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. Such a method uses the survivor function obtained by a modified version of the master equation associated to the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.

  9. A probabilistic approach for the interpretation of RNA profiles as cell type evidence.

    Science.gov (United States)

    de Zoete, Jacob; Curran, James; Sjerps, Marjan

    2016-01-01

    DNA profiles can be used as evidence to distinguish between possible donors of a crime stain. In some cases, both the prosecution and the defence claim that the cell material was left by the suspect but they dispute which cell type was left behind. For example, in sexual offense cases the prosecution could claim that the sample contains semen cells where the defence argues that the sample contains skin cells. In these cases, traditional methods (e.g. a phosphatase test) can be used to examine the cell type contained in the sample. However, there are some drawbacks when using these methods. For instance, many of these techniques need to be carried out separately for each cell type and each of them requires part of the available sample, which reduces the amount that can be used for DNA analysis. Another option is messenger RNA (mRNA) evidence. mRNA expression levels vary among cell types and can be used to make (probability) statements about the cell type(s) present in a sample. Existing methods for the interpretation of RNA profiles as evidence for the presence of certain cell types aim at making categorical statements. Such statements limit the possibility of reporting the associated uncertainty. Some of these existing methods will be discussed, most notably a method based on an 'n/2' scoring rule (Lindenbergh et al.) and a method using marker values and cell type scoring thresholds (Roeder et al.). From a statistical point of view, a probabilistic approach is the most obvious choice. Two approaches (multinomial logistic regression and naïve Bayes) are suggested. All methods are compared, using two different datasets and several criteria regarding their ability to assess the evidential value of RNA profiles. We conclude that both the naïve Bayes method and a method based on multinomial logistic regression, which produce a probabilistic statement as a measure of the evidential value, are an important improvement over the existing methods. Besides a better performance

  10. Cluster X-varieties, amalgamation, and Poisson-Lie groups

    DEFF Research Database (Denmark)

    Fock, V. V.; Goncharov, A. B.

    2006-01-01

    In this paper, starting from a split semisimple real Lie group G with trivial center, we define a family of varieties with additional structures. We describe them as cluster χ-varieties, as defined in [FG2]. In particular they are Poisson varieties. We define canonical Poisson maps of these varieties...

  11. A probabilistic bridge safety evaluation against floods.

    Science.gov (United States)

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
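    A minimal Monte Carlo reliability sketch in the spirit of the approach described (the limit state, distributions and numbers below are invented, and the study's response-surface acceleration is omitted):

    ```python
    import numpy as np

    # Toy limit state: failure when local scour depth exceeds pile embedment
    # capacity. Direct MCS estimate of the failure probability and its
    # sampling standard error.
    rng = np.random.default_rng(9)
    n = 1_000_000
    velocity = rng.lognormal(mean=1.0, sigma=0.25, size=n)   # flow velocity
    scour = 1.1 * velocity ** 1.5 + rng.normal(0.0, 0.3, n)  # local scour depth
    capacity = rng.normal(8.0, 0.8, n)                       # embedment capacity

    pf = np.mean(scour > capacity)                 # failure probability estimate
    se = np.sqrt(pf * (1 - pf) / n)                # MCS standard error
    print(f"Pf = {pf:.2e} +/- {se:.1e}")
    ```

    For small failure probabilities the required sample size grows quickly, which is why the paper trains a support-vector-machine response surface before sampling.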

  12. Poisson traces, D-modules, and symplectic resolutions.

    Science.gov (United States)

    Etingof, Pavel; Schedler, Travis

    2018-01-01

    We survey the theory of Poisson traces (or zeroth Poisson homology) developed by the authors in a series of recent papers. The goal is to understand this subtle invariant of (singular) Poisson varieties, conditions for it to be finite-dimensional, its relationship to the geometry and topology of symplectic resolutions, and its applications to quantizations. The main technique is the study of a canonical D-module on the variety. In the case the variety has finitely many symplectic leaves (such as for symplectic singularities and Hamiltonian reductions of symplectic vector spaces by reductive groups), the D-module is holonomic, and hence, the space of Poisson traces is finite-dimensional. As an application, there are finitely many irreducible finite-dimensional representations of every quantization of the variety. Conjecturally, the D-module is the pushforward of the canonical D-module under every symplectic resolution of singularities, which implies that the space of Poisson traces is dual to the top cohomology of the resolution. We explain many examples where the conjecture is proved, such as symmetric powers of du Val singularities and symplectic surfaces and Slodowy slices in the nilpotent cone of a semisimple Lie algebra. We compute the D-module in the case of surfaces with isolated singularities and show it is not always semisimple. We also explain generalizations to arbitrary Lie algebras of vector fields, connections to the Bernstein-Sato polynomial, relations to two-variable special polynomials such as Kostka polynomials and Tutte polynomials, and a conjectural relationship with deformations of symplectic resolutions. In the appendix we give a brief recollection of the theory of D-modules on singular varieties that we require.

  14. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  15. Particle-wave discrimination in Poisson spot experiments

    International Nuclear Information System (INIS)

    Reisinger, T; Bracco, G; Holst, B

    2011-01-01

    Matter-wave interferometry has been used extensively over the last few years to demonstrate the quantum-mechanical wave nature of increasingly larger and more massive particles. We have recently suggested the use of the historical Poisson spot setup to test the diffraction properties of larger objects. In this paper, we present the results of a classical particle van der Waals (vdW) force model for a Poisson spot experimental setup and compare these to Fresnel diffraction calculations with a vdW phase term. We include the effect of disc-edge roughness in both models. Calculations are performed with D2 and with C70 using realistic parameters. We find that the sensitivity of the on-axis interference/focus spot to disc-edge roughness is very different in the two cases. We conclude that by measuring the intensity on the optical axis as a function of disc-edge roughness, it can be determined whether the objects behave as de Broglie waves or classical particles. The scaling of the Poisson spot experiment to larger molecular masses is, however, not as favorable as in the case of near-field light-grating-based interferometers. Instead, we discuss the possibility of studying the Casimir-Polder potential using the Poisson spot setup.

  16. Poisson cohomology of scalar multidimensional Dubrovin-Novikov brackets

    Science.gov (United States)

    Carlet, Guido; Casati, Matteo; Shadrin, Sergey

    2017-04-01

    We compute the Poisson cohomology of a scalar Poisson bracket of Dubrovin-Novikov type with D independent variables. We find that the second and third cohomology groups are generically non-vanishing in D > 1. Hence, in contrast with the D = 1 case, the deformation theory in the multivariable case is non-trivial.

  17. Quantum algebras and Poisson geometry in mathematical physics

    CERN Document Server

    Karasev, M V

    2005-01-01

    This collection presents new and interesting applications of Poisson geometry to some fundamental well-known problems in mathematical physics. The methods used by the authors include, in addition to advanced Poisson geometry, unexpected algebras with non-Lie commutation relations, nontrivial (quantum) Kählerian structures of hypergeometric type, dynamical systems theory, semiclassical asymptotics, etc.

  18. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    The weakest pre-expectation calculus has been proved to be a mature theory to analyze quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems including distributed ones and unstructured programs. We present the operational and weakest-precondition semantics for these programs and prove their equivalence.
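    A toy illustration of the backwards fixed-point idea (far simpler than the paper's abstract-interpretation machinery): the expected number of flips until a p-biased coin shows heads satisfies E = 1 + (1 - p) E, and iterating this equation converges to the fixpoint 1/p:

    ```python
    # Expected running time of: while (no heads yet) { flip a p-biased coin }
    # computed by iterating the backwards equation E <- 1 + (1 - p) * E.
    p = 0.25          # hypothetical success probability
    E = 0.0
    for _ in range(10_000):
        E_next = 1.0 + (1.0 - p) * E
        if abs(E_next - E) < 1e-12:
            break
        E = E_next
    print(E, "analytic:", 1.0 / p)   # converges to 1/p = 4 flips
    ```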

  19. Estimating Bird / Aircraft Collision Probabilities and Risk Utilizing Spatial Poisson Processes

    Science.gov (United States)

    2012-06-10

    Graduate research paper presented to the faculty of the Department of Operational Sciences by Brady J. Vaira, BS, MS, Major, USAF. (Only title-page material is available for this record.)

  20. Poisson structures for reduced non-holonomic systems

    International Nuclear Information System (INIS)

    Ramos, Arturo

    2004-01-01

    Borisov, Mamaev and Kilin have recently found certain Poisson structures with respect to which the reduced and rescaled systems of certain non-holonomic problems, involving rolling bodies without slipping, become Hamiltonian, the Hamiltonian function being the reduced energy. We further study the algebraic origin of these Poisson structures, showing that they are of rank 2 and that therefore the mentioned rescaling is not necessary. We show that they are determined, up to a non-vanishing factor function, by the existence of a system of first-order differential equations providing two integrals of motion. We generalize the form of the Poisson structures and extend their domain of definition. We apply the theory to the rolling disc, Routh's sphere, the ball rolling on a surface of revolution, and its special case of a ball rolling inside a cylinder.

  1. High order Poisson Solver for unbounded flows

    DEFF Research Database (Denmark)

    Hejlesen, Mads Mølholm; Rasmussen, Johannes Tophøj; Chatelain, Philippe

    2015-01-01

    This paper presents a high order method for solving the unbounded Poisson equation on a regular mesh using a Green's function solution. The high order convergence was achieved by formulating mollified integration kernels that were derived from a filter regularisation of the solution field. The method was implemented on a rectangular domain using fast Fourier transforms (FFT) to increase computational efficiency. The Poisson solver was extended to directly solve the derivatives of the solution. This is achieved either by including the differential operator in the integration kernel or by ... The method is applied to the equations of fluid mechanics as an example, but can be used in many physical problems to solve the Poisson equation on a rectangular unbounded domain. For the two-dimensional case we propose an infinitely smooth test function which allows for arbitrary high order convergence. Using Gaussian smoothing...

  2. Simple probabilistic approach to evaluate radioiodine behavior at severe accidents: application to Phebus test FPT1

    International Nuclear Information System (INIS)

    Rydl, A.

    2007-01-01

    The contribution of radioiodine to the risk from a severe accident is recognized to be one of the highest among all the fission products. In the long term (e.g. several days), volatile species of iodine are the most important forms of iodine from the safety point of view. These volatile forms ('volatile iodine') are mainly molecular iodine, I 2 , and various types of organic iodides, RI. A certain controversy exists today among the international research community about the relative importance of the processes leading to volatile iodine formation in containment under severe accident conditions. The amount of knowledge of the phenomenology of iodine behavior coming from experiments is enormous, and it is embedded in specialized mechanistic or empirical codes. An exhaustive description of the processes governing the iodine behavior in containment is given in reference 1. Yet, all this knowledge is still not enough to resolve some important questions. Moreover, the results of different codes, when applied to relatively simple experiments such as RTF or CAIMAN, vary widely. Thus, as a complement (or maybe even as an alternative in some instances) to deterministic analyses of iodine behavior, a simple probabilistic approach is proposed in this work which could help to see the whole problem in a different perspective. The final goal of using this approach should be the characterization of the uncertainties of the description of the various processes in question. This would allow for identification of the processes which contribute most significantly to the overall uncertainty of the predictions of iodine volatility in containment. In this work we made a dedicated, small event tree to describe iodine behavior during an accident and we used that tree for a simple sensitivity study. For the evaluation of the tree, the US NRC code EVNTRE was used. To test the proposed probabilistic approach we analyzed results of the integral PHEBUS FPT1 experiment which comprises most of the important
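    A minimal sketch of evaluating a small event tree. The branch names and probabilities below are invented, and, unlike a real tree (or EVNTRE), branches here are treated as independent rather than conditioned on the path taken:

    ```python
    from itertools import product

    # Three illustrative yes/no branch points with made-up probabilities.
    branch_probs = {
        "I2_formed_in_sump": 0.6,
        "organic_iodide_formed": 0.3,
        "retained_by_paint": 0.5,
    }

    # Enumerate all paths; each end state's probability is the product of its
    # branch probabilities (independence assumed for brevity).
    end_states = {}
    for outcome in product([True, False], repeat=len(branch_probs)):
        p = 1.0
        for (name, prob), taken in zip(branch_probs.items(), outcome):
            p *= prob if taken else (1.0 - prob)
        end_states[outcome] = p

    print(sum(end_states.values()))    # end-state probabilities sum to 1
    ```

    A sensitivity study of the kind described then amounts to perturbing individual branch probabilities and observing how the end-state distribution shifts.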

  3. Solving probabilistic inverse problems rapidly with prior samples

    NARCIS (Netherlands)

    Käufl, Paul; Valentine, Andrew P.; de Wit, Ralph W.; Trampert, Jeannot

    2016-01-01

    Owing to the increasing availability of computational resources, in recent years the probabilistic solution of non-linear, geophysical inverse problems by means of sampling methods has become increasingly feasible. Nevertheless, we still face situations in which a Monte Carlo approach is not

  4. Impact of external events on site evaluation: a probabilistic approach

    International Nuclear Information System (INIS)

    Jaccarino, E.; Giuliani, P.; Zaffiro, C.

    1975-01-01

    A probabilistic method is proposed for the definition of the reference external events for nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event it is possible to perform statistical analyses to determine the probability of occurrence on site of events of given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of the areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed, and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, the statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant which must be installed on the site. Thus the reference event could be chosen such that for the whole of the national territory the risk for safety and environmental protection is the same. (author)
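    The abstract does not give its formulas, but under the common assumption that site events exceeding a given intensity arrive as a Poisson process with annual rate λ, the occurrence probability over an exposure period of t years takes the familiar form

    ```latex
    P(\text{at least one event in } t \text{ years}) = 1 - e^{-\lambda t},
    ```

    so fixing an acceptable probability for a plant's lifetime determines the annual rate, and hence the severity, of the reference event.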

  5. Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate-risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and the scale-up necessary for achieving climate-risk development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focused group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  6. A tiered approach for probabilistic ecological risk assessment of contaminated sites; Un approccio multilivello per l'analisi probabilistica di rischio ecologico di siti contaminati

    Energy Technology Data Exchange (ETDEWEB)

    Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia

    2005-09-15

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through a comparison of the probabilistic distributions that describe the exposure values and the toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic. [Italian] The aim of the present study is to provide a procedure for the ecological risk analysis of contaminated sites based on successive tiers of refinement. The proposed approach, starting from the simple deterministic ratio between an exposure level and an effect value that safeguards the largest possible number of species in the ecosystem considered, proceeds through the comparison of the statistical distributions of the exposure values and of the species sensitivities, and finally determines the probabilistic distribution of the risk quotient. To illustrate the proposed methodology, the concentrations of 1,2,4-trichlorobenzene measured in the soil of a contaminated industrial site were used to carry out the risk analysis for terrestrial species. The use of probabilistic distributions made it possible to associate the risk, initially

  7. Studies on a Double Poisson-Geometric Insurance Risk Model with Interference

    Directory of Open Access Journals (Sweden)

    Yujuan Huang

    2013-01-01

    This paper mainly studies a generalized double Poisson-Geometric insurance risk model. Using a martingale and stopping-time approach, we obtain the adjustment coefficient equation, the Lundberg inequality, and the formula for the ruin probability. The Laplace transform of the time when the surplus first reaches a given level is also discussed, and its expectation and variance are obtained. Finally, we give numerical examples.

  8. Probabilistic seismic hazards: Guidelines and constraints in evaluating results

    International Nuclear Information System (INIS)

    Sadigh, R.K.; Power, M.S.

    1989-01-01

    In conducting probabilistic seismic hazard analyses, consideration of the dispersion as well as the upper bounds on ground motion is of great significance. In particular, the truncation of ground motion levels at some upper limit would have a major influence on the computed hazard at low-to-very-low probability levels. Additionally, other deterministic guidelines and constraints should be considered in evaluating probabilistic seismic hazard results. In contrast to probabilistic seismic hazard evaluations, mean-plus-one-standard-deviation ground motions are typically used for deterministic estimates of ground motions from maximum events that may affect a structure. To be consistent with standard deterministic practice, maximum estimates of ground motion values should be the highest level considered for the site. These maximum values should be associated with the largest possible event occurring at the site. Furthermore, the relationship between the ground motion level and the probability of exceedance should reflect a transition from purely probabilistic assessments of ground motion at high probability levels, where there are multiple chances for events, to a deterministic upper-bound ground motion at very low probability levels, where there is very limited opportunity for maximum events to occur. In interplate regions, where the seismic sources may be characterized by a high-to-very-high rate of activity, the deterministic bounds will be approached or exceeded by the computed probabilistic hazard values at annual probability of exceedance levels typically as high as 10^-2 to 10^-3. Thus, at these or lower probability levels, probabilistically computed hazard values could be readily interpreted in the light of the deterministic constraints.

  9. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  10. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  11. Probabilistic and possibilistic approach for assessment of radiological risk due to organically bound and tissue free water tritium

    International Nuclear Information System (INIS)

    Dahiya, Sudhir; Hegde, A.G.; Joshi, M.L.; Verma, P.C.; Kushwaha, H.S.

    2006-01-01

    This study illustrates the use of two approaches, namely a probabilistic approach using Monte Carlo simulation (MCS) and a possibilistic approach using the fuzzy α-cut (FAC) technique, to estimate the radiological cancer risk to the population from ingestion of organically bound tritium (OBT) and tissue free water tritium (TFWT) through fish consumption from the Rana Pratap Sagar Lake (RPSL), Kota. Using the FAC technique, the radiological cancer risk rates (year^-1) at the α = 1.0 level were 1.15E-08 and 1.50E-09 for OBT and TFWT, respectively, from the fish ingestion pathway. Using the MCS approach, the radiological cancer risk rates (year^-1) at the 50th percentile (median) level are 1.14E-08 and 1.49E-09 for OBT and HTO, respectively, from ingestion of fresh water fish. (author)
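    A minimal sketch of fuzzy α-cut propagation (all numbers invented, and the real assessment chains several dose factors): triangular fuzzy inputs are reduced to intervals at each α level and combined by interval arithmetic, with α = 1.0 recovering the point estimate:

    ```python
    # Alpha-cut of a triangular fuzzy number (lo, mode, hi): at level alpha the
    # membership function yields the interval below; alpha = 1 gives the mode.
    def alpha_cut(tri, alpha):
        lo, mode, hi = tri
        return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

    intake = (0.8, 1.0, 1.3)        # e.g. relative OBT intake via fish
    risk_factor = (0.5, 1.0, 1.6)   # e.g. relative risk per unit intake

    for alpha in (0.0, 0.5, 1.0):
        i_lo, i_hi = alpha_cut(intake, alpha)
        r_lo, r_hi = alpha_cut(risk_factor, alpha)
        # Product of positive intervals: endpoints multiply directly.
        print(f"alpha={alpha}: risk in [{i_lo * r_lo:.2f}, {i_hi * r_hi:.2f}]")
    ```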

  12. Exact solution for the Poisson field in a semi-infinite strip.

    Science.gov (United States)

    Cohen, Yossi; Rothman, Daniel H

    2017-04-01

    The Poisson equation is associated with many physical processes. Yet exact analytic solutions for the two-dimensional Poisson field are scarce. Here we derive an analytic solution for the Poisson equation with constant forcing in a semi-infinite strip. We provide a method that can be used to solve the field in other intricate geometries. We show that the Poisson flux reveals an inverse square-root singularity at a tip of a slit, and identify a characteristic length scale in which a small perturbation, in a form of a new slit, is screened by the field. We suggest that this length scale expresses itself as a characteristic spacing between tips in real Poisson networks that grow in response to fluxes at tips.

  13. Identifying traffic accident black spots with Poisson-Tweedie models

    DEFF Research Database (Denmark)

    Debrabant, Birgit; Halekoh, Ulrich; Bonat, Wagner Hugo

    2018-01-01

    This paper aims at the identification of black spots for traffic accidents, i.e. locations with accident counts beyond what is usual for similar locations, using spatially and temporally aggregated hospital records from Funen, Denmark. Specifically, we apply an autoregressive Poisson-Tweedie model... for the considered calendar years and calculated by simulation a probability of p = 0.03 that these are chance findings. Altogether, our results recommend these sites for further investigation and suggest that our simple approach could play a role in future area-based traffic accident prevention planning.

  14. A Method of Poisson's Ratio Imaging Within a Material Part

    Science.gov (United States)

    Roth, Don J. (Inventor)

    1994-01-01

    The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the data.
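    The patent abstract does not give the conversion, but for an isotropic solid the standard elasticity relation between Poisson's ratio and the longitudinal (V_p) and shear (V_s) wave speeds, which an imaging method of this kind plausibly applies pointwise to the two transducer data sets, is

    ```latex
    \nu = \frac{V_p^2 - 2 V_s^2}{2\left(V_p^2 - V_s^2\right)}.
    ```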

  15. Fractional Poisson: a simple dose-response model for human norovirus.

    Science.gov (United States)

    Messner, Michael J; Berger, Philip; Nappier, Sharon P

    2014-10-01

    This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation, in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels, the predictions of the two models diverge; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures. © 2014 Society for Risk Analysis. Published 2014. This article is a U.S. Government work and is in the public domain for the U.S.A.
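    One plausible rendering of the model as the abstract describes it (symbols are my notation, not necessarily the paper's): with f the fraction of perfectly susceptible hosts, d the mean ingested dose and μ the mean aggregate size, the mean number of ingested aggregates is d/μ, so

    ```latex
    P_{\text{infection}}(d) = f \left( 1 - e^{-d/\mu} \right),
    ```

    the product of the fraction of susceptible hosts and the Poisson probability of ingesting at least one aggregate.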

  16. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Science.gov (United States)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  17. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  18. The role of probabilistic safety assessment in the design

    International Nuclear Information System (INIS)

    Green, A.; Ingham, E.L.

    1989-01-01

    The use of probabilistic safety assessment (PSA) for Heysham 2 and Torness marked a major change in the design approach to nuclear safety within the U.K. Design Safety Guidelines incorporating probabilistic safety targets required that design justification would necessitate explicit consideration of the consequence of accidents in relation to their frequency. The paper discusses these safety targets and their implications, the integration of PSA into the design process and an outline of the methodology. The influence of PSA on the design is discussed together with its role in the overall demonstration of reactor safety. (author)

  19. Action-angle variables and a KAM theorem for b-Poisson manifolds

    OpenAIRE

    Kiesenhofer, Anna; Miranda Galcerán, Eva; Scott, Geoffrey

    2015-01-01

    In this article we prove an action-angle theorem for b-integrable systems on b-Poisson manifolds improving the action-angle theorem contained in [14] for general Poisson manifolds in this setting. As an application, we prove a KAM-type theorem for b-Poisson manifolds. (C) 2015 Elsevier Masson SAS. All rights reserved.

  20. Probabilistic flood damage modelling at the meso-scale

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.

  1. A Methodology for Probabilistic Accident Management

    International Nuclear Information System (INIS)

    Munteanu, Ion; Aldemir, Tunc

    2003-01-01

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management and none that can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation that models the system evolution in terms of probability of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling uncertainties through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated
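
    As a rough illustration of the cell-to-cell mapping idea described above -- transition probabilities between user-defined magnitude intervals, estimated from a system model and propagated over sampling intervals -- here is a toy one-dimensional sketch; the dynamics, noise level, and cell grid are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D state (e.g., a pressurizer level) on [0, 1], partitioned into cells.
edges = np.linspace(0.0, 1.0, 11)          # 10 cells
n = len(edges) - 1

def step(x):
    """Hypothetical one-sampling-interval dynamics with process noise."""
    return np.clip(0.95 * x + 0.02 + rng.normal(0.0, 0.02, size=x.shape), 0, 1)

# Estimate the cell-to-cell transition matrix by sampling each source cell.
T = np.zeros((n, n))
for i in range(n):
    x0 = rng.uniform(edges[i], edges[i + 1], size=5000)
    idx = np.clip(np.digitize(step(x0), edges) - 1, 0, n - 1)
    T[i] = np.bincount(idx, minlength=n) / len(idx)

# Propagate an uncertain initial state and rank the likely cells.
p = np.zeros(n); p[2] = 1.0                 # monitored state known to cell 2
for _ in range(20):                          # 20 sampling intervals
    p = p @ T
print("most likely cells:", np.argsort(p)[::-1][:3])
```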

  2. Probabilistic language models in cognitive neuroscience: Promises and pitfalls.

    Science.gov (United States)

    Armeni, Kristijan; Willems, Roel M; Frank, Stefan L

    2017-12-01

    Cognitive neuroscientists of language comprehension study how neural computations relate to cognitive computations during comprehension. On the cognitive part of the equation, it is important that the computations and processing complexity are explicitly defined. Probabilistic language models can be used to give a computationally explicit account of language complexity during comprehension. Whereas such models have so far predominantly been evaluated against behavioral data, only recently have the models been used to explain neurobiological signals. Measures obtained from these models emphasize the probabilistic, information-processing view of language understanding and provide a set of tools that can be used for testing neural hypotheses about language comprehension. Here, we provide a cursory review of the theoretical foundations and example neuroimaging studies employing probabilistic language models. We highlight the advantages and potential pitfalls of this approach and indicate avenues for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Non-Poisson Processes: Regression to Equilibrium Versus Equilibrium Correlation Functions

    Science.gov (United States)

    2004-07-07

    Physica A 347 (2005) 268-288. Keywords: stochastic processes; non-Poisson processes; Liouville and Liouville-like equations; correlation functions. Only a fragment of the abstract is recoverable: "...which is not legitimate with renewal non-Poisson processes, is a correct property if the deviation from the exponential relaxation is obtained by time..."

  4. Multi-parameter full waveform inversion using Poisson

    KAUST Repository

    Oh, Juwon

    2016-07-21

    In multi-parameter full waveform inversion (FWI), the success of recovering each parameter is dependent on characteristics of the partial derivative wavefields (or virtual sources), which differ according to parameterisation. Elastic FWIs based on the two conventional parameterisations (one uses Lame constants and density; the other employs P- and S-wave velocities and density) have low resolution of gradients for P-wave velocities (or λ). Limitations occur because the virtual sources for P-wave velocity or λ (one of the Lame constants) are related only to P-P diffracted waves, and generate isotropic explosions, which reduce the spatial resolution of the FWI for these parameters. To increase the spatial resolution, we propose a new parameterisation using P-wave velocity, Poisson's ratio, and density for frequency-domain multi-parameter FWI for isotropic elastic media. By introducing Poisson's ratio instead of S-wave velocity, the virtual source for the P-wave velocity generates P-S and S-S diffracted waves as well as P-P diffracted waves in the partial derivative wavefields for the P-wave velocity. Numerical examples of the cross-triangle-square (CTS) model indicate that the new parameterisation provides highly resolved descent directions for the P-wave velocity. Numerical examples of noise-free and noisy data synthesised for the elastic Marmousi-II model support the fact that the new parameterisation is more robust to noise than the two conventional parameterisations.

  5. Study on two-dimensional POISSON design of large-scale FFAG magnet

    International Nuclear Information System (INIS)

    Ouyang Huafu

    2006-01-01

    In order to decrease the edge effect of the field, the designed magnetic field distribution in a large-scale FFAG magnet is realized by both the trim coil and the shape of the magnet pole-face. Through two-dimensional POISSON simulations, the current and position of the trim coil and the shape of the magnet pole are determined. In order to facilitate the POISSON design, two codes are written to automatically adjust the current and the position of the trim coil and the shape of the magnet pole-face in the POISSON input file. With the two codes, the efficiency of POISSON simulations is improved and the mistakes which might occur in writing and adjusting the POISSON input file manually can be avoided. (authors)

  6. A self-consistent phase-field approach to implicit solvation of charged molecules with Poisson-Boltzmann electrostatics.

    Science.gov (United States)

    Sun, Hui; Wen, Jiayi; Zhao, Yanxiang; Li, Bo; McCammon, J Andrew

    2015-12-28

    Dielectric boundary based implicit-solvent models provide efficient descriptions of coarse-grained effects, particularly the electrostatic effect, of aqueous solvent. Recent years have seen the initial success of a new such model, variational implicit-solvent model (VISM) [Dzubiella, Swanson, and McCammon Phys. Rev. Lett. 96, 087802 (2006) and J. Chem. Phys. 124, 084905 (2006)], in capturing multiple dry and wet hydration states, describing the subtle electrostatic effect in hydrophobic interactions, and providing qualitatively good estimates of solvation free energies. Here, we develop a phase-field VISM for the solvation of charged molecules in aqueous solvent to include more flexibility. In this approach, a stable equilibrium molecular system is described by a phase field that takes one constant value in the solute region and a different constant value in the solvent region, and smoothly changes its value on a thin transition layer representing a smeared solute-solvent interface or dielectric boundary. Such a phase field minimizes an effective solvation free-energy functional that consists of the solute-solvent interfacial energy, solute-solvent van der Waals interaction energy, and electrostatic free energy described by the Poisson-Boltzmann theory. We apply our model and methods to the solvation of single ions, two parallel plates, and protein complexes BphC and p53/MDM2 to demonstrate the capability and efficiency of our approach at different levels. With a diffuse dielectric boundary, our new approach can describe the dielectric asymmetry in the solute-solvent interfacial region. Our theory is developed based on rigorous mathematical studies and is also connected to the Lum-Chandler-Weeks theory (1999). We discuss these connections and possible extensions of our theory and methods.
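
    The electrostatic ingredient of such models is a Poisson-Boltzmann solve. As a self-contained illustration of that ingredient only (not the paper's coupled phase-field functional), the following sketch solves the linearized, nondimensionalized Poisson-Boltzmann equation in one dimension by finite differences and checks it against the expected screened exponential decay:

```python
import numpy as np

# Linearized Poisson-Boltzmann (Debye-Hueckel) in 1-D:
#   -phi'' + kappa^2 * phi = 0 on (0, L),  phi(0) = phi0,  phi(L) = 0.
# Finite differences give a tridiagonal system; the solution should track
# the analytic screened decay phi0 * exp(-kappa * x) when kappa*L is large.
L, N, kappa, phi0 = 10.0, 200, 1.0, 1.0
x = np.linspace(0.0, L, N)
h = x[1] - x[0]

A = np.zeros((N, N))
b = np.zeros(N)
A[0, 0] = A[-1, -1] = 1.0
b[0] = phi0                          # Dirichlet data at the "solute" wall
for i in range(1, N - 1):
    A[i, i - 1] = A[i, i + 1] = -1.0 / h**2
    A[i, i] = 2.0 / h**2 + kappa**2

phi = np.linalg.solve(A, b)
print(np.max(np.abs(phi - phi0 * np.exp(-kappa * x))))  # small for large L
```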

  7. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of nuclear piping systems has to be maintained during operation. In order to maintain the integrity, reliable assessment procedures including fracture mechanics analysis, etc., are required. Up to now, this has been performed using conventional deterministic approaches even though there are many uncertainties that hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability index and simulation techniques was developed and applied to evaluate failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and combined loading of them. The sensitivity analysis results as well as prototypal integrity assessment results showed promising applicability of the probabilistic assessment program, the necessity of practical evaluation reflecting combined loading conditions, and operation under limited conditions
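
    The simulation-technique side of such a program can be illustrated by a direct Monte Carlo evaluation of a limit state. The sketch below is generic, not the paper's model: the distributions, the thin-shell hoop-stress limit state, and all numbers are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Illustrative random variables (units: mm, MPa).
t  = rng.normal(6.0, 0.6, n)         # wall thickness, thinned and uncertain
p  = rng.normal(10.0, 0.5, n)        # internal pressure
sf = rng.normal(300.0, 20.0, n)      # flow stress of the pipe material
R  = 150.0                           # mean pipe radius (deterministic here)

# Limit state g < 0 means failure: hoop stress p*R/t exceeds flow stress.
g = sf - p * R / t
pf = np.mean(g < 0.0)
print(f"estimated failure probability: {pf:.2e}")
```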

  8. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  9. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  10. Boundary Lax pairs from non-ultra-local Poisson algebras

    International Nuclear Information System (INIS)

    Avan, Jean; Doikou, Anastasia

    2009-01-01

    We consider non-ultra-local linear Poisson algebras on a continuous line. Suitable combinations of representations of these algebras yield representations of novel generalized linear Poisson algebras or 'boundary' extensions. They are parametrized by a boundary scalar matrix and depend, in addition, on the choice of an antiautomorphism. The new algebras are the classical-linear counterparts of the known quadratic quantum boundary algebras. For any choice of parameters, the non-ultra-local contribution of the original Poisson algebra disappears. We also systematically construct the associated classical Lax pair. The classical boundary principal chiral model is examined as a physical example.

  11. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving the probabilistic problem. It focuses on analyzing and describing students' errors during problem solving. This research used the qualitative method with a case study strategy. The subjects in this research were ten students of 9th grade who were selected by purposive sampling. Data in this research comprise students' probabilistic problem-solving results and recorded interviews regarding students' difficulties in solving the problem. Those data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving the probabilistic problem can be divided into three categories. The first difficulties relate to understanding the probabilistic problem. The second are students' difficulties in choosing and using appropriate strategies for solving the problem. The third are students' difficulties with the computational process in solving the problem. Based on the results, it seems that students still have difficulties in solving the probabilistic problem, meaning that students have not yet been able to use their knowledge and abilities to respond to probabilistic problems. Therefore, it is important for mathematics teachers to plan probabilistic learning which could optimize students' probabilistic thinking ability.

  12. Probabilistic Forecasting for On-line Operation of Urban Drainage Systems

    DEFF Research Database (Denmark)

    Löwe, Roland

    This thesis deals with the generation of probabilistic forecasts in urban hydrology. In particular, we focus on the case of runoff forecasting for real-time control (RTC) on horizons of up to two hours. For the generation of probabilistic on-line runoff forecasts, we apply the stochastic grey...... and forecasts have on on-line runoff forecast quality. Finally, we implement the stochastic grey-box model approach in a real-world real-time control (RTC) setup and study how RTC can benefit from a dynamic quantification of runoff forecast uncertainty....

  13. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide...... a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  14. The BRST complex of homological Poisson reduction

    Science.gov (United States)

    Müller-Lennert, Martin

    2017-02-01

    BRST complexes are differential graded Poisson algebras. They are associated with a coisotropic ideal J of a Poisson algebra P and provide a description of the Poisson algebra (P/J)^J as their cohomology in degree zero. Using the notion of stable equivalence introduced in Felder and Kazhdan (Contemporary Mathematics 610, Perspectives in representation theory, 2014), we prove that any two BRST complexes associated with the same coisotropic ideal are quasi-isomorphic in the case P = R[V] where V is a finite-dimensional symplectic vector space and the bracket on P is induced by the symplectic structure on V. As a corollary, the cohomology of the BRST complexes is canonically associated with the coisotropic ideal J in the symplectic case. We do not require any regularity assumptions on the constraints generating the ideal J. We finally quantize the BRST complex rigorously in the presence of infinitely many ghost variables and discuss the uniqueness of the quantization procedure.

  15. Poisson's Ratio and Auxetic Properties of Natural Rocks

    Science.gov (United States)

    Ji, Shaocheng; Li, Le; Motra, Hem Bahadur; Wuttke, Frank; Sun, Shengsi; Michibayashi, Katsuyoshi; Salisbury, Matthew H.

    2018-02-01

    Here we provide an appraisal of the Poisson's ratios (υ) for natural elements, common oxides, silicate minerals, and rocks with the purpose of searching for naturally auxetic materials. The Poisson's ratios of equivalently isotropic polycrystalline aggregates were calculated from dynamically measured elastic properties. Alpha-cristobalite is currently the only known naturally occurring mineral that has exclusively negative υ values at 20-1,500°C. Quartz and potentially berlinite (AlPO4) display auxetic behavior in the vicinity of their α-β structure transition. None of the crystalline igneous and metamorphic rocks (e.g., amphibolite, gabbro, granite, peridotite, and schist) display auxetic behavior at pressures of >5 MPa and room temperature. Our experimental measurements showed that quartz-rich sedimentary rocks (i.e., sandstone and siltstone) are most likely to be the only rocks with negative Poisson's ratios at low confining pressures (≤200 MPa) because their main constituent mineral, α-quartz, already has extremely low Poisson's ratio (υ = 0.08) and they contain microcracks, micropores, and secondary minerals. This finding may provide a new explanation for formation of dome-and-basin structures in quartz-rich sedimentary rocks in response to a horizontal compressional stress in the upper crust.

  16. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  17. Estimation of a Non-homogeneous Poisson Model: An Empirical ...

    African Journals Online (AJOL)

    This article aims at applying the Nonhomogeneous Poisson process to trends of economic development. For this purpose, a modified Nonhomogeneous Poisson process is derived when the intensity rate is considered as a solution of stochastic differential equation which satisfies the geometric Brownian motion. The mean ...
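
    A process of this kind can be simulated by combining a discretized geometric Brownian motion path for the intensity with Ogata-style thinning; the sketch below makes that concrete (all parameter values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Intensity follows a geometric Brownian motion, discretized on a grid:
#   d(lambda) = mu*lambda dt + sigma*lambda dW
T, dt, mu, sigma, lam0 = 10.0, 0.001, 0.05, 0.2, 5.0
grid = np.arange(0.0, T + dt, dt)
dW = rng.normal(0.0, np.sqrt(dt), size=len(grid) - 1)
lam = lam0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * dW))
lam = np.concatenate([[lam0], lam])

# Thinning (Ogata): propose events from a homogeneous process at rate
# lam_max, accept each with probability lambda(t)/lam_max.
lam_max = lam.max()
t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)
    if t > T:
        break
    if rng.uniform() < np.interp(t, grid, lam) / lam_max:
        events.append(t)
print(f"{len(events)} events, mean intensity {lam.mean():.2f}")
```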

  18. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    Science.gov (United States)

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment
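
    The study itself couples finite element analysis with the advanced mean value and Monte Carlo methods; as a much cruder illustration of why friction dominates the preload distribution, one can push assumed input distributions through the short-form torque-preload relation F = T/(K d). Everything below (distributions, nut factor, target window) is an assumption for illustration, not the paper's calibrated model:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical input distributions -- not the paper's values.
T = rng.normal(0.25, 0.02, n)          # applied tightening torque, N*m
K = rng.normal(0.20, 0.03, n)          # nut factor (thread geometry + friction)
d = 0.002                              # nominal screw diameter, m

F = T / (K * d)                        # short-form torque-preload relation
lo, hi = 500.0, 750.0                  # hypothetical optimal preload window, N
print(f"mean preload {F.mean():.0f} N, "
      f"P(optimal) = {np.mean((F > lo) & (F < hi)):.2f}")
```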

  19. Probabilistic Capacity of a Grid connected Wind Farm

    DEFF Research Database (Denmark)

    Zhao, Menghua; Chen, Zhe; Blaabjerg, Frede

    2005-01-01

    This paper proposes a method to find the maximum acceptable wind power injection regarding the thermal limits, steady state stability limits and voltage limits of the grid system. The probabilistic wind power is introduced based on the probability distribution of wind speed. Based on Power Transfer...... Distribution Factor (PTDF) and voltage sensitivities, a predictor-corrector method is suggested to calculate the acceptable active power injection. Then this method is combined with the probabilistic model of wind power to compute the allowable capacity of the wind farm. Finally, an example is illustrated...... to test this method. It is concluded that proposed method in this paper is a feasible, fast, and accurate approach to find the size of a wind farm....

  20. Adaptive maximal poisson-disk sampling on surfaces

    KAUST Repository

    Yan, Dongming

    2012-01-01

    In this paper, we study the generation of maximal Poisson-disk sets with varying radii on surfaces. Based on the concepts of power diagram and regular triangulation, we present a geometric analysis of gaps in such disk sets on surfaces, which is the key ingredient of the adaptive maximal Poisson-disk sampling framework. Moreover, we adapt the presented sampling framework for remeshing applications. Several novel and efficient operators are developed for improving the sampling/meshing quality over the state-of-the-art. © 2012 ACM.
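
    The paper's contribution is the gap analysis that makes adaptive sampling maximal on surfaces; as background, plain (non-maximal, constant-radius) Poisson-disk sampling in 2-D can be sketched with simple dart throwing, which enforces the empty-disk criterion by rejection:

```python
import numpy as np

rng = np.random.default_rng(4)

def dart_throwing(r, n_trials=20000):
    """Plain 2-D Poisson-disk sampling in the unit square by rejection.

    Keeps a candidate only if it is at least r away from every accepted
    sample (the empty-disk criterion). Maximality -- covering every gap --
    would require the kind of gap analysis developed in the paper.
    """
    pts = []
    for _ in range(n_trials):
        c = rng.uniform(0.0, 1.0, size=2)
        if all(np.hypot(*(c - p)) >= r for p in pts):
            pts.append(c)
    return np.array(pts)

samples = dart_throwing(r=0.05)
print(len(samples), "samples accepted")
```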

  1. Seismic Probabilistic Risk Assessment (SPRA), approach and results

    International Nuclear Information System (INIS)

    Campbell, R.D.

    1995-01-01

    During the past 15 years there have been over 30 Seismic Probabilistic Risk Assessments (SPRAs) and Seismic Probabilistic Safety Assessments (SPSAs) conducted of Western Nuclear Power Plants, principally of US design. In this paper PRA and PSA are used interchangeably as the overall process is essentially the same. Some similar assessments have been done for reactors in Taiwan, Korea, Japan, Switzerland and Slovenia. These plants were also principally US supplied or built under US license. Since the restructuring of the governments in former Soviet Bloc countries, there has been grave concern regarding the safety of the reactors in these countries. To date there has been considerable activity in conducting partial seismic upgrades but the overall quantification of risk has not been pursued to the depth that it has in Western countries. This paper summarizes the methodology for Seismic PRA/PSA and compares results of two partially completed and two completed PRAs of Soviet designed reactors to results from earlier PRAs on US Reactors. A WWER 440 and a WWER 1000 located in low seismic activity regions have completed PRAs and results show the seismic risk to be very low for both designs. For more active regions, partially completed PRAs of a WWER 440 and WWER 1000 located at the same site show the WWER 440 to have much greater seismic risk than the WWER 1000 plant. The seismic risk from the 1000 MW plant compares with the high end of seismic risk for earlier seismic PRAs in the US. Just as for most US plants, the seismic risk appears to be less than the risk from internal events if risk is measured in terms of mean core damage frequency. However, due to the lack of containment for the earlier WWER 440s, the risk to the public may be significantly greater due to the more probable scenario of an early release. The studies reported have not taken the accident sequences beyond the stage of core damage hence the public health risk ratios are speculative. (author)

  2. Rate-optimal Bayesian intensity smoothing for inhomogeneous Poisson processes

    NARCIS (Netherlands)

    Belitser, E.N.; Serra, P.; van Zanten, H.

    2015-01-01

    We apply nonparametric Bayesian methods to study the problem of estimating the intensity function of an inhomogeneous Poisson process. To motivate our results we start by analyzing count data coming from a call center which we model as a Poisson process. This analysis is carried out using a certain

  3. Development of Nuclear Safety Culture evaluation method for an operation team based on the probabilistic approach

    International Nuclear Information System (INIS)

    Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun

    2018-01-01

    Highlights: • We proposed a Probabilistic Safety Culture Healthiness Evaluation Method. • A positive relationship between the ‘success’ states of NSC and performance was shown. • The state probability profile showed a unique ratio regardless of the scenarios. • Cutset analysis provided not only the root causes but also the latent causes of failures. • Pro-SCHEMe was found to be applicable to Korean NPPs. -- Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korean NPP utility company has conducted the NSC assessment according to international practice. However, most of these methods are conducted by interviews, observations, and self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on the evaluator’s judgement, so the assessment results can be interpreted from different perspectives. To resolve the limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide a faster evaluation process. This paper presents the Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe) to generate quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items which correspond to a basic event in PSA are derived in the first part of the paper through the literature survey; mostly from nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply evaluation items to PSA based on the relationships among such items. The Modeling Guidelines are also suggested to classify and calculate NSC characteristics of

  4. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world. This is mainly because of the great ability of this approach to achieve safe and reliable operation of research reactors. There is also a need to assist developing countries to apply Probabilistic Safety Assessment to existing nuclear facilities which are simpler and therefore less complicated to analyse than a large Nuclear Power Plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on so-called systems analysis aspects rather than the in-plant or ex-plant consequences

  5. Fractional Poisson process (II)

    International Nuclear Information System (INIS)

    Wang Xiaotian; Wen Zhixiong; Zhang Shiying

    2006-01-01

    In this paper, we propose a stochastic process W_H(t) (H ∈ (1/2, 1]) which we call the fractional Poisson process. The process W_H(t) is self-similar in the wide sense, displays long-range dependence, and has a fatter tail than a Gaussian process. In addition, it converges to fractional Brownian motion in distribution

  6. Nonlinear poisson brackets geometry and quantization

    CERN Document Server

    Karasev, M V

    2012-01-01

    This book deals with two old mathematical problems. The first is the problem of constructing an analog of a Lie group for general nonlinear Poisson brackets. The second is the quantization problem for such brackets in the semiclassical approximation (which is the problem of exact quantization for the simplest classes of brackets). These problems are progressively coming to the fore in the modern theory of differential equations and quantum theory, since the approach based on constructions of algebras and Lie groups seems, in a certain sense, to be exhausted. The authors' main goal is to describe in detail the new objects that appear in the solution of these problems. Many ideas of algebra, modern differential geometry, algebraic topology, and operator theory are synthesized here. The authors prove all statements in detail, thus making the book accessible to graduate students.

  7. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    Science.gov (United States)

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the models of noise often assumed in image denoising designs. A quantile analysis in pixel, wavelet transform, and variance stabilization domains reveal that the tails of Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch of tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmoothes real sensor data, we propose a mixture of Poisson denoising method to remove the denoising artifacts without affecting image details, such as edge and textures. Experiments with real sensor data verify that denoising for real image sensor data is indeed improved by this new technique.
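
    For context, the classical variance-stabilization route mentioned in the abstract is the Anscombe transform, which maps Poisson counts to approximately unit-variance Gaussian data so that off-the-shelf Gaussian denoisers apply; a minimal sketch (the trivial "denoiser" here is just an average, standing in for a real one):

```python
import numpy as np

def anscombe(x):
    """Anscombe transform: Poisson counts -> approximately unit-variance
    Gaussian, enabling off-the-shelf Gaussian denoisers."""
    return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (biased at low counts; unbiased inverses
    exist in the literature)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

rng = np.random.default_rng(5)
clean = np.full(10_000, 20.0)                 # constant "image"
noisy = rng.poisson(clean)
z = anscombe(noisy)
print(np.var(z))                              # ~1, variance stabilized
denoised = inverse_anscombe(z.mean())         # placeholder Gaussian denoise
print(denoised)                               # ~20
```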

  8. Radiation protection criteria for cases of probabilistic disruptive events

    International Nuclear Information System (INIS)

    Beninson, D.J.

    1985-01-01

    The individual risk limitation for the case of probabilistic disruptive events is studied for situations where the radiation effects cease to be only stochastic; the proposed criterion is applied to the case of high-level waste repositories. The optimization of protection results from differential cost-benefit analysis. More general procedures of decision theory that use probabilistically defined utility functions are considered for its calculation. These more general procedures can also be applied in cases where radiation exposures are only potential, to optimize the required level of safety features. It is shown that for disruptive events of low probability and large resulting consequences, the concept of 'expectation' of consequence cannot be used in decision making, but the use of probabilistically based utility functions can conceptually assure a consistent approach in deciding the required level of safety. The use of utility functions of logarithmic form to assign weights to consequences involving different loss of life is explored. (M.E.L.)

  9. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    Science.gov (United States)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management on the other hand typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a
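
    The second interpretation can be made concrete in a few lines: given an ensemble of model-predicted drawdowns at a bore, compare the probability of exceeding the 2 m trigger against an acceptable exceedance probability. The ensemble distribution and the acceptable probability below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical posterior ensemble of predicted drawdown at a bore (metres),
# e.g. from parameter-uncertainty runs of a groundwater model.
drawdown = rng.lognormal(mean=0.2, sigma=0.5, size=10_000)

threshold = 2.0            # NSW Aquifer Interference Policy trigger (m)
p_exceed = np.mean(drawdown > threshold)
p_accept = 0.5             # illustrative acceptable exceedance probability

print(f"P(drawdown > {threshold} m) = {p_exceed:.2f}")
print("trigger make-good provisions" if p_exceed > p_accept
      else "within acceptable risk")
```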

  10. Optimization-Based Approaches to Control of Probabilistic Boolean Networks

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2017-02-01

    Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, control theory of Boolean networks (BNs), which are well known as a model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods on optimal control of probabilistic Boolean networks (PBNs) are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained. The finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
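
    Before any optimal control question is posed, a PBN is just a Boolean network whose update functions are drawn at random at each step. A toy simulation sketch (the network, predictor functions, and selection probabilities are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# A tiny probabilistic Boolean network on 3 genes: each gene updates with
# one of its candidate predictor functions, chosen independently each step.
predictors = {
    0: [(lambda s: s[1] and s[2], 0.7), (lambda s: s[1], 0.3)],
    1: [(lambda s: not s[0], 1.0)],
    2: [(lambda s: s[0] or s[1], 0.6), (lambda s: s[2], 0.4)],
}

def step(state):
    new = []
    for fs in predictors.values():
        funcs, probs = zip(*fs)
        f = funcs[rng.choice(len(funcs), p=probs)]
        new.append(int(f(state)))
    return tuple(new)

# Estimate the long-run state distribution by simulation.
counts = {}
state = (1, 0, 1)
for _ in range(50_000):
    state = step(state)
    counts[state] = counts.get(state, 0) + 1
print(max(counts, key=counts.get), "is the most visited state")
```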

  11. Probabilistic finite elements with dynamic limit bounds; a case study: 17th street flood wall, New Orleans

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2008-01-01

    The probabilistic approach provides a better understanding of failure mechanisms and occurrence probabilities as well as consequences of failure. Besides, the main advantages of the probabilistic design in comparison with the deterministic design are: a more careful, more cost-effective, and more

  12. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018. Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015)
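
    The core computation in such a method can be sketched for the simplest case of a single breaking point: with a conjugate Gamma prior on the Poisson rate, each segment's marginal likelihood is available in closed form, and the posterior over the breakpoint location follows directly. This is our own minimal Python sketch, not the paper's Mathematica code:

```python
import numpy as np
from scipy.special import gammaln

def log_marginal(y, a=1.0, b=1.0):
    """log m(y) for i.i.d. Poisson(lambda) counts with lambda ~ Gamma(a, b)."""
    s, n = y.sum(), len(y)
    return (a * np.log(b) - gammaln(a) + gammaln(a + s)
            - (a + s) * np.log(b + n) - gammaln(y + 1.0).sum())

rng = np.random.default_rng(7)
y = np.concatenate([rng.poisson(3.0, 60), rng.poisson(8.0, 40)])  # break at 60

# Posterior over the single breakpoint tau (uniform prior over locations).
taus = np.arange(5, len(y) - 5)
logp = np.array([log_marginal(y[:t]) + log_marginal(y[t:]) for t in taus])
post = np.exp(logp - logp.max()); post /= post.sum()
print("MAP breakpoint:", taus[np.argmax(post)])
```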

  13. Gyrokinetic energy conservation and Poisson-bracket formulation

    International Nuclear Information System (INIS)

    Brizard, A.

    1989-01-01

    An integral expression for the gyrokinetic total energy of a magnetized plasma, with general magnetic field configuration perturbed by fully electromagnetic fields, was recently derived through the use of a gyrocenter Lie transformation. It is shown that the gyrokinetic energy is conserved by the gyrokinetic Hamiltonian flow to all orders in perturbed fields. An explicit demonstration that a gyrokinetic Hamiltonian containing quadratic nonlinearities preserves the gyrokinetic energy up to third order is given. The Poisson-bracket formulation greatly facilitates this demonstration with the help of the Jacobi identity and other properties of the Poisson brackets

  14. Hamiltonian field description of the one-dimensional Poisson-Vlasov equations

    International Nuclear Information System (INIS)

    Morrison, P.J.

    1981-07-01

    The one-dimensional Poisson-Vlasov equations are cast into Hamiltonian form. A Poisson Bracket in terms of the phase space density, as sole dynamical variable, is presented. This Poisson bracket is not of the usual form, but possesses the commutator properties of antisymmetry, bilinearity, and nonassociativity by virtue of the Jacobi requirement. Clebsch potentials are seen to yield a conventional (canonical) formulation. This formulation is discretized by expansion in terms of an arbitrary complete set of basis functions. In particular, a wave field representation is obtained

  15. Exact and approximate probabilistic symbolic execution for nondeterministic programs

    DEFF Research Database (Denmark)

    Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also ... Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  16. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)

  17. Poisson-type inequalities for growth properties of positive superharmonic functions.

    Science.gov (United States)

    Luan, Kuan; Vieira, John

    2017-01-01

    In this paper, we present new Poisson-type inequalities for Poisson integrals with continuous data on the boundary. The obtained inequalities are used to obtain growth properties at infinity of positive superharmonic functions in a smooth cone.

  18. Soft network materials with isotropic negative Poisson's ratios over large strains.

    Science.gov (United States)

    Liu, Jianxing; Zhang, Yihui

    2018-01-31

    Auxetic materials with negative Poisson's ratios have important applications across a broad range of engineering areas, such as biomedical devices, aerospace engineering and automotive engineering. A variety of design strategies have been developed to achieve artificial auxetic materials with controllable responses in the Poisson's ratio. The development of designs that can offer isotropic negative Poisson's ratios over large strains can open up new opportunities in emerging biomedical applications, which, however, remains a challenge. Here, we introduce deterministic routes to soft architected materials that can be tailored precisely to yield the values of Poisson's ratio in the range from -1 to 1, in an isotropic manner, with a tunable strain range from 0% to ∼90%. The designs rely on a network construction in a periodic lattice topology, which incorporates zigzag microstructures as building blocks to connect lattice nodes. Combined experimental and theoretical studies on broad classes of network topologies illustrate the wide-ranging utility of these concepts. Quantitative mechanics modeling under both infinitesimal and finite deformations allows the development of a rigorous design algorithm that determines the necessary network geometries to yield target Poisson ratios over desired strain ranges. Demonstrative examples in artificial skin with both the negative Poisson's ratio and the nonlinear stress-strain curve precisely matching those of the cat's skin and in unusual cylindrical structures with engineered Poisson effect and shape memory effect suggest potential applications of these network materials.

  19. Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety

    Science.gov (United States)

    Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent

    2016-04-01

    The purpose of this study is to lay the basis for a probabilistic evaluation of flood hazard (PFHA). Probabilistic assessment of external floods has become a current topic of interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability to each of them. These approaches aim to identify all possible failure scenarios, combining their probability, in order to cover all possible sources of risk. They are based on the distributions of initiators and/or the variables characterizing these initiators. The PFHA can characterize the water level, for example, at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of phenomena relevant to the nuclear site, ii) identification and probabilization of parameters associated with the selected flooding phenomena, iii) spreading of the probabilized parameters from the source to the point of interest in the site, and iv) obtaining hazard curves and aggregation of flooding phenomena contributions at the point of interest taking into account the uncertainties. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Following the same logic, our study is still in progress to take into account other flooding phenomena and to carry out more case studies.

  20. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method. It has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment - PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the systems and constructions of the plant. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the above two methods are reviewed and summarized in brief. Based on the discussion of two application cases - one is the changes to specific design provisions of the general design criteria (GDC) and the other is the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic method and probabilistic method are dialectical and unified, that they are gradually being merged into each other, and that they are being used in coordination. (authors)

  1. A Time-Varied Probabilistic ON/OFF Switching Algorithm for Cellular Networks

    KAUST Repository

    Rached, Nadhir B.; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2018-01-01

    In this letter, we develop a time-varied probabilistic on/off switching planning method for cellular networks to reduce their energy consumption. It consists in a risk-aware optimization approach that takes into consideration the randomness of the user profile associated with each base station (BS). The proposed approach jointly determines (i) the instants of time at which the current active BS configuration must be updated due to an increase or decrease of the network traffic load, and (ii) the set of minimum BSs to be activated to serve the networks’ subscribers. Probabilistic metrics modeling the traffic profile variation are developed to trigger this dynamic on/off switching operation. Selected simulation results are then performed to validate the proposed algorithm for different system parameters.

  2. A Time-Varied Probabilistic ON/OFF Switching Algorithm for Cellular Networks

    KAUST Repository

    Rached, Nadhir B.

    2018-01-11

    In this letter, we develop a time-varied probabilistic on/off switching planning method for cellular networks to reduce their energy consumption. It consists in a risk-aware optimization approach that takes into consideration the randomness of the user profile associated with each base station (BS). The proposed approach jointly determines (i) the instants of time at which the current active BS configuration must be updated due to an increase or decrease of the network traffic load, and (ii) the set of minimum BSs to be activated to serve the networks’ subscribers. Probabilistic metrics modeling the traffic profile variation are developed to trigger this dynamic on/off switching operation. Selected simulation results are then performed to validate the proposed algorithm for different system parameters.

  3. Symplectic and Poisson Geometry in Interaction with Analysis, Algebra and Topology & Symplectic Geometry, Noncommutative Geometry and Physics

    CERN Document Server

    Eliashberg, Yakov; Maeda, Yoshiaki; Symplectic, Poisson, and Noncommutative geometry

    2014-01-01

    Symplectic geometry originated in physics, but it has flourished as an independent subject in mathematics, together with its offspring, symplectic topology. Symplectic methods have even been applied back to mathematical physics. Noncommutative geometry has developed an alternative mathematical quantization scheme based on a geometric approach to operator algebras. Deformation quantization, a blend of symplectic methods and noncommutative geometry, approaches quantum mechanics from a more algebraic viewpoint, as it addresses quantization as a deformation of Poisson structures. This volume contains seven chapters based on lectures given by invited speakers at two May 2010 workshops held at the Mathematical Sciences Research Institute: Symplectic and Poisson Geometry in Interaction with Analysis, Algebra and Topology (honoring Alan Weinstein, one of the key figures in the field) and Symplectic Geometry, Noncommutative Geometry and Physics. The chapters include presentations of previously unpublished results and ...

  4. Comparison of three-dimensional poisson solution methods for particle-based simulation and inhomogeneous dielectrics.

    Science.gov (United States)

    Berti, Claudio; Gillespie, Dirk; Bardhan, Jaydeep P; Eisenberg, Robert S; Fiegna, Claudio

    2012-07-01

    Particle-based simulation represents a powerful approach to modeling physical systems in electronics, molecular biology, and chemical physics. Accounting for the interactions occurring among charged particles requires an accurate and efficient solution of Poisson's equation. For a system of discrete charges with inhomogeneous dielectrics, i.e., a system with discontinuities in the permittivity, the boundary element method (BEM) is frequently adopted. It provides the solution of Poisson's equation, accounting for polarization effects due to the discontinuity in the permittivity by computing the induced charges at the dielectric boundaries. In this framework, the total electrostatic potential is then found by superimposing the elemental contributions from both source and induced charges. In this paper, we present a comparison between two BEMs to solve a boundary-integral formulation of Poisson's equation, with emphasis on the BEMs' suitability for particle-based simulations in terms of solution accuracy and computation speed. The two approaches are the collocation and qualocation methods. Collocation is implemented following the induced-charge computation method of D. Boda et al. [J. Chem. Phys. 125, 034901 (2006)]. The qualocation method is described by J. Tausch et al. [IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 20, 1398 (2001)]. These approaches are studied using both flat and curved surface elements to discretize the dielectric boundary, using two challenging test cases: a dielectric sphere embedded in a different dielectric medium and a toy model of an ion channel. Earlier comparisons of the two BEM approaches did not address curved surface elements or semiatomistic models of ion channels. Our results support the earlier findings that for flat-element calculations, qualocation is always significantly more accurate than collocation. On the other hand, when the dielectric boundary is discretized with curved surface elements, the

  5. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  6. Optimized thick-wall cylinders by virtue of Poisson's ratio selection

    International Nuclear Information System (INIS)

    Whitty, J.P.M.; Henderson, B.; Francis, J.; Lloyd, N.

    2011-01-01

    The principal stress distributions in thick-wall cylinders due to variation in the Poisson's ratio are predicted using analytical and finite element methods. Analyses of appropriate brittle and ductile failure criteria show that, under the isochoric pressure conditions investigated, auxetic (i.e. those possessing a negative Poisson's ratio) materials act as stress concentrators; hence they are predicted to fail before their conventional (i.e. possessing a positive Poisson's ratio) material counterparts. The key finding of the work presented shows that for constrained thick-wall cylinders the maximum tensile principal stress can vanish at a particular Poisson's ratio and aspect ratio. This phenomenon is exploited in order to present an optimized design criterion for thick-wall cylinders. Moreover, via the use of a cogent finite element model, this criterion is also shown to be applicable for the design of micro-porous materials.
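
    The dependence of the principal stresses on Poisson's ratio can be seen in the textbook constrained (plane-strain) case, where the Lame in-plane stresses are independent of ν but the axial stress is σ_z = ν(σ_r + σ_θ). The sketch below uses only that standard solution; the paper's isochoric loading conditions and optimization are not reproduced here:

```python
import numpy as np

def principal_stresses(r, a, b, p_in, nu):
    """Lame solution for an internally pressurized thick-wall cylinder,
    with the axial stress from the plane-strain (fully constrained ends)
    condition sigma_z = nu * (sigma_r + sigma_theta)."""
    A = p_in * a**2 / (b**2 - a**2)
    sr = A * (1.0 - b**2 / r**2)     # radial stress
    st = A * (1.0 + b**2 / r**2)     # hoop (tangential) stress
    sz = nu * (sr + st)              # axial stress, plane strain
    return sr, st, sz

# Stresses at the inner wall (a = 1, b = 2, p_in = 100 MPa) vs. Poisson's ratio.
for nu in (-0.9, -0.3, 0.0, 0.3):
    sr, st, sz = principal_stresses(np.array([1.0]), 1.0, 2.0, 100.0, nu)
    print(f"nu = {nu:+.1f}: sigma_r = {sr[0]:7.1f}, "
          f"sigma_theta = {st[0]:6.1f}, sigma_z = {sz[0]:6.1f} MPa")
```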

  7. The Lie-Poisson structure of integrable classical non-linear sigma models

    International Nuclear Information System (INIS)

    Bordemann, M.; Forger, M.; Schaeper, U.; Laartz, J.

    1993-01-01

    The canonical structure of classical non-linear sigma models on Riemannian symmetric spaces, which constitute the most general class of classical non-linear sigma models known to be integrable, is shown to be governed by a fundamental Poisson bracket relation that fits into the r-s-matrix formalism for non-ultralocal integrable models first discussed by Maillet. The matrices r and s are computed explicitly and, being field dependent, satisfy fundamental Poisson bracket relations of their own, which can be expressed in terms of a new numerical matrix c. It is proposed that all these Poisson brackets taken together are representation conditions for a new kind of algebra which, for this class of models, replaces the classical Yang-Baxter algebra governing the canonical structure of ultralocal models. The Poisson brackets for the transition matrices are also computed, and the notorious regularization problem associated with the definition of the Poisson brackets for the monodromy matrices is discussed. (orig.)

  8. Zero-inflated Conway-Maxwell Poisson Distribution to Analyze Discrete Data.

    Science.gov (United States)

    Sim, Shin Zhu; Gupta, Ramesh C; Ong, Seng Huat

    2018-01-09

    In this paper, we study the zero-inflated Conway-Maxwell Poisson (ZICMP) distribution and develop a regression model. Score and likelihood ratio tests are also implemented for testing the inflation/deflation parameter. Simulation studies are carried out to examine the performance of these tests. A data example is presented to illustrate the concepts. In this example, the proposed model is compared to the well-known zero-inflated Poisson (ZIP) and zero-inflated generalized Poisson (ZIGP) regression models. It is shown that the ZICMP fit is comparable to, or better than, that of these models.
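
    For reference, the ZICMP probability mass function mixes a point mass pi at zero with a Conway-Maxwell-Poisson(lambda, nu) component whose normalizing constant Z(lambda, nu) = sum_j lambda^j/(j!)^nu must be truncated numerically. A minimal sketch, not the authors' implementation (the truncation limit jmax is an assumption):

      import numpy as np
      from scipy.special import gammaln

      def zicmp_pmf(k, lam, nu, pi, jmax=200):
          """Zero-inflated Conway-Maxwell-Poisson pmf:
          P(X=0) = pi + (1-pi)/Z,  P(X=k) = (1-pi) lam^k / ((k!)^nu Z)."""
          j = np.arange(jmax)
          logw = j * np.log(lam) - nu * gammaln(j + 1)  # log CMP weights
          logZ = np.logaddexp.reduce(logw)              # stable normalizer
          k = np.atleast_1d(k)
          p = (1.0 - pi) * np.exp(k * np.log(lam) - nu * gammaln(k + 1) - logZ)
          return np.where(k == 0, pi + p, p)

      # nu < 1 gives over-dispersion relative to Poisson; pi inflates zeros.
      pmf = zicmp_pmf(np.arange(8), lam=2.0, nu=0.8, pi=0.15)
      print(pmf, pmf.sum())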

  9. Quantized Algebras of Functions on Homogeneous Spaces with Poisson Stabilizers

    Science.gov (United States)

    Neshveyev, Sergey; Tuset, Lars

    2012-05-01

    Let G be a simply connected semisimple compact Lie group with standard Poisson structure, K a closed Poisson-Lie subgroup, and 0 < q < 1. We describe the topology on the spectrum of C(G_q/K_q). Next we show that the family of C*-algebras C(G_q/K_q), 0 < q ≤ 1, has a canonical structure of a continuous field of C*-algebras and provides a strict deformation quantization of the Poisson algebra ℂ[G/K]. Finally, extending a result of Nagy, we show that C(G_q/K_q) is canonically KK-equivalent to C(G/K).

  10. Poisson-Lie T-duality open strings and D-branes

    CERN Document Server

    Klimcik, C.

    1996-01-01

    Global issues of the Poisson-Lie T-duality are addressed. It is shown that oriented open strings propagating on a group manifold G are dual to D-brane - anti-D-brane pairs propagating on the dual group manifold G̃. The D-branes coincide with the symplectic leaves of the standard Poisson structure induced on the dual group G̃ by the dressing action of the group G. T-duality maps the momentum of the open string into the mutual distance of the D-branes in the pair. The whole picture is then extended to the full modular space M(D) of the Poisson-Lie equivalent σ-models, which is the space of all Manin triples of a given Drinfeld double. T-duality rotates the zero modes of pairs of D-branes living on targets belonging to M(D). In this more general case the D-branes are preimages of symplectic leaves in certain Poisson homogeneous spaces of their targets and, as such, they are either all even or all odd dimensional.

  11. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, and thus not an objective measure of risk; it is important to carefully define the precise nature of the probabilistic statement and relate it to a well-defined situation; standardisation of PFM methods is necessary; PFM seems to be the only way to obtain estimates of the pipe break probability; service statistics are of doubtful value because of the scarcity of data and statistical inhomogeneity; and collection of service data should be directed towards the occurrence of growing cracks.
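
    The pipe break estimate argued for above is typically obtained in PFM by Monte Carlo sampling over uncertain crack sizes and material properties. The following is a purely illustrative sketch; every distribution and parameter value in it is invented for the example, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000                                       # Monte Carlo samples

      # Hypothetical random inputs (placeholder distributions, not the paper's).
      a0 = rng.lognormal(np.log(1.0), 0.5, n) / 1000.0  # initial crack depth [m]
      C = rng.lognormal(np.log(5e-12), 0.3, n)          # Paris coefficient
      Kic = rng.normal(120.0, 15.0, n)                  # fracture toughness
      ds, Y = 200.0, 1.12                               # stress range, geometry factor

      # Critical size where K = Y*ds*sqrt(pi*a) reaches the toughness.
      a_crit = (Kic / (Y * ds))**2 / np.pi

      # For Paris exponent m = 3 the law da/dN = C*(dK)^m integrates in
      # closed form, giving the cycles to failure from a0 to a_crit.
      Nf = 2.0 * (a0**-0.5 - a_crit**-0.5) / (C * (Y * ds * np.sqrt(np.pi))**3)

      design_life = 2.0e5                               # cycles (assumed)
      print("estimated failure probability:", np.mean(Nf < design_life))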

  12. Corroborating a new probabilistic seismic hazard assessment for greater Tokyo from historical intensity observations

    Science.gov (United States)

    Bozkurt, S.; Stein, R.; Toda, S.

    2006-12-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past observations of shaking. For this we analyzed 10,000 intensity observations recorded during AD 1600-2000 in a 350 x 350 km area centered on Tokyo in a Geographic Information System. A frequency-intensity curve is found for each 5 x 5 km cell, and from this the probability of exceeding any intensity level can be estimated. The principal benefits of this approach are that it builds the fewest possible assumptions into a probabilistic seismic forecast, and that it includes site and source effects without imposing this behavior; we do not need to know the size or location of any earthquake or the location and slip rate of any fault. The cost is that we must abandon any attempt to make a time-dependent forecast, which could be quite different. We believe the method is suitable for many applications of probabilistic seismic hazard assessment, and for other regions. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conduct suggest that both assumptions are sound. The resulting 30-year probability of IJMA>=6 shaking (roughly equivalent to PGA>=0.9 g or MMI=IX-X) is 30-40% in Tokyo, Kawasaki, and Yokohama, and 10-15% in Chiba and Tsukuba, the range reflecting spatial variability and curve-fitting alternatives. The strongest shaking is forecast along the margins of Tokyo Bay, within the river sediments extending northwest from Tokyo, and at coastal sites near the plate boundary faults. We also produce long-term exceedance maps of peak ground acceleration for building code regulations, and short-term hazard maps associated with hypothetical catastrophe bonds. Our results for greater Tokyo resemble our independent Poisson probability developed from conventional seismic hazard analysis, as well as
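
    Under the paper's two key assumptions, the per-cell arithmetic reduces to fitting a log-linear frequency-intensity relation with a region-wide slope and converting the annual exceedance rate into a Poisson probability. A minimal sketch of that calculation follows; the slope, counts, and record length below are placeholders, not the study's values.

      import numpy as np

      # Hypothetical intensity observations in one 5 x 5 km cell over a
      # 400-year record (IJMA intensities 3..6 and their counts).
      intensity = np.array([3.0, 4.0, 5.0, 6.0])
      counts = np.array([40, 12, 4, 1])
      years = 400.0

      # Annual rate of shaking at or above each intensity level.
      rate = counts[::-1].cumsum()[::-1] / years

      # Fit log10(rate) = a - b*I with a fixed, region-wide slope b (assumed).
      b = 0.55
      a = np.mean(np.log10(rate) + b * intensity)

      # Extrapolate to IJMA >= 6 and convert to a 30-year Poisson probability.
      rate6 = 10.0 ** (a - b * 6.0)
      print(f"30-year P(IJMA>=6) = {1.0 - np.exp(-rate6 * 30.0):.1%}")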

  13. Modelling probabilistic fatigue crack propagation rates for a mild structural steel

    Directory of Open Access Journals (Sweden)

    J.A.F.O. Correia

    2015-01-01

    A class of fatigue crack growth models based on elastic-plastic stress-strain histories at the crack tip region and local strain-life damage models has been proposed in the literature. The fatigue crack growth is regarded as a process of continuous crack initiations over successive elementary material blocks, which may be governed by smooth strain-life damage data. Some approaches account for the residual stresses developing at the crack tip in the actual crack driving force assessment, allowing mean stress and loading sequence effects to be modelled. An extension of the fatigue crack propagation model originally proposed by Noroozi et al. (2005) to derive probabilistic fatigue crack propagation data is proposed, in particular concerning the derivation of probabilistic da/dN-ΔK-R fields. The elastic-plastic stresses at the vicinity of the crack tip, computed using simplified formulae, are compared with the stresses computed using elastic-plastic finite element analyses for the specimens of the experimental programme used to derive the fatigue crack propagation data. Using probabilistic strain-life data available for the S355 structural mild steel, probabilistic crack propagation fields are generated for several stress ratios and compared with experimental fatigue crack propagation data. A satisfactory agreement between the predicted probabilistic fields and the experimental data is observed.
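
    A probabilistic da/dN-ΔK field of the kind described can be visualized by propagating scatter in the crack-growth constants into percentile curves. The schematic sketch below puts lognormal scatter on the Paris coefficient purely for illustration; the paper derives its fields from probabilistic strain-life data instead.

      import numpy as np

      rng = np.random.default_rng(0)

      # Schematic probabilistic Paris law: da/dN = C * (dK)^m with random C.
      m = 3.0
      C = rng.lognormal(np.log(1e-11), 0.35, 20_000)

      dK = np.linspace(10.0, 60.0, 6)          # stress intensity range grid
      dadn = C[:, None] * dK[None, :]**m       # samples x dK grid

      # Percentile curves delimit the probabilistic da/dN - dK field.
      for p in (5, 50, 95):
          row = np.percentile(dadn, p, axis=0)
          print(f"P{p:02d} da/dN:", ["%.2e" % v for v in row])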

  14. Alternative Forms of Compound Fractional Poisson Processes

    Directory of Open Access Journals (Sweden)

    Luisa Beghin

    2012-01-01

    We study here different fractional versions of the compound Poisson process. The fractionality is introduced in the counting process representing the number of jumps as well as in the density of the jumps themselves. The corresponding distributions are obtained explicitly and proved to be solutions of fractional equations of order less than one. Only in the final case treated in this paper, where the number of jumps is given by the fractional-difference Poisson process defined in Orsingher and Polito (2012), do we have a fractional driving equation, with respect to the time argument, of order greater than one. Moreover, in this case, the compound Poisson process is Markovian, and this is also true for the corresponding limiting process. All the processes considered here are proved to be compositions of continuous-time random walks with stable processes (or inverse stable subordinators). These subordinating relationships hold not only in the limit but also in the finite domain. In some cases the densities satisfy master equations which are the fractional analogues of the well-known Kolmogorov equations.
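
    The time-fractional Poisson process of order beta can be simulated path by path from its i.i.d. Mittag-Leffler waiting times. The sketch below assumes the standard inversion formula for Mittag-Leffler random variables (Kozubowski-type, as used in Monte Carlo studies of continuous-time random walks; it reduces to exponential waits at beta = 1), so the formula itself should be checked against the literature before serious use.

      import numpy as np

      rng = np.random.default_rng(42)

      def mittag_leffler_wait(beta):
          """One Mittag-Leffler waiting time via the inversion formula;
          reduces to a standard exponential draw when beta == 1."""
          u, v = rng.random(), rng.random()
          factor = (np.sin(beta * np.pi) / np.tan(beta * np.pi * v)
                    - np.cos(beta * np.pi)) ** (1.0 / beta)
          return -np.log(u) * factor

      def fractional_poisson_count(beta, t_max):
          """Number of jumps of the fractional Poisson process up to t_max."""
          t, n = mittag_leffler_wait(beta), 0
          while t <= t_max:
              n += 1
              t += mittag_leffler_wait(beta)
          return n

      # Heavy-tailed waits (beta < 1) inflate the variance relative to Poisson.
      for beta in (1.0, 0.8):
          c = np.array([fractional_poisson_count(beta, 10.0) for _ in range(2000)])
          print(f"beta={beta}: mean {c.mean():.2f}, var {c.var():.2f}")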

  15. Exterior differentials in superspace and Poisson brackets

    International Nuclear Information System (INIS)

    Soroka, Dmitrij V.; Soroka, Vyacheslav A.

    2003-01-01

    It is shown that two definitions for an exterior differential in superspace, giving the same exterior calculus, nevertheless lead to different results when applied to the Poisson bracket. A prescription is introduced for the transition, with the help of these exterior differentials, from a given Poisson bracket of definite Grassmann parity to another bracket. It is also indicated that the resulting bracket leads to a generalization of the Schouten-Nijenhuis bracket to the cases of superspace and brackets of diverse Grassmann parities. It is shown that in the case of the Grassmann-odd exterior differential the resulting bracket is the bracket given on exterior forms. The above-mentioned transition with the use of the odd exterior differential applied to the linear even/odd Poisson brackets, which correspond to semi-simple Lie groups, results, respectively, in linear odd/even brackets which are naturally connected with the Lie superalgebra. The latter contains the BRST and anti-BRST charges and can be used for calculation of the BRST operator cohomology. (author)

  16. Quantization with maximally degenerate Poisson brackets: the harmonic oscillator!

    International Nuclear Information System (INIS)

    Nutku, Yavuz

    2003-01-01

    Nambu's construction of multi-linear brackets for super-integrable systems can be thought of as degenerate Poisson brackets with a maximal set of Casimirs in their kernel. By introducing privileged coordinates in phase space these degenerate Poisson brackets are brought to the form of Heisenberg's equations. We propose a definition for constructing quantum operators for classical functions, which enables us to turn the maximally degenerate Poisson brackets into operators. They pose a set of eigenvalue problems for a new state vector. The requirement of the single-valuedness of this eigenfunction leads to quantization. The example of the harmonic oscillator is used to illustrate this general procedure for quantizing a class of maximally super-integrable systems

  17. A modified Poisson-Boltzmann equation applied to protein adsorption.

    Science.gov (United States)

    Gama, Marlon de Souza; Santos, Mirella Simões; Lima, Eduardo Rocha de Almeida; Tavares, Frederico Wanderley; Barreto, Amaro Gomes Barreto

    2018-01-05

    Ion-exchange chromatography has been widely used as a standard process in the purification and analysis of proteins, based on the electrostatic interaction between the protein and the stationary phase. Over the years, several approaches have been used to improve the thermodynamic description of colloidal particle-surface interaction systems; however, there remain gaps, specifically in describing the behavior of protein adsorption. Here, we present an improved methodology for predicting the adsorption equilibrium constant by solving the modified Poisson-Boltzmann (PB) equation in bispherical coordinates. By including dispersion interactions between ions and protein, and between ions and surface, the modified PB equation can describe the Hofmeister effects. We solve the modified Poisson-Boltzmann equation to calculate the protein-surface potential of mean force, treating the pair as a spherical colloid-plate system, as a function of process variables. From the potential of mean force, the Henry constants of adsorption, for different proteins and surfaces, are calculated as a function of pH, salt concentration, salt type, and temperature. The obtained Henry constants are compared with experimental data for several isotherms, showing excellent agreement. We have also performed a sensitivity analysis to verify the behavior of different kinds of salts and the Hofmeister effects.
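
    Before dispersion corrections are added, the baseline is the classical PB problem. A minimal Newton solver for the dimensionless 1D Poisson-Boltzmann equation psi'' = sinh(psi) near a charged flat plate is sketched below; this is the textbook building block, not the authors' bispherical-coordinate model, and the grid size and surface potential are placeholders.

      import numpy as np

      # Dimensionless 1D PB equation psi'' = sinh(psi): psi(0) = psi0 at the
      # plate, psi -> 0 far away; x is measured in Debye lengths.
      psi0, L, n = 4.0, 10.0, 400
      x = np.linspace(0.0, L, n)
      h = x[1] - x[0]
      psi = psi0 * np.exp(-x)        # Debye-Hueckel (linearized) initial guess

      for _ in range(50):            # Newton iteration on the residual F(psi)=0
          F = np.empty(n)
          F[0] = psi[0] - psi0                   # plate boundary condition
          F[-1] = psi[-1]                        # far-field boundary condition
          F[1:-1] = (psi[2:] - 2*psi[1:-1] + psi[:-2]) / h**2 - np.sinh(psi[1:-1])
          J = np.zeros((n, n))
          J[0, 0] = J[-1, -1] = 1.0
          for i in range(1, n - 1):
              J[i, i - 1] = J[i, i + 1] = 1.0 / h**2
              J[i, i] = -2.0 / h**2 - np.cosh(psi[i])
          step = np.linalg.solve(J, -F)
          psi += step
          if np.abs(step).max() < 1e-10:
              break

      # Far from the plate the nonlinear solution decays like exp(-x).
      print(psi[::80])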

  18. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on the Monte-Carlo probabilistic design method. A novel probabilistic model of the turboshaft engine, consisting of a Monte-Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking the uncertainties in component performance into consideration, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with those of the deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.
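
    The skeleton of such a probabilistic model is simple even though the engine model is not: sample the component uncertainties, run the deterministic engine model, and collect statistics on the outputs. The sketch below uses an invented algebraic surrogate in place of the full nonlinear turboshaft model; all distributions and numbers are placeholders.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 50_000                                 # Monte Carlo runs

      # Hypothetical component-performance uncertainties.
      eta_comp = rng.normal(0.85, 0.010, n)      # compressor efficiency
      eta_turb = rng.normal(0.88, 0.010, n)      # turbine efficiency
      sigma_cc = rng.normal(0.95, 0.005, n)      # combustor pressure recovery

      def shaft_power(ec, et, pr):
          """Toy surrogate standing in for the nonlinear turboshaft model."""
          return 1000.0 * ec * et * pr           # kW, illustrative scaling only

      P = shaft_power(eta_comp, eta_turb, sigma_cc)
      print(f"mean {P.mean():.1f} kW, std {P.std():.1f} kW, "
            f"1st percentile {np.percentile(P, 1):.1f} kW")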

  19. Economic order quantity (EOQ) by game theory approach in probabilistic supply chain system under service level constraint for items with imperfect quality

    Science.gov (United States)

    Setiawan, R.

    2018-03-01

    In this paper, the Economic Order Quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality is analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from the game theory approach: Stackelberg equilibrium for the cooperative strategy and Stackelberg equilibrium for the non-cooperative strategy. This is a new application of game theory to inventory systems in which a service level constraint is applied by the firm in its moves.
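
    The deterministic core of such models is the classic EOQ formula Q* = sqrt(2DK/h); the probabilistic, service-level-constrained variant adds a safety stock driven by lead-time demand variability. A toy sketch of that calculation follows; the normal-demand assumption and every parameter value are illustrative, not from the paper.

      import math
      from statistics import NormalDist

      # Classic EOQ plus a cycle-service-level safety stock.
      D = 12_000                  # annual demand [units/yr]
      K = 150.0                   # fixed ordering cost per order
      h = 2.5                     # holding cost [per unit per yr]
      mu_L, sd_L = 400.0, 80.0    # lead-time demand: mean, std (assumed normal)
      alpha = 0.95                # required service level

      Q = math.sqrt(2.0 * D * K / h)            # economic order quantity
      z = NormalDist().inv_cdf(alpha)           # safety factor for the constraint
      reorder_point = mu_L + z * sd_L
      print(f"EOQ = {Q:.0f} units, reorder point = {reorder_point:.0f} units")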

  20. Numerical methods for realizing nonstationary Poisson processes with piecewise-constant instantaneous-rate functions

    DEFF Research Database (Denmark)

    Harrod, Steven; Kelton, W. David

    2006-01-01

    Nonstationary Poisson processes are appropriate in many applications, including disease studies, transportation, finance, and social policy. The authors review the risks of ignoring nonstationarity in Poisson processes and demonstrate three algorithms for generation of Poisson processes...
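
    For piecewise-constant rate functions the simplest such generator is thinning (Lewis-Shedler): draw candidate arrivals from a homogeneous process at the maximum rate and accept each with probability lambda(t)/lambda_max. A sketch of that algorithm, not the authors' specific implementations:

      import numpy as np

      rng = np.random.default_rng(3)

      def thin_nspp(breakpoints, rates, t_end):
          """Nonstationary Poisson arrivals by thinning; rates[i] applies on
          [breakpoints[i], breakpoints[i+1]) and lam_max bounds them all."""
          lam_max = max(rates)
          t, arrivals = 0.0, []
          while True:
              t += rng.exponential(1.0 / lam_max)   # candidate from HPP(lam_max)
              if t >= t_end:
                  return np.array(arrivals)
              lam_t = rates[np.searchsorted(breakpoints, t, side='right') - 1]
              if rng.random() < lam_t / lam_max:    # accept w.p. lambda(t)/lam_max
                  arrivals.append(t)

      # Rate 2/hr on [0,8), 10/hr on [8,12), 4/hr on [12,24).
      times = thin_nspp([0.0, 8.0, 12.0], [2.0, 10.0, 4.0], t_end=24.0)
      print(len(times), "arrivals; expected about", 2*8 + 10*4 + 4*12)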