POISSON, Analysis Solution of Poisson Problems in Probabilistic Risk Assessment
International Nuclear Information System (INIS)
Froehner, F.H.
1986-01-01
1 - Description of program or function: Purpose of program: Analytic treatment of two-stage Poisson problem in Probabilistic Risk Assessment. Input: estimated a-priori mean failure rate and error factor of system considered (for calculation of stage-1 prior), number of failures and operating times for similar systems (for calculation of stage-2 prior). Output: a-posteriori probability distributions on linear and logarithmic time scale (on specified time grid) and expectation values of failure rate and error factors are calculated for: - stage-1 a-priori distribution, - stage-1 a-posteriori distribution, - stage-2 a-priori distribution, - stage-2 a-posteriori distribution. 2 - Method of solution: Bayesian approach with conjugate stage-1 prior, improved with experience from similar systems to yield stage-2 prior, and likelihood function from experience with system under study (documentation see below under 10.). 3 - Restrictions on the complexity of the problem: Up to 100 similar systems (including the system considered), arbitrary number of problems (failure types) with same grid
Poisson-Like Spiking in Circuits with Probabilistic Synapses
Moreno-Bote, Rubén
2014-01-01
Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
Probabilistic approach to mechanisms
Sandler, BZ
1984-01-01
This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.
Probabilistic approaches to recommendations
Barbieri, Nicola; Ritacco, Ettore
2014-01-01
The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus
Probabilistic approach to EMP assessment
International Nuclear Information System (INIS)
Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.
1980-09-01
The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program
Analytic Bayesian solution of the two-stage poisson-type problem in probabilistic risk analysis
International Nuclear Information System (INIS)
Frohner, F.H.
1985-01-01
The basic purpose of probabilistic risk analysis is to make inferences about the probabilities of various postulated events, with an account of all relevant information such as prior knowledge and operating experience with the specific system under study, as well as experience with other similar systems. Estimation of the failure rate of a Poisson-type system leads to an especially simple Bayesian solution in closed form if the prior probabilty implied by the invariance properties of the problem is properly taken into account. This basic simplicity persists if a more realistic prior, representing order of magnitude knowledge of the rate parameter, is employed instead. Moreover, the more realistic prior allows direct incorporation of experience gained from other similar systems, without need to postulate a statistical model for an underlying ensemble. The analytic formalism is applied to actual nuclear reactor data
Dependent Neyman type A processes based on common shock Poisson approach
Kadilar, Gamze Özel; Kadilar, Cem
2016-04-01
The Neyman type A process is used for describing clustered data since the Poisson process is insufficient for clustering of events. In a multivariate setting, there may be dependencies between multivarite Neyman type A processes. In this study, dependent form of the Neyman type A process is considered under common shock approach. Then, the joint probability function are derived for the dependent Neyman type A Poisson processes. Then, an application based on forest fires in Turkey are given. The results show that the joint probability function of the dependent Neyman type A processes, which is obtained in this study, can be a good tool for the probabilistic fitness for the total number of burned trees in Turkey.
Probabilistic Approaches to Video Retrieval
Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.
Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models to video retrieval��?. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples��? results when
Probabilistic approach to earthquake prediction.
Directory of Open Access Journals (Sweden)
G. D'Addezio
2002-06-01
Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so as it can be objectively recognised in any circumstance and by any observer.A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated to a precursor is also a popular way to estimate the quality of the predictions based on such precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihood obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, that may span time windows extremely variable between a few centuries to a few millennia, we have a low probability to catch more than one or two events on the same fault. Extending the record of earthquakes of the past back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseimologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
Heil, Peter; Matysiak, Artur; Neubauer, Heinrich
2017-09-01
Thresholds for detecting sounds in quiet decrease with increasing sound duration in every species studied. The neural mechanisms underlying this trade-off, often referred to as temporal integration, are not fully understood. Here, we probe the human auditory system with a large set of tone stimuli differing in duration, shape of the temporal amplitude envelope, duration of silent gaps between bursts, and frequency. Duration was varied by varying the plateau duration of plateau-burst (PB) stimuli, the duration of the onsets and offsets of onset-offset (OO) stimuli, and the number of identical bursts of multiple-burst (MB) stimuli. Absolute thresholds for a large number of ears (>230) were measured using a 3-interval-3-alternative forced choice (3I-3AFC) procedure. Thresholds decreased with increasing sound duration in a manner that depended on the temporal envelope. Most commonly, thresholds for MB stimuli were highest followed by thresholds for OO and PB stimuli of corresponding durations. Differences in the thresholds for MB and OO stimuli and in the thresholds for MB and PB stimuli, however, varied widely across ears, were negative in some ears, and were tightly correlated. We show that the variation and correlation of MB-OO and MB-PB threshold differences are linked to threshold microstructure, which affects the relative detectability of the sidebands of the MB stimuli and affects estimates of the bandwidth of auditory filters. We also found that thresholds for MB stimuli increased with increasing duration of the silent gaps between bursts. We propose a new model and show that it accurately accounts for our results and does so considerably better than a leaky-integrator-of-intensity model and a probabilistic model proposed by others. Our model is based on the assumption that sensory events are generated by a Poisson point process with a low rate in the absence of stimulation and higher, time-varying rates in the presence of stimulation. A subject in a 3I-3AFC
Probabilistic approach to manipulator kinematics and dynamics
International Nuclear Information System (INIS)
Rao, S.S.; Bhatti, P.K.
2001-01-01
A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures
PROBABILISTIC APPROACH OF STABILIZED ELECTROMAGNETIC FIELD EFFECTS
Directory of Open Access Journals (Sweden)
FELEA. I.
2017-09-01
Full Text Available The effects of the omnipresence of the electromagnetic field are certain and recognized. Assessing as accurately as possible these effects, which characterize random phenomena require the use of statistical-probabilistic calculation. This paper aims at assessing the probability of exceeding the admissible values of the characteristic sizes of the electromagnetic field - magnetic induction and electric field strength. The first part justifies the need for concern and specifies how to approach it. The mathematical model of approach and treatment is presented in the second part of the paper and the results obtained with reference to 14 power stations are synthesized in the third part. In the last part, are formulated the conclusions of the evaluations.
Probabilistic interpretation of data a physicist's approach
Miller, Guthrie
2013-01-01
This book is a physicists approach to interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).
A probabilistic approach to controlling crevice chemistry
International Nuclear Information System (INIS)
Millett, P.J.; Brobst, G.E.; Riddle, J.
1995-01-01
It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved
Semantics of probabilistic processes an operational approach
Deng, Yuxin
2015-01-01
This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us
Probabilistic approaches for geotechnical site characterization and slope stability analysis
Cao, Zijun; Li, Dianqing
2017-01-01
This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. This book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in practical implementation of these approaches. In addition, this book also develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and alleviate them from reliability computational algorithms. The readers will find useful information for a non-specialist to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.
A sampling-based approach to probabilistic pursuit evasion
Mahadevan, Aditya; Amato, Nancy M.
2012-01-01
Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented
A Markov Chain Approach to Probabilistic Swarm Guidance
Acikmese, Behcet; Bayard, David S.
2012-01-01
This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collabo- ration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, that ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
Overview of the probabilistic risk assessment approach
International Nuclear Information System (INIS)
Reed, J.W.
1985-01-01
The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table
Yelland, Lisa N; Salter, Amy B; Ryan, Philip
2011-10-15
Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
A random probabilistic approach to seismic nuclear power plant analysis
International Nuclear Information System (INIS)
Romo, M.P.
1985-01-01
A probabilistic method for the seismic analysis of structures which takes into account the random nature of earthquakes and of the soil parameter uncertainties is presented in this paper. The method was developed combining elements of the theory of perturbations, the Random vibration theory and the complex response method. The probabilistic method is evaluated by comparing the responses of a single degree of freedom system computed with this approach and the Monte Carlo method. (orig.)
A probabilistic approach to crack instability
Chudnovsky, A.; Kunin, B.
1989-01-01
A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.
Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach
DEFF Research Database (Denmark)
Wan, Can; Lin, Jin; Song, Yonghua
2017-01-01
This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for P...... power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through the numerical studies on PV data from Denmark.......This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV...
Deterministic and probabilistic approach to safety analysis
International Nuclear Information System (INIS)
Heuser, F.W.
1980-01-01
The examples discussed in this paper show that reliability analysis methods fairly well can be applied in order to interpret deterministic safety criteria in quantitative terms. For further improved extension of applied reliability analysis it has turned out that the influence of operational and control systems and of component protection devices should be considered with the aid of reliability analysis methods in detail. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)
Variational approach to probabilistic finite elements
Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.
1991-08-01
Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
The probabilistic approach and the deterministic licensing procedure
International Nuclear Information System (INIS)
Fabian, H.; Feigel, A.; Gremm, O.
1984-01-01
If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like.'' The answer connot be given by a probabilistic procedure, but need definite deterministic statements; the conclusion is, that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent and additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)
Bloch, Isabelle
2010-01-01
The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this books offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory and belief functions).
Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.
Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza
2016-01-01
Recognizing the factors affecting the number of blood donation and blood deferral has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of return for donation increases, so does the number of blood deferral. On the other hand, due to the fact that many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to apply the correlation and to explain the frequency of the excessive zero, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donation and number of blood deferral. The data was analyzed using the Bayesian approach applying noninformative priors at the presence and absence of covariates. Estimating the parameters of the model, that is, correlation, zero-inflation parameter, and regression coefficients, was done through MCMC simulation. Eventually double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted on the data and were compared using the deviance information criteria (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach
International Nuclear Information System (INIS)
Zhu, Yongsheng
2007-01-01
To calculate the upper limit for the Poisson observable at given confidence level with inclusion of systematic uncertainties in background expectation and signal efficiency, formulations have been established along the line of Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation
A sampling-based approach to probabilistic pursuit evasion
Mahadevan, Aditya
2012-05-01
Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.
Sepú lveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G
2013-01-01
Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model.Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates.Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. 2013 Seplveda et al.; licensee BioMed Central Ltd.
Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G
2013-02-26
The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.
Sepúlveda, Nuno
2013-02-26
Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model.Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates.Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. 2013 Seplveda et al.; licensee BioMed Central Ltd.
DEFF Research Database (Denmark)
Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag
2009-01-01
In this article we consider geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregression. In the linear case, the conditional mean is linked linearly to its past values, as well as to the observed values of the Poisson process. This also applies...... to the conditional variance, making possible interpretation as an integer-valued generalized autoregressive conditional heteroscedasticity process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and past observations. As a particular example, we consider...... an exponential autoregressive Poisson model for time series. Under geometric ergodicity, the maximum likelihood estimators are shown to be asymptotically Gaussian in the linear model. In addition, we provide a consistent estimator of their asymptotic covariance matrix. Our approach to verifying geometric...
PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING
Directory of Open Access Journals (Sweden)
Volodymyr Kharchenko
2017-07-01
Full Text Available Purpose: The represented research results are aimed to improve theoretical basics of computer vision and artificial intelligence of dynamical system. Proposed approach of object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: Presented approach is grounded at probabilistic methods, statistical methods of probability density estimation and computer-based simulation at verification stage of development. Results: Proposed approach for object detection and recognition for video stream data processing has shown several advantages in comparison with existing methods due to its simple realization and small time of data processing. Presented results of experimental verification look plausible for object detection and recognition in video stream. Discussion: The approach can be implemented in dynamical system within changeable environment such as remotely piloted aircraft systems and can be a part of artificial intelligence in navigation and control systems.
On Partial Defaults in Portfolio Credit Risk : A Poisson Mixture Model Approach
Weißbach, Rafael; von Lieres und Wilkau, Carsten
2005-01-01
Most credit portfolio models exclusively calculate the loss distribution for a portfolio of performing counterparts. Conservative default definitions cause considerable insecurity about the loss for a long time after the default. We present three approaches to account for defaulted counterparts in the calculation of the economic capital. Two of the approaches are based on the Poisson mixture model CreditRisk+ and derive a loss distribution for an integrated portfolio. The third method treats ...
Application of probabilistic risk based optimization approaches in environmental restoration
International Nuclear Information System (INIS)
Goldammer, W.
1995-01-01
The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to consequences of low probability events such as, earthquakes or major floods. Risks and financial costs are combined to an overall figure of detriment allowing one to distinguish between benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding the reclamation planning using an example from the German reclamation program for uranium mining and milling sites
A probabilistic approach to delineating functional brain regions
DEFF Research Database (Denmark)
Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G
2009-01-01
The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known...... to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10...... healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...
Li, Xian-Ying; Hu, Shi-Min
2013-02-01
Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
Poisson regression approach for modeling fatal injury rates amongst Malaysian workers
International Nuclear Information System (INIS)
Kamarulzaman Ibrahim; Heng Khai Theng
2005-01-01
Many safety studies are based on the analysis carried out on injury surveillance data. The injury surveillance data gathered for the analysis include information on number of employees at risk of injury in each of several strata where the strata are defined in terms of a series of important predictor variables. Further insight into the relationship between fatal injury rates and predictor variables may be obtained by the poisson regression approach. Poisson regression is widely used in analyzing count data. In this study, poisson regression is used to model the relationship between fatal injury rates and predictor variables which are year (1995-2002), gender, recording system and industry type. Data for the analysis were obtained from PERKESO and Jabatan Perangkaan Malaysia. It is found that the assumption that the data follow poisson distribution has been violated. After correction for the problem of over dispersion, the predictor variables that are found to be significant in the model are gender, system of recording, industry type, two interaction effects (interaction between recording system and industry type and between year and industry type). Introduction Regression analysis is one of the most popular
Probabilistic approaches to life prediction of nuclear plant structural components
International Nuclear Information System (INIS)
Villain, B.; Pitner, P.; Procaccia, H.
1996-01-01
In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in a assessment of the performance of these structural components, probabilistic methods. The benefits of a probabilistic approach are the clear treatment of uncertainly and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)
Probabilistic approaches to life prediction of nuclear plant structural components
International Nuclear Information System (INIS)
Villain, B.; Pitner, P.; Procaccia, H.
1996-01-01
In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)
Standardized approach for developing probabilistic exposure factor distributions
Energy Technology Data Exchange (ETDEWEB)
Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.
2003-03-01
The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
A probabilistic approach to Radiological Environmental Impact Assessment
International Nuclear Information System (INIS)
Avila, Rodolfo; Larsson, Carl-Magnus
2001-01-01
Since a radiological environmental impact assessment typically relies on limited data and poorly based extrapolation methods, point estimations, as implied by a deterministic approach, do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks to non-human biota of a radioactive contamination. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined
Probabilistic logics and probabilistic networks
Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill
2014-01-01
Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
A probabilistic approach for representation of interval uncertainty
International Nuclear Information System (INIS)
Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran
2011-01-01
In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
Convex models and probabilistic approach of nonlinear fatigue failure
International Nuclear Information System (INIS)
Qiu Zhiping; Lin Qiang; Wang Xiaojun
2008-01-01
This paper is concerned with the nonlinear fatigue failure problem with uncertainties in the structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra with the help of the thought of interval analysis is applied. In terms of the inclusion monotonic property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of 25-bar truss structures is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach
A probabilistic approach for RIA fuel failure criteria
International Nuclear Information System (INIS)
Carlo Vitanza, Dr.
2008-01-01
Substantial experimental data have been produced in support of the definition of the RIA safety limits for water reactor fuels at high burn up. Based on these data, fuel failure enthalpy limits can be derived based on methods having a varying degree of complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper, where in addition to a best estimate evaluation of the failure enthalpy, a RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non failure data and accounting for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will be sharply increasing within a narrow band around the best estimate value. For deterministic predictions of lower quality, instead, the resulting probability distribution will be broad and coarser
International Nuclear Information System (INIS)
Di Maio, Francesco; Rai, Ajit; Zio, Enrico
2016-01-01
The challenge of Risk-Informed Safety Margin Characterization (RISMC) is to develop a methodology for estimating system safety margins in the presence of stochastic and epistemic uncertainties affecting the system dynamic behavior. This is useful to support decision-making for licensing purposes. In the present work, safety margin uncertainties are handled by Order Statistics (OS) (with both Bracketing and Coverage approaches) to jointly estimate percentiles of the distributions of the safety parameter and of the time required for it to reach these percentiles values during its dynamic evolution. The novelty of the proposed approach consists in the integration of dynamic aspects (i.e., timing of events) into the definition of a dynamic safety margin for a probabilistic Quantification of Margin and Uncertainties (QMU). The system here considered for demonstration purposes is the Lead–Bismuth Eutectic- eXperimental Accelerator Driven System (LBE-XADS). - Highlights: • We integrate dynamic aspects into the definition of a safety margins. • We consider stochastic and epistemic uncertainties affecting the system dynamics. • Uncertainties are handled by Order Statistics (OS). • We estimate the system grace time during accidental scenarios. • We apply the approach to an LBE-XADS accidental scenario.
International Nuclear Information System (INIS)
Togo, Y.; Sato, K.
1981-01-01
The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing are based on the deterministic concept. However, there have been a few examples to which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles. One may find other examples of such applications. However, a much more important role is now to be played by this concept, in implementing the 52 recommendations from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to Japanese situations, a five-year programme plan has been adopted and is to be conducted by the Japan Atomic Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)
An approach for obtaining integrable Hamiltonians from Poisson-commuting polynomial families
Leyvraz, F.
2017-07-01
We discuss a general approach permitting the identification of a broad class of sets of Poisson-commuting Hamiltonians, which are integrable in the sense of Liouville. It is shown that all such Hamiltonians can be solved explicitly by a separation of variables ansatz. The method leads in particular to a proof that the so-called "goldfish" Hamiltonian is maximally superintegrable and leads to an elementary identification of a full set of integrals of motion. The Hamiltonians in involution with the "goldfish" Hamiltonian are also explicitly integrated. New integrable Hamiltonians are identified, among which some have the property of being isochronous, that is, all their orbits have the same period. Finally, a peculiar structure is identified in the Poisson brackets between the elementary symmetric functions and the set of Hamiltonians commuting with the "goldfish" Hamiltonian: these can be expressed as products between elementary symmetric functions and Hamiltonians. The structure displays an invariance property with respect to one element and has both a symmetry and a closure property. The meaning of this structure is not altogether clear to the author, but it turns out to be a powerful tool.
Future trends in flood risk in Indonesia - A probabilistic approach
Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip
2014-05-01
Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US 0.8 bn in 2010 and US 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risk were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to
A Probabilistic, Facility-Centric Approach to Lightning Strike Location
Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.
2012-01-01
A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even with the location error ellipse. This technique is adapted from a method of calculating the probability of debris collisionith spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
A probabilistic approach to the evaluation of the PTS issue
International Nuclear Information System (INIS)
Cheverton, R.D.; Selby, D.L.
1991-01-01
An integrated probabilistic approach for the evaluation of the pressurized-thermal-shock (PTS) issue was developed at the Oak Ridge National Laboratory (ORNL) at the request of the Nuclear Regulatory Commission (NRC). The purpose was to provide a method for identifying dominant plant design and operating features, evaluating possible remedial measures and the validity of the NRC PTS screening criteria, and to provide an additional tool for estimating vessel life expectancy. The approach was to be integrated in the sense that it would include the postulation of transients; estimates of their frequencies of occurrence; systems analyses to obtain the corresponding primary-system pressure, down-comer coolant temperature, and fluid-film heat-transfer coefficient adjacent to the vessel wall; and a probabilistic fracture-mechanics analysis, using the latter data as input. A summation of the products of frequency of transient and conditional probability of failure for all postulated transients provides an estimate of frequency of vessel failure. In the process of developing the integrated pressurized-thermal-shock (IPTS) methodology, three specific plant analyses were conducted. The results indicate that the NRC screening criteria may not be appropriate for all US pressurized water reactor (PWR) plants; that is, for some PWRs, the calculated mean frequency of vessel failure corresponding to the screening criteria may be greater than the maximum permissible value in Regulatory Guide 1.154. A recent view of the ORNL IPTS study, which was completed in 1985, indicates that there are a number of areas in which the methodology can and should be updated, but it is not clear whether the update will increase or decrease the calculated probabilities. 31 refs., 2 tabs
Transmission capacity assessment by probabilistic planning. An approach
International Nuclear Information System (INIS)
Lammintausta, M.
2002-01-01
The Finnish electricity markets participate in the Scandinavian markets, Nord-Pool. The Finnish market is free for marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered to be a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid that has the full responsibility of the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach for transfer limit and risk assessment. The method is based on the economical benefit and risk predictions. It uses also the existing results of deterministic data and it could be used side by side with the deterministic method. The basic concept and necessary equations for expected risks of various market players have been derived for further developments. The outage costs and thereby the risks of the market participants depend on how the system operator reacts to the faults. In the Finnish power system consumers will usually experience no costs due to the faults because of meshed network and counter trade method preferred by the system operator. The costs to the producers and dealers are also low because of the counter trade method. The network company will lose the cost of reparation, additional losses and cost of regulation power because of counter trades. In case power flows will be rearranged drastically because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is that the network operator buys regulation power for short-term problems and reinforces the network in the long-term situations. The reinforcement is done if the network can not be
A Probabilistic Approach for Breast Boundary Extraction in Mammograms
Directory of Open Access Journals (Sweden)
Hamed Habibi Aghdam
2013-01-01
Full Text Available The extraction of the breast boundary is crucial to perform further analysis of mammogram. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is a hard to automatically find the optimal threshold value by using histogram information. On the other hand, active contour models require defining a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method reaches 38% and 50% improvement with respect to the results obtained by the active contour model and threshold-based methods respectively, and it increases the stability of the boundary extraction process up to 86%.
A probabilistic approach for validating protein NMR chemical shift assignments
International Nuclear Information System (INIS)
Wang Bowei; Wang, Yunjun; Wishart, David S.
2010-01-01
It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one, or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments, requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAVhttp://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.
Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.
Kaufman, Leyla V; Wright, Mark G
2017-07-07
The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.
Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions
Directory of Open Access Journals (Sweden)
Leyla V. Kaufman
2017-07-01
Full Text Available The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.
Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca
2017-04-01
The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams with notable implication for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network-scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson Process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of
Xu, Zhenli; Ma, Manman; Liu, Pei
2014-07-01
We propose a modified Poisson-Nernst-Planck (PNP) model to investigate charge transport in electrolytes of inhomogeneous dielectric environment. The model includes the ionic polarization due to the dielectric inhomogeneity and the ion-ion correlation. This is achieved by the self energy of test ions through solving a generalized Debye-Hückel (DH) equation. We develop numerical methods for the system composed of the PNP and DH equations. Particularly, toward the numerical challenge of solving the high-dimensional DH equation, we developed an analytical WKB approximation and a numerical approach based on the selective inversion of sparse matrices. The model and numerical methods are validated by simulating the charge diffusion in electrolytes between two electrodes, for which effects of dielectrics and correlation are investigated by comparing the results with the prediction by the classical PNP theory. We find that, at the length scale of the interface separation comparable to the Bjerrum length, the results of the modified equations are significantly different from the classical PNP predictions mostly due to the dielectric effect. It is also shown that when the ion self energy is in weak or mediate strength, the WKB approximation presents a high accuracy, compared to precise finite-difference results.
National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...
Directory of Open Access Journals (Sweden)
Yuri B. Tebekin
2011-11-01
Full Text Available The article is devoted to the problem of the quality management for multiphase processes on the basis of the probabilistic approach. Method with continuous response functions is offered from the application of the method of Lagrange multipliers.
A Probabilistic Approach for Robustness Evaluation of Timber Structures
DEFF Research Database (Denmark)
Kirkegaard, Poul Henning; Sørensen, John Dalsgaard
of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code the timber structure has to be evaluated with respect to the following criteria where at least one shall...... to criteria a) and b) the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removal three columns one by one no significant extensive failure of the entire structure or significant parts of it are obatined. Therefore the structure can be considered......A probabilistic based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety...
An approximate methods approach to probabilistic structural analysis
Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.
1989-01-01
A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
Hallin, M.; Piegorsch, W.; El Shaarawi, A.
2012-01-01
The random variable X taking values 0,1,2,…,x,… with probabilities pλ(x) = e−λλx/x!, where λ∈R0+ is called a Poisson variable, and its distribution a Poisson distribution, with parameter λ. The Poisson distribution with parameter λ can be obtained as the limit, as n → ∞ and p → 0 in such a way that
A tiered approach for probabilistic ecological risk assessment of contaminated sites
International Nuclear Information System (INIS)
Zolezzi, M.; Nicolella, C.; Tarazona, J.V.
2005-01-01
This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from deterministic comparison (ratio) of single exposure concentration and threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4- TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risks assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms with the deterministic approach, is associated to the presence of hot spots reaching concentrations able to affect acutely more than 50% of the soil species, while the large majority of the area presents 1,2,4- TCB concentrations below those reported as toxic [it
DEFF Research Database (Denmark)
Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag
This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional...... variance, implying an interpretation as an integer valued GARCH process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and a nonlinear function of past observations. As a particular example an exponential autoregressive Poisson model for time...
DEFF Research Database (Denmark)
Fokianos, Konstantinos; Rahbæk, Anders; Tjøstheim, Dag
This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional...... variance, making an interpretation as an integer valued GARCH process possible. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and a nonlinear function of past observations. As a particular example an exponential autoregressive Poisson model...
Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach
International Nuclear Information System (INIS)
Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis
2003-01-01
Probabilistic based fault diagnosis which represent the relationship between cause and consequence of the events for trouble shooting is developed in this research based on Bayesian Networks. Contribution of on-line data comes from sensors and system/component reliability in node cause is expected increasing the belief level of Bayesian Networks. (author)
Identification of probabilistic approaches and map-based navigation ...
Indian Academy of Sciences (India)
B Madhevan
2018-02-07
Feb 7, 2018 ... consists of three processes: map learning (ML), localization and PP [73–76]. (i) ML ...... [83] Thrun S 2001 A probabilistic online mapping algorithm for teams of .... for target tracking using fuzzy logic controller in game theoretic ...
Basic Ideas to Approach Metastability in Probabilistic Cellular Automata
Cirillo, Emilio N. M.; Nardi, Francesca R.; Spitoni, Cristian
2016-01-01
Cellular Automata are discrete--time dynamical systems on a spatially extended discrete space which provide paradigmatic examples of nonlinear phenomena. Their stochastic generalizations, i.e., Probabilistic Cellular Automata, are discrete time Markov chains on lattice with finite single--cell
Boxma, O.J.; Yechiali, U.; Ruggeri, F.; Kenett, R.S.; Faltin, F.W.
2007-01-01
The Poisson process is a stochastic counting process that arises naturally in a large variety of daily life situations. We present a few definitions of the Poisson process and discuss several properties as well as relations to some well-known probability distributions. We further briefly discuss the
CANDU type fuel behavior evaluation - a probabilistic approach
International Nuclear Information System (INIS)
Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.
1995-01-01
In order to realistically assess the behavior of the fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior in steady state conditions. The Response Surface Method (RSM) has been selected for the investigation of the effects of the variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A new developed version of the probabilistic code APMESRA based on RSM is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)
A probabilistic approach of sum rules for heat polynomials
International Nuclear Information System (INIS)
Vignat, C; Lévêque, O
2012-01-01
In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)
Intermediate probabilistic safety assessment approach for safety critical digital systems
International Nuclear Information System (INIS)
Taeyong, Sung; Hyun Gook, Kang
2001-01-01
Even though the conventional probabilistic safety assessment methods are immature for applying to microprocessor-based digital systems, practical needs force to apply it. In the Korea, UCN 5 and 6 units are being constructed and Korean Next Generation Reactor is being designed using the digital instrumentation and control equipment for the safety related functions. Korean regulatory body requires probabilistic safety assessment. This paper analyzes the difficulties on the assessment of digital systems and suggests an intermediate framework for evaluating their safety using fault tree models. The framework deals with several important characteristics of digital systems including software modules and fault-tolerant features. We expect that the analysis result will provide valuable design feedback. (authors)
Systems analysis approach to probabilistic modeling of fault trees
International Nuclear Information System (INIS)
Bartholomew, R.J.; Qualls, C.R.
1985-01-01
A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms including common cause/common mode effects and time dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a state vector set of first order, linear, inhomogeneous, differential equations describing the time dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems
Bayesian probabilistic network approach for managing earthquake risks of cities
DEFF Research Database (Denmark)
Bayraktarli, Yahya; Faber, Michael
2011-01-01
This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...... on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey....
Tools for voltage stability analysis, including a probabilistic approach
Energy Technology Data Exchange (ETDEWEB)
Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)
1994-12-31
This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as, their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.
Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.
Gao, Yi; Bouix, Sylvain
2016-05-01
Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two population of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Jurčo, Branislav, E-mail: jurco@karlin.mff.cuni.cz [Charles University in Prague, Faculty of Mathematics and Physics, Mathematical Institute, Prague 186 75 (Czech Republic); Schupp, Peter, E-mail: p.schupp@jacobs-university.de [Jacobs University Bremen, 28759 Bremen (Germany); Vysoký, Jan, E-mail: vysokjan@fjfi.cvut.cz [Jacobs University Bremen, 28759 Bremen (Germany); Czech Technical University in Prague, Faculty of Nuclear Sciences and Physical Engineering, Prague 115 19 (Czech Republic)
2014-06-02
We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.
International Nuclear Information System (INIS)
Jurčo, Branislav; Schupp, Peter; Vysoký, Jan
2014-01-01
We generalize noncommutative gauge theory using Nambu–Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg–Witten map. We construct a covariant Nambu–Poisson gauge theory action, give its first order expansion in the Nambu–Poisson tensor and relate it to a Nambu–Poisson matrix model.
Implementation of upper limit calculation for a poisson variable by bayesian approach
International Nuclear Information System (INIS)
Zhu Yongsheng
2008-01-01
The calculation of Bayesian confidence upper limit for a Poisson variable including both signal and background with and without systematic uncertainties has been formulated. A Fortran 77 routine, BPULE, has been developed to implement the calculation. The routine can account for systematic uncertainties in the background expectation and signal efficiency. The systematic uncertainties may be separately parameterized by a Gaussian, Log-Gaussian or flat probability density function (pdf). Some technical details of BPULE have been discussed. (authors)
International Nuclear Information System (INIS)
Lauritzen, B.; Baeverstam, U.; Naadland Holo, E.; Sinkko, K.
1997-12-01
This report deals with Operational Intervention Levels (OILs) in a nuclear or radiation emergency. OILs are defined as the values of environmental measurements, in particular dose rate measurements, above which specific protective actions should be carried out in emergency exposure situations. The derivation and the application of OILs are discussed, and an overview of the presently adopted values is provided, with emphasis on the situation in the Nordic countries. A new, probabilistic approach to derive OILs is presented and the method is illustrated by calculating dose rate OILs in a simplified setting. Contrary to the standard approach, the probabilistic approach allows for optimization of OILs. It is argued, that optimized OILs may be much larger than the presently adopted or suggested values. It is recommended, that the probabilistic approach is further developed and employed in determining site specific OILs and in optimizing environmental measuring strategies. (au)
Optimization-Based Approaches to Control of Probabilistic Boolean Networks
Directory of Open Access Journals (Sweden)
Koichi Kobayashi
2017-02-01
Full Text Available Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, control theory of Boolean networks (BNs, which is well known as a model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods on optimal control of probabilistic Boolean networks (PBNs are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained. The finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
Impact of external events on site evaluation: a probabilistic approach
International Nuclear Information System (INIS)
Jaccarino, E.; Giuliani, P.; Zaffiro, C.
1975-01-01
A probabilistic method is proposed for definition of the reference external events of nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event it is possible to perform statistical analyses to determine the probability of occurrence on site of events of given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, the statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant which must be installed on the site. Thus the reference event could be chosen such that for the whole of the national territory the risk for safety and environmental protection is the same. (author)
xLPR - a probabilistic approach to piping integrity analysis
International Nuclear Information System (INIS)
Harrington, C.; Rudland, D.; Fyfitch, S.
2015-01-01
The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015
All-possible-couplings approach to measuring probabilistic context.
Directory of Open Access Journals (Sweden)
Ehtibar N Dzhafarov
Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i a system outputs several random variables in response to several inputs, (ii for each of these responses only some of the inputs may "directly" influence them, but (iii other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible values of inputs.
Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach
Mwangi, M. W.
2015-12-01
Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, most the probabilistic nature of climate information products is not easily understood by majority of smallholder farmers. Despite the probabilistic nature, climate information have proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and scale up necessary for achieving climate risk development. In addition it also determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included systematic literature review, secondary sources, government statistics, focused group discussions, household surveys and semi-structured interviews. Data wasanalyzed using both quantitative and qualitative data analysis techniques. Quantitative data was analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data was analyzed using qualitative techniques, which involved establishing the categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihood, reduced household poverty and climate change resilience were the impact that resulted from the study.
A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.
Tatsuoka, Kikumi K.
A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…
Understanding poisson regression.
Hayat, Matthew J; Higgins, Melinda
2014-04-01
Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
A random matrix approach to the crossover of energy-level statistics from Wigner to Poisson
International Nuclear Information System (INIS)
Datta, Nilanjana; Kunz, Herve
2004-01-01
We analyze a class of parametrized random matrix models, introduced by Rosenzweig and Porter, which is expected to describe the energy level statistics of quantum systems whose classical dynamics varies from regular to chaotic as a function of a parameter. We compute the generating function for the correlations of energy levels, in the limit of infinite matrix size. The crossover between Poisson and Wigner statistics is measured by a renormalized coupling constant. The model is exactly solved in the sense that, in the limit of infinite matrix size, the energy-level correlation functions and their generating function are given in terms of a finite set of integrals
Unit commitment with probabilistic reserve: An IPSO approach
International Nuclear Information System (INIS)
Lee, Tsung-Ying; Chen, Chun-Lung
2007-01-01
This paper presents a new algorithm for solution of the nonlinear optimal scheduling problem. This algorithm is named the iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve the solution quality and computation efficiency. IPSO is applied to solve the unit commitment with probabilistic reserve problem of a power system. The outage cost as well as fuel cost of thermal units was considered in the unit commitment program to evaluate the level of spinning reserve. The optimal scheduling of on line generation units was reached while minimizing the sum of fuel cost and outage cost. A 48 unit power system was used as a numerical example to test the new algorithm. The optimal scheduling of on line generation units could be reached in the testing results while satisfying the requirement of the objective function
The development of a probabilistic approach to forecast coastal change
Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.
2011-01-01
This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.
Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach
Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody
2006-01-01
The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objectives of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment of the journey of exploring the solar system and beyond: returning to the Moon in the next decade, then venturing further into the solar system, ultimately sending humans to Mars and beyond, LRO has high-visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter Mission project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects to formulate probabilistic risk assessment (PRA). As required, a limited scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that LRO project office uses to perform PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success in an efficient manner, low cost and tight schedule, the traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform PRA for the large system of LRO with more than 14,000 piece parts and over 120 purchased or contractor
Terashima, Yuji
2008-01-01
In this paper, defining Poisson functions on super manifolds, we show that the graphs of Poisson functions are Dirac structures, and find Poisson functions which include as special cases both quasi-Poisson structures and twisted Poisson structures.
Seismic Probabilistic Risk Assessment (SPRA), approach and results
International Nuclear Information System (INIS)
Campbell, R.D.
1995-01-01
During the past 15 years there have been over 30 Seismic Probabilistic Risk Assessments (SPRAs) and Seismic Probabilistic Safety Assessments (SPSAs) conducted of Western Nuclear Power Plants, principally of US design. In this paper PRA and PSA are used interchangeably as the overall process is essentially the same. Some similar assessments have been done for reactors in Taiwan, Korea, Japan, Switzerland and Slovenia. These plants were also principally US supplied or built under US license. Since the restructuring of the governments in former Soviet Bloc countries, there has been grave concern regarding the safety of the reactors in these countries. To date there has been considerable activity in conducting partial seismic upgrades but the overall quantification of risk has not been pursued to the depth that it has in Western countries. This paper summarizes the methodology for Seismic PRA/PSA and compares results of two partially completed and two completed PRAs of soviet designed reactors to results from earlier PRAs on US Reactors. A WWER 440 and a WWER 1000 located in low seismic activity regions have completed PRAs and results show the seismic risk to be very low for both designs. For more active regions, partially completed PRAs of a WWER 440 and WWER 1000 located at the same site show the WWER 440 to have much greater seismic risk than the WWER 1000 plant. The seismic risk from the 1000 MW plant compares with the high end of seismic risk for earlier seismic PRAs in the US. Just as for most US plants, the seismic risk appears to be less than the risk from internal events if risk is measured is terms of mean core damage frequency. However, due to the lack of containment for the earlier WWER 440s, the risk to the public may be significantly greater due to the more probable scenario of an early release. The studies reported have not taken the accident sequences beyond the stage of core damage hence the public heath risk ratios are speculative. (author)
International Nuclear Information System (INIS)
Soriano Pena, A.; Lopez Arroyo, A.; Roesset, J.M.
1976-01-01
The probabilistic and deterministic approaches for calculating the seismic risk of nuclear power plants are both applied to a particular case in Southern Spain. The results obtained by both methods, when varying the input data, are presented and some conclusions drawn in relation to the applicability of the methods, their reliability and their sensitivity to change
Determining reserve requirements in DK1 area of Nord Pool using a probabilistic approach
DEFF Research Database (Denmark)
Saez Gallego, Javier; Morales González, Juan Miguel; Madsen, Henrik
2014-01-01
a probabilistic framework where the reserve requirements are computed based on scenarios of wind power forecast error, load forecast errors and power plant outages. Our approach is first motivated by the increasing wind power penetration in power systems worldwide as well as the current market design of the DK1...... System Operator). © 2014 Elsevier Ltd. All rights reserved....
An approach to handle Real Time and Probabilistic behaviors in e-commerce
DEFF Research Database (Denmark)
Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.
2005-01-01
In this work we describe an approach to deal with systems having at the same time probabilistic and real-time behav- iors. The main goal in the paper is to show the automatic translation from a real time model based on UPPAAL tool, which makes automatic verification of Real Time Systems, to the R...
Use of adjoint methods in the probabilistic finite element approach to fracture mechanics
Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted
1988-01-01
The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.
A probabilistic approach to emission-line galaxy classification
de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.
2017-12-01
We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and WH α versus [N II]/H α (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/H β, log [N II]/H α and log EW(H α) optical parameters. The best-fitting GMM based on several statistical criteria suggests a solution around four Gaussian components (GCs), which are capable to explain up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to their respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence - based on four GCs - for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. The GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not account significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, without lacking the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
Application of probabilistic approach to UP3-A reprocessing plant
Energy Technology Data Exchange (ETDEWEB)
Mercier, J P; Bonneval, F; Weber, M [Institut de Protection et de Surete Nucleaire, Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires, Fontenay-aux-Roses (France)
1992-02-01
In the UP3-A design studies, three complementary approaches were used: - observance of regulations, and good practice; - review of experience feedback; - the correlation of probabilities and consequences making use of an acceptability graph. The latter approach was considered by the safety authorities to be an acceptable practice where the probability calculations were sufficiently accurate. Examples of its application are presented. (author)
Précis of bayesian rationality: The probabilistic approach to human reasoning.
Oaksford, Mike; Chater, Nick
2009-02-01
According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.
A Probabilistic Approach to Tropical Cyclone Conditions of Readiness (TCCOR)
National Research Council Canada - National Science Library
Wallace, Kenneth A
2008-01-01
Tropical Cyclone Conditions of Readiness (TCCOR) are set at DoD installations in the Western Pacific to convey the risk associated with the onset of destructive winds from approaching tropical cyclones...
A Probabilistic Approach to Baffle Bolt IASCC Predictions
International Nuclear Information System (INIS)
Griesbach, Timothy J.; Licina, George J.; Riccardella, Peter C.; Rashid, Joe R.; Nickell, Robert E.
2012-01-01
test results reported in the literature, and plotted as percent of irradiated yield strength versus irradiation dose. This method was used to determine the probability of IASCC occurring at various stress levels, using a Weibull fit of the cumulative failure probability vs. IASCC ratio. To benchmark the model, bolt-by-bolt stresses in a typical PWR were estimated and the accumulated fluence or dose level were used with the model to predict the probabilities (or numbers) of baffle-former bolt failures due to IASCC over time (i.e., at various Effective Full Power Years). The resulting predictions were compared to actual field experience with bolt cracking in several operating PWRs. The model provides a probabilistic estimate of the number of cracked bolts that might be expected to be found during any future refueling outage with inspections of the baffle-former bolts. Such a priori knowledge is important because the plan for inspection of the baffle-former bolts may require additional contingencies depending on the likely outcome. (author)
A probabilistic approach to quantum mechanics based on 'tomograms'
International Nuclear Information System (INIS)
Caponigro, M.; Mancini, S.; Man'ko, V.I.
2006-01-01
It is usually believed that a picture of Quantum Mechanics in terms of true probabilities cannot be given due to the uncertainty relations. Here we discuss a tomographic approach to quantum states that leads to a probability representation of quantum states. This can be regarded as a classical-like formulation of quantum mechanics which avoids the counterintuitive concepts of wave function and density operator. The relevant concepts of quantum mechanics are then reconsidered and the epistemological implications of such approach discussed. (Abstract Copyright [2006], Wiley Periodicals, Inc.)
Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach
Warner, James E.; Hochhalter, Jacob D.
2016-01-01
This work presents a computationally-ecient approach for damage determination that quanti es uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modi cations are made to the traditional Bayesian inference approach to provide signi cant computational speedup. First, an ecient surrogate model is constructed using sparse grid interpolation to replace a costly nite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modi ed using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more eciently. Numerical examples demonstrate that the proposed framework e ectively provides damage estimates with uncertainty quanti cation and can yield orders of magnitude speedup over standard Bayesian approaches.
A probabilistic approach to the drag-based model
Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco
2018-02-01
The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) to Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated to the estimate. We propose a statistical approach for the computation of the ToA using the drag-based model by introducing the probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average value of the absolute differences between measure and forecast is 9.1h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs to create a database of events useful for a further validation of the approach.
Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions
Kaufman, Leyla V.; Wright, Mark G.
2017-01-01
The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in H...
A probabilistic approach to assessing radioactive waste container lifetimes
International Nuclear Information System (INIS)
Porter, F.M.; Naish, C.C.; Sharland, S.M.
1994-01-01
A general methodology has been developed to make assessments of the lifetime of specific radioactive waste container designs in a repository environment. The methodology employs a statistical approach, which aims to reflect uncertainty in the corrosion rates, and the evolution of the environmental conditions. In this paper, the methodology is demonstrated for an intermediate-level waste (ILW) container in the anticipated UK repository situation
Directory of Open Access Journals (Sweden)
Mengmeng Ma
2015-01-01
Full Text Available To solve the invalidation problem of Dempster-Shafer theory of evidence (DS with high conflict in multisensor data fusion, this paper presents a novel combination approach of conflict evidence with different weighting factors using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and introduced correlation coefficient is proposed to characterize not only the difference between basic belief functions (BBAs but also the divergence degree of the hypothesis that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed and Dempster’s rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches
International Nuclear Information System (INIS)
Wilmot, R.D.
2003-11-01
The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations have introduced a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10 -6 for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and more generally as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment
Approach to modeling of human performance for purposes of probabilistic risk assessment
International Nuclear Information System (INIS)
Swain, A.D.
1983-01-01
This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficienct detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspect of behavior are described
Simulation approaches to probabilistic structural design at the component level
International Nuclear Information System (INIS)
Stancampiano, P.A.
1978-01-01
In this paper, structural failure of large nuclear components is viewed as a random process with a low probability of occurrence. Therefore, a statistical interpretation of probability does not apply and statistical inferences cannot be made due to the sparcity of actual structural failure data. In such cases, analytical estimates of the failure probabilities may be obtained from stress-strength interference theory. Since the majority of real design applications are complex, numerical methods are required to obtain solutions. Monte Carlo simulation appears to be the best general numerical approach. However, meaningful applications of simulation methods suggest research activities in three categories: methods development, failure mode models development, and statistical data models development. (Auth.)
Analytic and probabilistic approaches to dynamics in negative curvature
Peigné, Marc; Sambusetti, Andrea
2014-01-01
The work of E. Hopf and G.A. Hedlund, in the 1930s, on transitivity and ergodicity of the geodesic flow for hyperbolic surfaces, marked the beginning of the investigation of the statistical properties and stochastic behavior of the flow. The first central limit theorem for the geodesic flow was proved in the 1960s by Y. Sinai for compact hyperbolic manifolds. Since then, strong relationships have been found between the fields of ergodic theory, analysis, and geometry. Different approaches and new tools have been developed to study the geodesic flow, including measure theory, thermodynamic formalism, transfer operators, Laplace operators, and Brownian motion. All these different points of view have led to a deep understanding of more general dynamical systems, in particular the so-called Anosov systems, with applications to geometric problems such as counting, equirepartition, mixing, and recurrence properties of the orbits. This book comprises two independent texts that provide a self-contained introduction t...
Approaches to probabilistic model learning for mobile manipulation robots
Sturm, Jürgen
2013-01-01
Mobile manipulation robots are envisioned to provide many useful services both in domestic environments as well as in the industrial context. Examples include domestic service robots that implement large parts of the housework, and versatile industrial assistants that provide automation, transportation, inspection, and monitoring services. The challenge in these applications is that the robots have to function under changing, real-world conditions, be able to deal with considerable amounts of noise and uncertainty, and operate without the supervision of an expert. This book presents novel learning techniques that enable mobile manipulation robots, i.e., mobile platforms with one or more robotic manipulators, to autonomously adapt to new or changing situations. The approaches presented in this book cover the following topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating,...
A new probabilistic approach to the microdosimetry of BNCT
International Nuclear Information System (INIS)
Santa Cruz, G.A.; Palmer, M.R.; Kiger, W.S. III; Zamenhof, R.G.; Matatagui, E.
2000-01-01
Using H and E micrographs of glioma, melanoma, and normal brain cells and applying stereological reconstruction, we computed the chord length distributions of the nuclei in these tissues that are the basis of a new approach to microdosimetry. This new formalism, derived from the field of geometric probability, allows the calculation of event statistics and mean specific energy (z F and z D ) values for different kinds of 10 B distributions, including the case of cell surface bound compounds. The results suggest a new method for predicting boron compound efficacy in terms of cell geometry and microscopic boron distribution. The new formalism can be applied to any high-LET particle type such as protons or heavy recoil particles. Illustrative results are presented. (author)
Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures
International Nuclear Information System (INIS)
Caracciolo, Paola
2014-01-01
In this work it is developed a reliable approach and its feasibility to the design and analysis of a composite structures. The metric is compared the robustness and reliability designs versus the traditional design, to demonstrate the gain that can be achieved with a probabilistic approach. The use of the stochastic approach of the uncertain parameteters in combination with the multi-scale levels analysis is the main objective of this paper. The work is dedicated to analyze the uncertainties in the design, tests, manufacturing process, and key gates such as materials characteristic
Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures
Energy Technology Data Exchange (ETDEWEB)
Caracciolo, Paola, E-mail: paola.caracciolo@airbus.com [AIRBUS INDUSTRIES Germany, Department of Airframe Architecture and Integration-Research and Technology-Kreetslag, 10, D-21129, Hamburg (Germany)
2014-05-15
In this work it is developed a reliable approach and its feasibility to the design and analysis of a composite structures. The metric is compared the robustness and reliability designs versus the traditional design, to demonstrate the gain that can be achieved with a probabilistic approach. The use of the stochastic approach of the uncertain parameteters in combination with the multi-scale levels analysis is the main objective of this paper. The work is dedicated to analyze the uncertainties in the design, tests, manufacturing process, and key gates such as materials characteristic.
CLASSIFYING X-RAY BINARIES: A PROBABILISTIC APPROACH
International Nuclear Information System (INIS)
Gopalan, Giri; Bornn, Luke; Vrtilek, Saeqa Dil
2015-01-01
In X-ray binary star systems consisting of a compact object that accretes material from an orbiting secondary star, there is no straightforward means to decide whether the compact object is a black hole or a neutron star. To assist in this process, we develop a Bayesian statistical model that makes use of the fact that X-ray binary systems appear to cluster based on their compact object type when viewed from a three-dimensional coordinate system derived from X-ray spectral data where the first coordinate is the ratio of counts in the mid- to low-energy band (color 1), the second coordinate is the ratio of counts in the high- to low-energy band (color 2), and the third coordinate is the sum of counts in all three bands. We use this model to estimate the probabilities of an X-ray binary system containing a black hole, non-pulsing neutron star, or pulsing neutron star. In particular, we utilize a latent variable model in which the latent variables follow a Gaussian process prior distribution, and hence we are able to induce the spatial correlation which we believe exists between systems of the same type. The utility of this approach is demonstrated by the accurate prediction of system types using Rossi X-ray Timing Explorer All Sky Monitor data, but it is not flawless. In particular, non-pulsing neutron systems containing “bursters” that are close to the boundary demarcating systems containing black holes tend to be classified as black hole systems. As a byproduct of our analyses, we provide the astronomer with the public R code which can be used to predict the compact object type of XRBs given training data
Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H
2014-08-01
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.
Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval
Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene
2018-01-01
Abstract The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints
Wei, Helin; Wang, Kuisheng
2011-11-01
Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of ball grid array packages (PBGA) is demonstrated. The key aspects of the solder joint parameters and the cyclic temperature range related to reliability are involved. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.
Energy Technology Data Exchange (ETDEWEB)
Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science
2017-05-05
The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this bound- ary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
Multiple sequential failure model: A probabilistic approach to quantifying human error dependency
International Nuclear Information System (INIS)
Samanta
1985-01-01
This paper rpesents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs) depending on the available data are discussed. A small-scale psychological experiment was conducted on the nature of human dependency and the interpretation of the experimental data by the MSF model show remarkable accommodation of the dependent failure data. The model, which provides an unique method for quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspect in PRAs
A probabilistic approach for debris impact risk with numerical simulations of debris behaviors
International Nuclear Information System (INIS)
Kihara, Naoto; Matsuyama, Masafumi; Fujii, Naoki
2013-01-01
We propose a probabilistic approach for evaluating the impact risk of tsunami debris through Monte Carlo simulations with a combined system comprising a depth-averaged two-dimensional shallow water model and a discrete element model customized to simulate the motions of floating objects such as vessels. In the proposed method, first, probabilistic tsunami hazard analysis is carried out, and the exceedance probability of tsunami height and numerous tsunami time series for various hazard levels on the offshore side of a target site are estimated. Second, a characteristic tsunami time series for each hazard level is created by cluster analysis. Third, using the Monte Carlo simulation model the debris impact probability with the buildings of interest and the exceedance probability of debris impact speed are evaluated. (author)
DEFF Research Database (Denmark)
Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette
2008-01-01
Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish populations to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from...... the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using...... the probabilistic approach, the daily acute intakes at the 99.9% percentile for adults and children were 11.2 and 28.2 mu g kg(-1) body weight day(-1), representing 5.6% and 14.1% of the ARfD for maneb, respectively. When comparing the point estimate approach with the probabilistic approach, the outcome...
DEFF Research Database (Denmark)
Opper, Manfred; Winther, Ole
2001-01-01
We develop an advanced mean held method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP. where the knowledge of the distribution of couplings between the random variables...... is required. our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with nonglassy behavior? by replica calculations for a wide class of models as well as by simulations for a real data set....
Directory of Open Access Journals (Sweden)
Yan Li
2017-11-01
Full Text Available Due to the volatile and correlated nature of wind speed, a high share of wind power penetration poses challenges to power system production simulation. Existing power system probabilistic production simulation approaches are in short of considering the time-varying characteristics of wind power and load, as well as the correlation between wind speeds at the same time, which brings about some problems in planning and analysis for the power system with high wind power penetration. Based on universal generating function (UGF, this paper proposes a novel probabilistic production simulation approach considering wind speed correlation. UGF is utilized to develop the chronological models of wind power that characterizes wind speed correlation simultaneously, as well as the chronological models of conventional generation sources and load. The supply and demand are matched chronologically to not only obtain generation schedules, but also reliability indices both at each simulation interval and the whole period. The proposed approach has been tested on the improved IEEE-RTS 79 test system and is compared with the Monte Carlo approach and the sequence operation theory approach. The results verified the proposed approach with the merits of computation simplicity and accuracy.
Identification of failure type in corroded pipelines: a bayesian probabilistic approach.
Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J
2010-07-15
Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented to reduce increased hazards. This work proposes a bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.
Nistal-Nuño, Beatriz
2017-03-31
In Chile, a new law introduced in March 2012 lowered the blood alcohol concentration (BAC) limit for impaired drivers from 0.1% to 0.08% and the BAC limit for driving under the influence of alcohol from 0.05% to 0.03%, but its effectiveness remains uncertain. The goal of this investigation was to evaluate the effects of this enactment on road traffic injuries and fatalities in Chile. A retrospective cohort study. Data were analyzed using a descriptive and a Generalized Linear Models approach, type of Poisson regression, to analyze deaths and injuries in a series of additive Log-Linear Models accounting for the effects of law implementation, month influence, a linear time trend and population exposure. A review of national databases in Chile was conducted from 2003 to 2014 to evaluate the monthly rates of traffic fatalities and injuries associated to alcohol and in total. It was observed a decrease by 28.1 percent in the monthly rate of traffic fatalities related to alcohol as compared to before the law (Plaw (Plaw implemented in 2012 in Chile. Chile experienced a significant reduction in alcohol-related traffic fatalities and injuries, being a successful public health intervention.
A Kullback-Leibler approach for 3D reconstruction of spectral CT data corrupted by Poisson noise
Hohweiller, Tom; Ducros, Nicolas; Peyrin, Françoise; Sixou, Bruno
2017-09-01
While standard computed tomography (CT) data do not depend on energy, spectral computed tomography (SPCT) acquire energy-resolved data, which allows material decomposition of the object of interest. Decompo- sitions in the projection domain allow creating projection mass density (PMD) per materials. From decomposed projections, a tomographic reconstruction creates 3D material density volume. The decomposition is made pos- sible by minimizing a cost function. The variational approach is preferred since this is an ill-posed non-linear inverse problem. Moreover, noise plays a critical role when decomposing data. That is why in this paper, a new data fidelity term is used to take into account of the photonic noise. In this work two data fidelity terms were investigated: a weighted least squares (WLS) term, adapted to Gaussian noise, and the Kullback-Leibler distance (KL), adapted to Poisson noise. A regularized Gauss-Newton algorithm minimizes the cost function iteratively. Both methods decompose materials from a numerical phantom of a mouse. Soft tissues and bones are decomposed in the projection domain; then a tomographic reconstruction creates a 3D material density volume for each material. Comparing relative errors, KL is shown to outperform WLS for low photon counts, in 2D and 3D. This new method could be of particular interest when low-dose acquisitions are performed.
Sun, Hui; Wen, Jiayi; Zhao, Yanxiang; Li, Bo; McCammon, J Andrew
2015-12-28
Dielectric boundary based implicit-solvent models provide efficient descriptions of coarse-grained effects, particularly the electrostatic effect, of aqueous solvent. Recent years have seen the initial success of a new such model, variational implicit-solvent model (VISM) [Dzubiella, Swanson, and McCammon Phys. Rev. Lett. 96, 087802 (2006) and J. Chem. Phys. 124, 084905 (2006)], in capturing multiple dry and wet hydration states, describing the subtle electrostatic effect in hydrophobic interactions, and providing qualitatively good estimates of solvation free energies. Here, we develop a phase-field VISM to the solvation of charged molecules in aqueous solvent to include more flexibility. In this approach, a stable equilibrium molecular system is described by a phase field that takes one constant value in the solute region and a different constant value in the solvent region, and smoothly changes its value on a thin transition layer representing a smeared solute-solvent interface or dielectric boundary. Such a phase field minimizes an effective solvation free-energy functional that consists of the solute-solvent interfacial energy, solute-solvent van der Waals interaction energy, and electrostatic free energy described by the Poisson-Boltzmann theory. We apply our model and methods to the solvation of single ions, two parallel plates, and protein complexes BphC and p53/MDM2 to demonstrate the capability and efficiency of our approach at different levels. With a diffuse dielectric boundary, our new approach can describe the dielectric asymmetry in the solute-solvent interfacial region. Our theory is developed based on rigorous mathematical studies and is also connected to the Lum-Chandler-Weeks theory (1999). We discuss these connections and possible extensions of our theory and methods.
Duplicate Detection in Probabilistic Data
Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert
2009-01-01
Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused
Specification of test criteria and probabilistic approach: the case of plutonium air transport
International Nuclear Information System (INIS)
Hubert, P.; Pages, P.; Ringot, C.; Tomachewsky, E.
1989-03-01
The safety of international transportation relies on compliance with IAEA regulations which specify a serie of test which the package is supposed to withstand. For Plutonium air transport some national regulations are more stringent than the IAEA one, namely the US one. For example the drop test is to be performed at 129 m.s -1 instead of 13.4 m.s -1 . The development of international Plutonium exchanges has raised the question of the adequacy of both those standards. The purpose of this paper is to show how a probabilistic approach helps in assessing the efficiency of a move towards more stringent tests
A Probabilistic Approach to Fitting Period–luminosity Relations and Validating Gaia Parallaxes
Energy Technology Data Exchange (ETDEWEB)
Sesar, Branimir; Fouesneau, Morgan; Bailer-Jones, Coryn A. L.; Gould, Andy; Rix, Hans-Walter [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Price-Whelan, Adrian M., E-mail: bsesar@mpia.de [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)
2017-04-01
Pulsating stars, such as Cepheids, Miras, and RR Lyrae stars, are important distance indicators and calibrators of the “cosmic distance ladder,” and yet their period–luminosity–metallicity (PLZ) relations are still constrained using simple statistical methods that cannot take full advantage of available data. To enable optimal usage of data provided by the Gaia mission, we present a probabilistic approach that simultaneously constrains parameters of PLZ relations and uncertainties in Gaia parallax measurements. We demonstrate this approach by constraining PLZ relations of type ab RR Lyrae stars in near-infrared W 1 and W 2 bands, using Tycho- Gaia Astrometric Solution (TGAS) parallax measurements for a sample of ≈100 type ab RR Lyrae stars located within 2.5 kpc of the Sun. The fitted PLZ relations are consistent with previous studies, and in combination with other data, deliver distances precise to 6% (once various sources of uncertainty are taken into account). To a precision of 0.05 mas (1 σ ), we do not find a statistically significant offset in TGAS parallaxes for this sample of distant RR Lyrae stars (median parallax of 0.8 mas and distance of 1.4 kpc). With only minor modifications, our probabilistic approach can be used to constrain PLZ relations of other pulsating stars, and we intend to apply it to Cepheid and Mira stars in the near future.
International Nuclear Information System (INIS)
Barthelet, B.; Ardillon, E.
1997-01-01
The flaw acceptance rules in nuclear components rely on deterministic criteria supposed to ensure the safe operating of plants. The interest of having a reliable method of evaluating the safety margins and the integrity of components led Electricite de France to launch a study to link safety factors with requested reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in Fracture Mechanics. Assuming lognormal distributions of the main random variables, it is possible considering a simple Linear Elastic Fracture Mechanics model, to determine the failure probability as a function of mean values and logarithmic standard deviations. The 'design' failure point can be analytically calculated. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation with reliability target values. The approach is generalized to elastic plastic Fracture Mechanics (piping) by fitting J as a power law function of stress, crack size and yield strength. The simplified approach is validated by detailed probabilistic computations with PROBAN computer program. Assuming reasonable coefficients of variations (logarithmic standard deviations), the method helps to calibrate safety factors for different components taking into account reliability target values in normal, emergency and faulted conditions. Statistical data for the mechanical properties of the main basic materials complement the study. The work involves laboratory results and manufacture data. The results of this study are discussed within a working group of the French in service inspection code RSE-M. (authors)
Illustration of probabilistic approach in consequence assessment of accidental radioactive releases
International Nuclear Information System (INIS)
Pecha, P.; Hofman, R.; Kuca, P.
2008-01-01
We are describing a certain application of uncertainty analysis of environmental model HARP applied on atmospheric and deposition sub-model. Simulation of uncertainties propagation through the model is basic inevitable task bringing data for advanced techniques of probabilistic consequence assessment and further improvement of reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within the grant project supported by GACR (2007-2009). The problem is solved in close cooperation with section of information systems in institute NRPI. The subject of investigation concerns evaluation of consequences of radioactivity propagation after an accidental radioactivity release from nuclear facility.Transport of activity is studied from initial atmospheric propagation, deposition of radionuclides on terrain and spreading through food chains towards human body .Subsequent deposition processes of admixtures and food chain activity transport are modeled. In the final step a hazard estimation based on doses on population is integrated into the software system HARP. Extension to probabilistic approach has increased the complexity substantially, but offers much more informative background for modem methods of estimation accounting for inherent stochastic nature of the problem. Example of probabilistic assessment illustrated here is based on uncertainty analysis of input parameters of SGPM model. Predicted background field of Cs-137 deposition are labelled with index p. as P X SGPM . Final goal is estimation of a certain unknown true background vector χ true , which accounts also for deficiencies of the SGPM formulation in itself insisting in insufficient description of reality. We must have on mind, that even if we know true values of all input parameters θ m true (m= 1 ,..., M) of SGPM model, the χ true still remain uncertain. One possibility how to approach reality insists
Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe
2018-03-15
Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities or exposure duration extrapolation or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the noobserved- adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature was conducted before setting default distributions ranges and shapes for each AF commonly used. A random sampling, using Monte Carlo techniques, is then used for propagating the identified uncertainties and computing the final OEL distribution. Starting from the broad default distributions obtained, experts narrow it to its most likely range, according to the scientific knowledge available for a specific chemical. Introducing distribution rather than single deterministic values allows disclosing and clarifying variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides a better support in decision-making and improves transparency. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
Illustration of probabilistic approach in consequence assessment of accidental radioactive releases
International Nuclear Information System (INIS)
Pecha, P.; Hofman, R.; Kuca, P.
2009-01-01
We are describing a certain application of uncertainty analysis of environmental model HARP applied on atmospheric and deposition sub-model. Simulation of uncertainties propagation through the model is basic inevitable task bringing data for advanced techniques of probabilistic consequence assessment and further improvement of reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within the grant project supported by GACR (2007-2009). The problem is solved in close cooperation with section of information systems in institute NRPI. The subject of investigation concerns evaluation of consequences of radioactivity propagation after an accidental radioactivity release from nuclear facility.Transport of activity is studied from initial atmospheric propagation, deposition of radionuclides on terrain and spreading through food chains towards human body .Subsequent deposition processes of admixtures and food chain activity transport are modeled. In the final step a hazard estimation based on doses on population is integrated into the software system HARP. Extension to probabilistic approach has increased the complexity substantially, but offers much more informative background for modem methods of estimation accounting for inherent stochastic nature of the problem. Example of probabilistic assessment illustrated here is based on uncertainty analysis of input parameters of SGPM model. Predicted background field of Cs-137 deposition are labelled with index p. as P X SGPM . Final goal is estimation of a certain unknown true background vector χ true , which accounts also for deficiencies of the SGPM formulation in itself insisting in insufficient description of reality. We must have on mind, that even if we know true values of all input parameters θ m true (m= 1 ,..., M) of SGPM model, the χ true still remain uncertain. One possibility how to approach reality insists
Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis L; Mandelli, Diego; Zhegang Ma
2014-11-01
As part of the Light Water Sustainability Program (LWRS) [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management with the aim to improve economics, reliability, and sustain safety of current NPPs. In this paper, we describe the RISMC analysis process illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO) wherein offsite power and onsite power is lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe our approach we are using to represent advanced flooding analysis.
Directory of Open Access Journals (Sweden)
Santi Behera
2016-05-01
Full Text Available This work proposes a unique approach for improving voltage stability limit using a Probabilistic Neural Network (PNN classifier that gives corrective controls available in the system in the scenario of contingencies. The sensitivity of system is analyzed to identify weak buses with ENVCI evaluation approaching zero. The input to the classifier, termed as voltage stability enhancing neural network (VSENN classifier, for training are line flows and bus voltages near the notch point of the P–V curve and the output of the VSENN is a control variable. For various contingencies the control action that improves the voltage profile as well as stability index is identified and trained accordingly. The trained VSENN is finally tested for its robustness to improve load margin and ENVCI as well, apart from trained set of operating condition of the system along with contingencies. The proposed approach is verified in IEEE 39-bus test system.
Survey on application of probabilistic fracture mechanics approach to nuclear piping
International Nuclear Information System (INIS)
Kashima, Koichi
1987-01-01
Probabilistic fracture mechanics (PFM) approach is newly developed as one of the tools to evaluate the structural integrity of nuclear components. This report describes the current status of PFM studies for pressure vessel and piping system in light water reactors and focuses on the investigations of the piping failure probability which have been undertaken by USNRC. USNRC reevaluates the double-ended guillotine break (DEGB) of rector coolant piping as a design basis event for nuclear power plant by using the PFM approach. For PWR piping systems designed by Westinghouse, two causes of pipe break are considered: pipe failure due to the crack growth and pipe failure indirectly caused by failure of component supports due to an earthquake. PFM approach shows that the probability of DEGB from either cause is very low and that the effect of earthquake on pipe failure can be neglected. (author)
Probabilistic approach in treatment of deterministic analyses results of severe accidents
International Nuclear Information System (INIS)
Krajnc, B.; Mavko, B.
1996-01-01
Severe accidents sequences resulting in loss of the core geometric integrity have been found to have small probability of the occurrence. Because of their potential consequences to public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation, or generation of non-condensable gases or hydrogen burn, and ultimately coolability of degraded core material. To asses the answer from the containment event trees in the sense of weather certain phenomenological event would happen or not the plant specific deterministic analyses should be performed. Due to the fact that there is a large uncertainty in the prediction of severe accidents phenomena in Level 2 analyses (containment event trees) the combination of probabilistic and deterministic approach should be used. In fact the result of the deterministic analyses of severe accidents are treated in probabilistic manner due to large uncertainty of results as a consequence of a lack of detailed knowledge. This paper discusses approach used in many IPEs, and which assures that the assigned probability for certain question in the event tree represent the probability that the event will or will not happen and that this probability also includes its uncertainty, which is mainly result of lack of knowledge. (author)
Directory of Open Access Journals (Sweden)
Xuefei Guan
2011-01-01
Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other is based on newly developed maximum relative entropy (MRE approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimations are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measures of response variables are then used to update the statistical distributions of random variables and the prognosis results are updated using posterior distributions. Markov Chain Monte Carlo (MCMC technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics are employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated in addition to the qualitative visual comparison. Following this, potential development and improvement for the prognostics-based metrics are discussed in detail.
Assessing dynamic postural control during exergaming in older adults: A probabilistic approach.
Soancatl Aguilar, V; Lamoth, C J C; Maurits, N M; Roerdink, J B T M
2018-02-01
Digital games controlled by body movements (exergames) have been proposed as a way to improve postural control among older adults. Exergames are meant to be played at home in an unsupervised way. However, only few studies have investigated the effect of unsupervised home-exergaming on postural control. Moreover, suitable methods to dynamically assess postural control during exergaming are still scarce. Dynamic postural control (DPC) assessment could be used to provide both meaningful feedback and automatic adjustment of exergame difficulty. These features could potentially foster unsupervised exergaming at home and improve the effectiveness of exergames as tools to improve balance control. The main aim of this study is to investigate the effect of six weeks of unsupervised home-exergaming on DPC as assessed by a recently developed probabilistic model. High probability values suggest 'deteriorated' postural control, whereas low probability values suggest 'good' postural control. In a pilot study, ten healthy older adults (average 77.9, SD 7.2 years) played an ice-skating exergame at home half an hour per day, three times a week during six weeks. The intervention effect on DPC was assessed using exergaming trials recorded by Kinect at baseline and every other week. Visualization of the results suggests that the probabilistic model is suitable for real-time DPC assessment. Moreover, linear mixed model analysis and parametric bootstrapping suggest a significant intervention effect on DPC. In conclusion, these results suggest that unsupervised exergaming for improving DPC among older adults is indeed feasible and that probabilistic models could be a new approach to assess DPC. Copyright © 2017 Elsevier B.V. All rights reserved.
Poisson-Lie T-duality of string effective actions: A new approach to the dilaton puzzle
Czech Academy of Sciences Publication Activity Database
Jurčo, B.; Vysoký, Jan
2018-01-01
Roč. 130, August (2018), s. 1-26 ISSN 0393-0440 Institutional support: RVO:67985840 Keywords : Poisson-Lie T-duality * string effective actions * dilaton field Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.819, year: 2016 https://www.sciencedirect.com/science/article/pii/S0393044018301748?via%3Dihub
Poisson-Lie T-duality of string effective actions: A new approach to the dilaton puzzle
Czech Academy of Sciences Publication Activity Database
Jurčo, B.; Vysoký, Jan
2018-01-01
Roč. 130, August (2018), s. 1-26 ISSN 0393-0440 Institutional support: RVO:67985840 Keywords : Poisson-Lie T- dual ity * string effective actions * dilaton field Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 0.819, year: 2016 https://www.sciencedirect.com/science/article/pii/S0393044018301748?via%3Dihub
A probabilistic approach to the computation of the levelized cost of electricity
International Nuclear Information System (INIS)
Geissmann, Thomas
2017-01-01
This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
International Nuclear Information System (INIS)
Kim, Kunsoo; Gao, Hang
1996-01-01
A probabilistic approach is proposed for the characterization of host rock mechanical properties at the Yucca Mountain site. This approach helps define the probability distribution of rock properties by utilizing extreme value statistics and Monte Carlo simulation. We analyze mechanical property data of tuff obtained by the NNWSI Project to assess the utility of the methodology. The analysis indicates that laboratory measured strength and deformation data of Calico Hills and Bullfrog tuffs follow an extremal. probability distribution (the third type asymptotic distribution of the smallest values). Monte Carlo simulation is carried out to estimate rock mass deformation moduli using a one-dimensional tuff model proposed by Zimmermann and Finley. We suggest that the results of these analyses be incorporated into the repository design
International Nuclear Information System (INIS)
Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun
2018-01-01
Highlights: •We proposed a Probabilistic Safety Culture Healthiness Evaluation Method. •Positive relationship between the ‘success’ states of NSC and performance was shown. •The state probability profile showed a unique ratio regardless of the scenarios. •Cutset analysis provided not only root causes but also the latent causes of failures. •Pro-SCHEMe was found to be applicable to Korea NPPs. -- Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korea NPP utility company has conducted the NSC assessment according to international practice. However, most of methods are conducted by interviews, observations, and the self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on evaluator’s judgement, so the assessment results can be interpreted from different perspectives. To resolve limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide faster evaluation process. This paper presents Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe) to generate quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items which correspond to a basic event in PSA are derived in the first part of the paper through the literature survey; mostly from nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply evaluation items to PSA based on the relationships among such items. The Modeling Guidelines are also suggested to classify and calculate NSC characteristics of
Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach
Directory of Open Access Journals (Sweden)
Kazimierz Banasik
2014-04-01
Full Text Available The variability of the curve number (CN and the retention parameter (S of the Soil Conservation Service (SCS-CN method in a small agricultural, lowland watershed (23.4 km2 to the gauging station in central Poland has been assessed using the probabilistic approach: distribution fitting and confidence intervals (CIs. Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of the goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV, normal and general logistic (GLO distributions for 100-CN and GLO, lognormal and GEV distributions for S. The characteristics estimated from theoretical distribution (median, quantiles were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed a good agreement between the tabulated CN and the median and between the antecedent runoff conditions (ARCs of Hawkins and Hjelmfelt, which certified a good calibration of the model. However, the division of the CN sample due to heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
Kumar, Rajesh; Srivastava, Subodh; Srivastava, Rajeev
2017-07-01
For cancer detection from microscopic biopsy images, image segmentation step used for segmentation of cells and nuclei play an important role. Accuracy of segmentation approach dominate the final results. Also the microscopic biopsy images have intrinsic Poisson noise and if it is present in the image the segmentation results may not be accurate. The objective is to propose an efficient fuzzy c-means based segmentation approach which can also handle the noise present in the image during the segmentation process itself i.e. noise removal and segmentation is combined in one step. To address the above issues, in this paper a fourth order partial differential equation (FPDE) based nonlinear filter adapted to Poisson noise with fuzzy c-means segmentation method is proposed. This approach is capable of effectively handling the segmentation problem of blocky artifacts while achieving good tradeoff between Poisson noise removals and edge preservation of the microscopic biopsy images during segmentation process for cancer detection from cells. The proposed approach is tested on breast cancer microscopic biopsy data set with region of interest (ROI) segmented ground truth images. The microscopic biopsy data set contains 31 benign and 27 malignant images of size 896 × 768. The region of interest selected ground truth of all 58 images are also available for this data set. Finally, the result obtained from proposed approach is compared with the results of popular segmentation algorithms; fuzzy c-means, color k-means, texture based segmentation, and total variation fuzzy c-means approaches. The experimental results shows that proposed approach is providing better results in terms of various performance measures such as Jaccard coefficient, dice index, Tanimoto coefficient, area under curve, accuracy, true positive rate, true negative rate, false positive rate, false negative rate, random index, global consistency error, and variance of information as compared to other
International Nuclear Information System (INIS)
Rydl, A.
2007-01-01
The contribution of radioiodine to risk from a severe accident is recognized to be one of the highest among all the fission products. In a long term (e.g. several days), volatile species of iodine are the most important forms of iodine from the safety point of view. These volatile forms ('volatile iodine') are mainly molecular iodine, I 2 , and various types of organic iodides, RI. A certain controversy exist today among the international research community about the relative importance of the processes leading to volatile iodine formation in containment under severe accident conditions. The amount of knowledge, coming from experiments, of the phenomenology of iodine behavior is enormous and it is embedded in specialized mechanistic or empirical codes. An exhaustive description of the processes governing the iodine behavior in containment is given in reference 1. Yet, all this knowledge is still not enough to resolve some important questions. Moreover, the results of different codes -when applied to relatively simple experiments, such as RTF or CAIMAN - vary widely. Thus, as a complement (or maybe even as an alternative in some instances) to deterministic analyses of iodine behavior, simple probabilistic approach is proposed in this work which could help to see the whole problem in a different perspective. The final goal of using this approach should be the characterization of uncertainties of the description of various processes in question. This would allow for identification of the processes which contribute most significantly to the overall uncertainty of the predictions of iodine volatility in containment. In this work we made a dedicated, small event tree to describe iodine behavior at an accident and we used that tree for a simple sensitivity study. For the evaluation of the tree, the US NRC code EVNTRE was used. To test the proposed probabilistic approach we analyzed results of the integral PHEBUS FPT1 experiment which comprises most of the important
David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera
2017-04-01
This study provides guidance to hydrological researchers which enables them to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
Wan, Wai-Yin; Chan, Jennifer S K
2009-08-01
For time series of count data, correlated measurements, clustering as well as excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might contribute to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and was extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients as well as by identifying useful covariates which affect the count level. The models are implemented using Bayesian method with Markov chain Monte Carlo (MCMC) algorithms and are assessed using deviance information criterion (DIC).
Probabilistic approach to requalification of existing NPPs under aircraft crash loading
International Nuclear Information System (INIS)
Birbraer, A.N.; Roleder, A.J.; Shulman, G.S.
1993-01-01
A probabilistic approach to the analysis of NPP safety under aircraft impact is discussed. It may be used both for requalification of existing NPPs and in the process of NPP design. NPP is considered as a system of components: structures, pipes, different kinds of equipment, soil, foundation. The exceeding of the limit probability of the radioactive products release out of containment (i.e. of the NPP safety requirements non-fulfilment) is taken as a system failure criterion. An example of an event tree representing the consequence of events causing the failure is given. Described are the methods of estimate of elementary events probabilities through which a composite probability of the failure is evaluated. (author)
A Heuristic Probabilistic Approach to Estimating Size-Dependent Mobility of Nonuniform Sediment
Woldegiorgis, B. T.; Wu, F. C.; van Griensven, A.; Bauwens, W.
2017-12-01
Simulating the mechanism of bed sediment mobility is essential for modelling sediment dynamics. Despite the fact that many studies are carried out on this subject, they use complex mathematical formulations that are computationally expensive, and are often not easy for implementation. In order to present a simple and computationally efficient complement to detailed sediment mobility models, we developed a heuristic probabilistic approach to estimating the size-dependent mobilities of nonuniform sediment based on the pre- and post-entrainment particle size distributions (PSDs), assuming that the PSDs are lognormally distributed. The approach fits a lognormal probability density function (PDF) to the pre-entrainment PSD of bed sediment and uses the threshold particle size of incipient motion and the concept of sediment mixture to estimate the PSDs of the entrained sediment and post-entrainment bed sediment. The new approach is simple in physical sense and significantly reduces the complexity and computation time and resource required by detailed sediment mobility models. It is calibrated and validated with laboratory and field data by comparing to the size-dependent mobilities predicted with the existing empirical lognormal cumulative distribution function (CDF) approach. The novel features of the current approach are: (1) separating the entrained and non-entrained sediments by a threshold particle size, which is a modified critical particle size of incipient motion by accounting for the mixed-size effects, and (2) using the mixture-based pre- and post-entrainment PSDs to provide a continuous estimate of the size-dependent sediment mobility.
Numerical probabilistic analysis for slope stability in fractured rock masses using DFN-DEM approach
Directory of Open Access Journals (Sweden)
Alireza Baghbanan
2017-06-01
Full Text Available Due to existence of uncertainties in input geometrical properties of fractures, there is not any unique solution for assessing the stability of slopes in jointed rock masses. Therefore, the necessity of applying probabilistic analysis in these cases is inevitable. In this study a probabilistic analysis procedure together with relevant algorithms are developed using Discrete Fracture Network-Distinct Element Method (DFN-DEM approach. In the right abutment of Karun 4 dam and downstream of the dam body, five joint sets and one major joint have been identified. According to the geometrical properties of fractures in Karun river valley, instability situations are probable in this abutment. In order to evaluate the stability of the rock slope, different combinations of joint set geometrical parameters are selected, and a series of numerical DEM simulations are performed on generated and validated DFN models in DFN-DEM approach to measure minimum required support patterns in dry and saturated conditions. Results indicate that the distribution of required bolt length is well fitted with a lognormal distribution in both circumstances. In dry conditions, the calculated mean value is 1125.3 m, and more than 80 percent of models need only 1614.99 m of bolts which is a bolt pattern with 2 m spacing and 12 m length. However, as for the slopes with saturated condition, the calculated mean value is 1821.8 m, and more than 80 percent of models need only 2653.49 m of bolts which is equivalent to a bolt pattern with 15 m length and 1.5 m spacing. Comparison between obtained results with numerical and empirical method show that investigation of a slope stability with different DFN realizations which conducted in different block patterns is more efficient than the empirical methods.
Directory of Open Access Journals (Sweden)
Goker Erdogan
2015-11-01
Full Text Available People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models-that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model's percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects' ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.
Probabilistic Logic and Probabilistic Networks
Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.
2009-01-01
While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches
2012-05-17
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...
International Nuclear Information System (INIS)
Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.
1996-01-01
A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This probabilistic approach consists of 4 steps as follows: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied for Pu-induced internal dose assessment and a multi-compartment dosimetric model is used for internal transport. In this approach, surrogate models of original system are constructed using response and neural network. And the results of these surrogate models are compared with those of original model. Each surrogate model well approximates the original model. The uncertainty and sensitivity analysis of the model parameters are evaluated in this process. Dominant contributors to each organ are identified and the results show that this approach could serve a good tool of assessing the internal radiation exposure
Probabilistic approach to the prediction of radioactive contamination of agricultural production
International Nuclear Information System (INIS)
Fesenko, S.F.; Chernyaeva, L.G.; Sanzharova, N.I.; Aleksakhin, R.M.
1993-01-01
The organization of agricultural production on the territory contaminated as a result of the Chernobyl reactor disaster involves prediction of the content of radionuclides in agro-industrial products. Traditional methods of prediting the contamination in the products does not give sufficient agreement with actual data and as a result it is difficult to make the necessary decisions about eliminating the consequences of the disaster in the agro-industrial complex. In many ways this is because the available methods are based on data on the radionuclide content in soils, plants, and plant and animal products. The parameters of the models used in the prediction are also evaluated on the basis of these results. Even if obtained from a single field or herd of livestock, however, such indicators have substantial variation coefficients due to various factors such as the spatial structure of the fallouts, the variability of the soil properties, the sampling error, the errors of processing and measuring the samples, and well as the data-averaging error. Consequently the parameters of the radionuclide transfer along the agricultural chains are very variable, thus considerably reducing the reliability of predicted values. The reliability of the prediction of radioactive contamination of agricultural products can be increased substantially by taking a probabilistic approach involving information about the random laws of contamination of farming land and the statistical features of the parameters of radionuclie migration along food chains. Considering the above, comparative analysis is made of the results obtained on the basis of the traditional treatment (deterministic in the simplest form) and its probabilistic analog
Probabilistic methods in exotic option pricing
Anderluh, J.H.M.
2007-01-01
The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover options on commidities are considered and double-sided barrier options in a compound Poisson framework.
A probabilistic approach for optimal sensor allocation in structural health monitoring
International Nuclear Information System (INIS)
Azarbayejani, M; Reda Taha, M M; El-Osery, A I; Choi, K K
2008-01-01
Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial to determine the optimal number and locations of sensors in SHM systems. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network trained from simulations using a priori knowledge about damage locations and damage severities to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure
Directory of Open Access Journals (Sweden)
Denise Margareth Kazue Nishimura Kunitaki
2008-01-01
Full Text Available The “torpedo” pile is a foundation system that has been recently considered to anchor mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of analysis methods to evaluate its final penetration depth. Therefore, this paper deals with methodologies for the assessment of the sensitivity of the response to the variation of the uncertain parameters and mainly to incorporate into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and “possibilistic” approaches are considered, involving, respectively, the Monte Carlo method (MC and concepts of fuzzy arithmetic (FA. The results and performance of both approaches are compared, stressing the ability of the latter approach to efficiently deal with the uncertainties of the model, with outstanding computational efficiency, and therefore, to comprise an effective design tool.
A probabilistic approach to the management of multi-stage multicriteria process
Directory of Open Access Journals (Sweden)
Yu. V. Bugaev
2017-01-01
Full Text Available Currently, any production process is viewed as the primary means of profit and competitiveness, in other words, becomes the dominant process approach. In this approach, the final product production appears network of interconnected processing steps during which the conversion of inputs into outputs, with a stable, accurate executable, high process most efficiently and cost-effectively provides a planned quality. An example is the organization of bread production. For the modern period is characterized by the classical recovery technology that allows to improve the palatability of bread, enhance its flavor, longer-lasting freshness. Baking is a process to be controlled in order to obtain the required quality parameters of the final product. One of the new and promising methods of quality management processes is a probabilistic method to determine the increase in the probability of release of quality products within the resources allocated for measures to improve the quality level. The paper was applied a quality management concept is based on a probabilistic approach for the multi-step process, which consists in the fact that as one of the main criteria adopted by the probability of release of high-quality products. However, it is obvious that the implementation of certain measures for its improvement requires the connection of certain resources that, first of all, is inevitably associated with certain cash costs. Thus, we arrive at an optimal control problem, which has at least two criteria - probability qualitative completion of the process, which should be maximized and the total costs of the corrective measures that need to be minimized. The authors of the idealized model of optimal control has been developed for the case when a single event affects only a single step. a special case of vector Uorshall-Floyd algorithm was used to optimize the structure of a multi-step process. The use of vector optimization on graphs allowed the authors to
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
International Nuclear Information System (INIS)
Swain, A.D.
1986-01-01
The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance
Fraldi, M.; Perrella, G.; Ciervo, M.; Bosia, F.; Pugno, N. M.
2017-09-01
Very recently, a Weibull-based probabilistic strategy has been successfully applied to bundles of wires to determine their overall stress-strain behaviour, also capturing previously unpredicted nonlinear and post-elastic features of hierarchical strands. This approach is based on the so-called "Equal Load Sharing (ELS)" hypothesis by virtue of which, when a wire breaks, the load acting on the strand is homogeneously redistributed among the surviving wires. Despite the overall effectiveness of the method, some discrepancies between theoretical predictions and in silico Finite Element-based simulations or experimental findings might arise when more complex structures are analysed, e.g. helically arranged bundles. To overcome these limitations, an enhanced hybrid approach is proposed in which the probability of rupture is combined with a deterministic mechanical model of a strand constituted by helically-arranged and hierarchically-organized wires. The analytical model is validated comparing its predictions with both Finite Element simulations and experimental tests. The results show that generalized stress-strain responses - incorporating tension/torsion coupling - are naturally found and, once one or more elements break, the competition between geometry and mechanics of the strand microstructure, i.e. the different cross sections and helical angles of the wires in the different hierarchical levels of the strand, determines the no longer homogeneous stress redistribution among the surviving wires whose fate is hence governed by a "Hierarchical Load Sharing" criterion.
Mbaya, Timmy
Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
Application of a time probabilistic approach to seismic landslide hazard estimates in Iran
Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.
2009-04-01
Iran is a country located in a tectonic active belt and is prone to earthquake and related phenomena. In the recent years, several earthquakes caused many fatalities and damages to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil Earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 peoples and cut many important road and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose a time probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from the Newmark's model. First, by adopting Arias Intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 year). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros, were used to represent, according to the Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate
Directory of Open Access Journals (Sweden)
S. Raia
2014-03-01
Full Text Available Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lays in the difficulty in obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS code. The new code (TRIGRS-P adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs
A probabilistic approach for the interpretation of RNA profiles as cell type evidence.
de Zoete, Jacob; Curran, James; Sjerps, Marjan
2016-01-01
DNA profiles can be used as evidence to distinguish between possible donors of a crime stain. In some cases, both the prosecution and the defence claim that the cell material was left by the suspect but they dispute which cell type was left behind. For example, in sexual offense cases the prosecution could claim that the sample contains semen cells where the defence argues that the sample contains skin cells. In these cases, traditional methods (e.g. a phosphatase test) can be used to examine the cell type contained in the sample. However, there are some drawbacks when using these methods. For instance, many of these techniques need to be carried out separately for each cell type and each of them requires part of the available sample, which reduces the amount that can be used for DNA analysis. Another option is messenger RNA (mRNA) evidence. mRNA expression levels vary among cell types and can be used to make (probability) statements about the cell type(s) present in a sample. Existing methods for the interpretation of RNA profiles as evidence for the presence of certain cell types aim at making categorical statements. Such statements limit the possibility to report the associated uncertainty. Some of these existing methods will be discussed. Most notably, a method based on a 'n/2' scoring rule (Lindenbergh et al.) and a method using marker values and cell type scoring thresholds (Roeder et al.). From a statistical point of view, a probabilistic approach is the most obvious choice. Two approaches (multinomial logistic regression and naïve Bayes') are suggested. All methods are compared, using two different datasets and several criteria regarding their ability to assess the evidential value of RNA profiles. We conclude that both the naïve Bayes' method and a method based on multinomial logistic regression, that produces a probabilistic statement as measure of the evidential value, are an important improvement of the existing methods. Besides a better performance
Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura
2013-07-01
The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted non-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
(Quasi-)Poisson enveloping algebras
Yang, Yan-Hong; Yao, Yuan; Ye, Yu
2010-01-01
We introduce the quasi-Poisson enveloping algebra and Poisson enveloping algebra for a non-commutative Poisson algebra. We prove that for a non-commutative Poisson algebra, the category of quasi-Poisson modules is equivalent to the category of left modules over its quasi-Poisson enveloping algebra, and the category of Poisson modules is equivalent to the category of left modules over its Poisson enveloping algebra.
A Probabilistic Alternative Approach to Optimal Project Profitability Based on the Value-at-Risk
Directory of Open Access Journals (Sweden)
Yonggu Kim
2018-03-01
Full Text Available This paper focuses on an investment decision-making process for sustainable development based on the profitability impact factors for overseas projects. Investors prefer to use the discounted cash-flow method. Although this method is simple and straightforward, its critical weakness is its inability to reflect the factor volatility associated with the project evaluation. To overcome this weakness, the Value-at-Risk method is used to apply the volatility of the profitability impact factors, thereby reflecting the risks and establishing decision-making criteria for risk-averse investors. Risk-averse investors can lose relatively acceptable investment opportunities to risk-neutral or risk-amenable investors due to strict investment decision-making criteria. To overcome this problem, critical factors are selected through a Monte Carlo simulation and a sensitivity analysis, and solutions to the critical-factor problems are then found by using the Theory of Inventive Problem Solving and a business version of the Project Definition Rating Index. This study examines the process of recovering investment opportunities with projects that are investment feasible and that have been rejected when applying the criterion of the Value-at-Risk method. To do this, a probabilistic alternative approach is taken. To validate this methodology, the proposed framework for an improved decision-making process is demonstrated using two actual overseas projects of a Korean steel-making company.
Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager
2010-03-15
White phosphorus (P(4)) is a highly toxic compound used in various pyrotechnic products. Ammunitions containing P(4) are widely used in military training areas where the unburned products of P(4) contaminate soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P(4) in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P(4) by ingestion, for example. The current approach to assess the environmental risk from the use of P(4) suggests a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P(4). Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool where the risk for P(4) poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P(4) was detected in several craters on the range at concentrations up to 5.7g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.
A robust probabilistic approach for variational inversion in shallow water acoustic tomography
International Nuclear Information System (INIS)
Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P
2009-01-01
This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance–covariance matrix associated with the above-weighted quadratic background, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimum reduced number of non-correlated latent variables η, which determine a new control vector and a new regularization term, expressed as η T η. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method
International Nuclear Information System (INIS)
Vincent, L.
2012-01-01
The present study deals with the long-term mechanical behaviour and damage of structural materials in nuclear power plants. An experimental way is first followed to study the thermal fatigue of austenitic stainless steels with a focus on the effects of mean stress and bi-axiality. Furthermore, the measurement of displacement fields by Digital Image Correlation techniques has been successfully used to detect early crack initiation during high cycle fatigue tests. A probabilistic model based on the shielding zones surrounding existing cracks is proposed to describe the development of crack networks. A more numeric way is then followed to study the embrittlement consequences of the irradiation hardening of the bainitic steel constitutive of nuclear pressure vessels. A crystalline plasticity law, developed in agreement with lower scale results (Dislocation Dynamics), is introduced in a Finite Element code in order to run simulations on aggregates and obtain the distributions of the maximum principal stress inside a Representative Volume Element. These distributions are then used to improve the classical Local Approach to Fracture which estimates the probability for a microstructural defect to be loaded up to a critical level. (author) [fr
Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach
International Nuclear Information System (INIS)
Nuryanti; Akhmad Hidayatno; Erlinda Muslim
2013-01-01
One thing that is quite crucial to be reviewed prior to any investment decision on the nuclear power plant (NPP) project is the calculation of project economic, including calculation of Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPP’s project are vulnerable to a number of uncertainty variables. Information on the uncertainty variables which makes LUEC’s value quite sensitive due to the changes of them is necessary in order the cost overrun can be avoided. Therefore this study aimed to do the sensitivity analysis on variables that affect LUEC with probabilistic approaches. This analysis was done by using Monte Carlo technique that simulate the relationship between the uncertainty variables and visible impact on LUEC. The sensitivity analysis result shows the significant changes on LUEC value of AP1000 and OPR due to the sensitivity of investment cost and capacity factors. While LUEC changes due to sensitivity of U 3 O 8 ’s price looks not quite significant. (author)
Improving PERSIANN-CCS rain estimation using probabilistic approach and multi-sensors information
Karbalaee, N.; Hsu, K. L.; Sorooshian, S.; Kirstetter, P.; Hong, Y.
2016-12-01
This presentation discusses the recent implemented approaches to improve the rainfall estimation from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System (PERSIANN-CCS). PERSIANN-CCS is an infrared (IR) based algorithm being integrated in the IMERG (Integrated Multi-Satellite Retrievals for the Global Precipitation Mission GPM) to create a precipitation product in 0.1x0.1degree resolution over the chosen domain 50N to 50S every 30 minutes. Although PERSIANN-CCS has a high spatial and temporal resolution, it overestimates or underestimates due to some limitations.PERSIANN-CCS can estimate rainfall based on the extracted information from IR channels at three different temperature threshold levels (220, 235, and 253k). This algorithm relies only on infrared data to estimate rainfall indirectly from this channel which cause missing the rainfall from warm clouds and false estimation for no precipitating cold clouds. In this research the effectiveness of using other channels of GOES satellites such as visible and water vapors has been investigated. By using multi-sensors the precipitation can be estimated based on the extracted information from multiple channels. Also, instead of using the exponential function for estimating rainfall from cloud top temperature, the probabilistic method has been used. Using probability distributions of precipitation rates instead of deterministic values has improved the rainfall estimation for different type of clouds.
International Nuclear Information System (INIS)
Ali, S A; Kim, D-H; Cafaro, C; Giffin, A
2012-01-01
Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M S . Special focus is devoted to a description of the roles played by the sectional curvature K M S , the Jacobi field intensity J M S and the information geometrodynamical entropy S M S . These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M S . Finally, the application of such information-geometric techniques to several theoretical models is presented.
International Nuclear Information System (INIS)
Aman, S N A; Latif, Z Abd; Pradhan, B
2014-01-01
Landslide occurrence depends on various interrelating factors which consequently initiate to massive mass of soil and rock debris that move downhill due to the gravity action. LiDAR has come with a progressive approach in mitigating landslide by permitting the formation of more accurate DEM compared to other active space borne and airborne remote sensing techniques. The objective of this research is to assess the susceptibility of landslide in Ulu Klang area by investigating the correlation between past landslide events with geo environmental factors. A high resolution LiDAR DEM was constructed to produce topographic attributes such as slope, curvature and aspect. These data were utilized to derive second deliverables of landslide parameters such as topographic wetness index (TWI), surface area ratio (SAR) and stream power index (SPI) as well as NDVI generated from IKONOS imagery. Subsequently, a probabilistic based frequency ratio model was applied to establish the spatial relationship between the landslide locations and each landslide related factor. Factor ratings were summed up to obtain Landslide Susceptibility Index (LSI) to construct the landslide susceptibility map
Directory of Open Access Journals (Sweden)
Malika Virah-Sawmy
2015-07-01
By generating robust probabilistic baseline scenarios, exponential smoothing models can facilitate the effectiveness of REDD+ payments, support a more efficient allocation of scarce conservation resources, and improve our understanding of effective forest conservation investments, also beyond REDD+.
International Nuclear Information System (INIS)
Kusmierek, J.; Plachcinska, A.
1999-01-01
Background: The Bayesian (probabilistic) approach to the results of a diagnostic test appears to be more informative than an interpretation of results in binary terms (having disease or not). The aim of our study was the analysis of the effect of an optimised evaluation of myocardium perfusion scintigrams on the probability of CAD in individual patients. Methods: 197 patients (132 males and 65 females) suspected of CAD, with no history of myocardial infarction were examined. Scintigraphic images were evaluated applying two methods of analysis: visual (semiquantitative) and quantitative, and the combination of both. The sensitivity and specificity of both methods (and their combination) in the detection of CAD were determined and optimal methods of scintigram evaluation, separately for males and females, were selected. All patients were subjected to coronary angiography. The pre-test probability of CAD was assessed according to Diamond (1) and the post-test probability was evaluated in accordance with Bayes's theorem. Patients were divided, according to a pre-test probability of CAD, into 3 groups: with low, medium and high probability of the disease. The same subdivision was made in relation to post-test probability of CAD. The numbers of patients in respective subgroups, before and after the test, were compared. Moreover, in order to test the reliability of post-test probability, its values were compared with real percentages of CAD occurrence among the patients under study, as demonstrated by the angiography. Results: The combination of visual and quantitative methods was accepted as the optimal method of male scintigram evaluation (with sensitivity and specificity equalling 95% and 82%, respectively) and a sole quantitative analysis as the optimal method of female scintigram evaluation (sensitivity and specificity amounted to 81% and 84%, respectively). In the subgroup of males the percentage of individuals with medium pre-test CAD probability equalled 52 and
Registration of indoor TLS data: in favor of a probabilistic approach initialized by geo-location
International Nuclear Information System (INIS)
Hullo, Jean-Francois
2013-01-01
Many pre-maintenance operations of industrial facilities currently resort on to three dimensional CAD models. The acquisition of these models is performed from point clouds measured by Terrestrial Laser Scanning (TLS). When the scenes are complex, several view points for scanning, also known as stations, are necessary to ensure the completeness and the density of the survey data. The generation of a global point cloud, i.e. the expression of all the acquired data in a common reference frame, is a crucial step called registration. During this process, the pose parameters are estimated. If the GNSS Systems are now a solution for many outdoor scenes, the registration of indoor TLS data still remains a challenge. The goal of this thesis is to improve the acquisition process of TLS data in industrial environments. The aim is to guarantee the precision and accuracy of acquired data, while optimizing on-site acquisition time and protocols by, as often as possible, freeing the operator from the constraints inherent to conventional topography surveys. In a first part, we consider the state of the art of the means and methods used during the acquisition of dense point clouds of complex interior scenes (Part I). In a second part, we study and evaluate the data available for the registration: terrestrial laser data, primitive reconstruction algorithms in point clouds and indoor geo-location Systems (Part II). In the third part, we then formalize and experiment a registration algorithm based on the use of matched primitives, reconstructed from per station point clouds ( Part III). We finally propose a probabilistic approach for matching primitives, allowing the integration of a priori information and uncertainty in the constraints System used for calculating poses (Part IV). The contributions of our work are as follows: - to take a critical look at current methods of TLS data acquisition in industrial environments, - to evaluate, through experimentations, the information
Need to use probabilistic risk approach in performance assessment of waste disposal facilities
International Nuclear Information System (INIS)
Bonano, E.J.; Gallegos, D.P.
1991-01-01
Regulations governing the disposal of radioactive, hazardous, and/or mixed wastes will likely require, either directly or indirectly, that the performance of disposal facilities be assessed quantitatively. Such analyses, commonly called ''performance assessments,'' rely on the use of predictive models to arrive at a quantitative estimate of the potential impact of disposal on the environment and the safety and health of the public. It has been recognized that a suite of uncertainties affect the results of a performance assessment. These uncertainties are conventionally categorized as (1) uncertainty in the future state of the disposal system (facility and surrounding medium), (2) uncertainty in models (including conceptual models, mathematical models, and computer codes), and (3) uncertainty in data and parameters. Decisions regarding the suitability of a waste disposal facility must be made in light of these uncertainties. Hence, an approach is needed that would allow the explicit consideration of these uncertainties so that their impact on the estimated consequences of disposal can be evaluated. While most regulations for waste disposal do not prescribe the consideration of uncertainties, it is proposed that, even in such cases, a meaningful decision regarding the suitability of a waste disposal facility cannot be made without considering the impact of the attendant uncertainties. A probabilistic risk assessment (PRA) approach provides the formalism for considering the uncertainties and the technical basis that the decision makers can use in discharging their duties. A PRA methodology developed and demonstrated for the disposal of high-level radioactive waste provides a general framework for assessing the disposal of all types of wastes (radioactive, hazardous, and mixed). 15 refs., 1 fig., 1 tab
Cislaghi, Alessio; Rigon, Emanuel; Lenzi, Mario Aristide; Bischetti, Gian Battista
2018-04-01
Large wood (LW) plays a key role in physical, chemical, environmental, and biological processes in most natural and seminatural streams. However, it is also a source of hydraulic hazard in anthropised territories. Recruitment from fluvial processes has been the subject of many studies, whereas less attention has been given to hillslope recruitment, which is linked to episodic and spatially distributed events and requires a reliable and accurate slope stability model and a hillslope-channel transfer model. The purpose of this study is to develop an innovative LW hillslope-recruitment estimation approach that combines forest stand characteristics in a spatially distributed form, a probabilistic multidimensional slope stability model able to include the reinforcement exerted by roots, and a hillslope-channel transfer procedure. The approach was tested on a small mountain headwater catchment in the eastern Italian Alps that is prone to shallow landslide and debris flow phenomena. The slope stability model (that had not been calibrated) provided accurate performances, in terms of unstable areas identification according to the landslide inventory (AUC = 0.832) and of LW volume estimation in comparison with LW volume produced by inventoried landslides (7702 m3 corresponding to a recurrence time of about 30 years in the susceptibility curve). The results showed that most LW potentially mobilised by landslides does not reach the channel network (only about 16%), in agreement with the few data reported by other studies, as well as the data normalized for unit length of channel and unit length of channel per year (0-116 m3/km and 0-4 m3/km y-1). This study represents an important contribution to LW research. A rigorous and site-specific estimation of LW hillslope recruitment should, in fact, be an integral part of more general studies on LW dynamics, for forest planning and management, and positioning in-channel wood retention structures.
McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George
2017-03-01
Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
Borys, Przemysław
2012-06-01
Rat prostate cancer cells have been previously investigated using two cell lines: a highly metastatic one (Mat-Ly-Lu) and a nonmetastatic one (AT-2). It turns out that the highly metastatic Mat-Ly-Lu cells exhibit a phenomenon of cathodal galvanotaxis in an electric field which can be blocked by interrupting the voltage-gated sodium channel (VGSC) activity. The VGSC activity is postulated to be characteristic for metastatic cells and seems to be a reasonable driving force for motile behavior. However, the classical theory of cellular motion depends on calcium ions rather than sodium ions. The current research provides a theoretical connection between cellular sodium inflow and cathodal galvanotaxis of Mat-Ly-Lu cells. Electrical repulsion of intracellular calcium ions by entering sodium ions is proposed after depolarization starting from the cathodal side. The disturbance in the calcium distribution may then drive actin polymerization and myosin contraction. The presented modeling is done within a continuous one-dimensional Poisson-Nernst-Planck equation framework.
ProLBB - A Probabilistic Approach to Leak Before Break Demonstration
Energy Technology Data Exchange (ETDEWEB)
Dillstroem, Peter; Weilin Zang (Inspecta Technology AB, Stockholm (SE))
2007-11-15
Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiler Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional
ProLBB - A Probabilistic Approach to Leak Before Break Demonstration
International Nuclear Information System (INIS)
Dillstroem, Peter; Weilin Zang
2007-11-01
Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiler Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional
Schad, Thorsten; Schulz, Ralf
2011-10-01
The quantification of risk (the likelihood and extent of adverse effects) is a prerequisite in regulatory decision making for plant protection products and is the goal of the Xplicit project. In its present development stage, realism is increased in the exposure assessment (EA), first by using real-world data on, e.g., landscape factors affecting exposure, and second, by taking the variability of key factors into account. Spatial and temporal variability is explicitly addressed. Scale dependencies are taken into account, which allows for risk quantification at different scales, for example, at landscape scale, an overall picture of the potential exposure of nontarget organisms can be derived (e.g., for all off-crop habitats in a given landscape); at local scale, exposure might be relevant to assess recovery and recolonization potential; intermediate scales might best refer to population level and hence might be relevant for risk management decisions (e.g., individual off-crop habitats). The Xplicit approach is designed to comply with a central paradigm of probabilistic approaches, namely, that each individual case that is derived from the variability functions employed should represent a potential real-world case. This is mainly achieved by operating in a spatiotemporally explicit fashion. Landscape factors affecting the local exposure of habitats of nontarget species (i.e., receptors) are derived from geodatabases. Variability in time is resolved by operating at discrete time steps, with the probability of events (e.g., application) or conditions (e.g., wind conditions) defined in probability density functions (PDFs). The propagation of variability of parameters into variability of exposure and risk is done using a Monte Carlo approach. Among the outcomes are expectancy values on the realistic worst-case exposure (predicted environmental concentration [PEC]), the probability p that the PEC exceeds the ecologically acceptable concentration (EAC) for a given
International Nuclear Information System (INIS)
Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang
2014-01-01
MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10 −4 ), 283 for the intensity approach (p = 2 × 10 −6 ) and 282
SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms
Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB
2016-01-01
We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Approximately 12 individuals are needed for optimal results based on simulations. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organisms, including non-model ones for which nothing is known a priori, that can be bred in the lab, are suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231
Validation of the probabilistic approach for the analysis of PWR transients
International Nuclear Information System (INIS)
Amesz, J.; Francocci, G.F.; Clarotti, C.
1978-01-01
This paper reviews the pilot study at present being carried out on the validation of probabilistic methodology with real data coming from the operational records of the PWR power station at Obrigheim (KWO, Germany) operating since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology, with the posteriori analysis of transients that actually occurred at a power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above mentioned initiating events. The paper describes the a priori calculations performed using a fault-tree analysis by means of a probabilistic code (SALP 3) and event-trees coupled with a PWR system deterministic computer code (LOOP 7). Finally the principle results of these analyses are presented and critically reviewed
Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.
2010-01-01
Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: Minimizes the utilization of individual seismic phase detections - in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation, Has a formalized framework that utilizes information from non-detecting stations, Has a formalized framework that utilizes source information, in
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. It must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or
Homogeneous Poisson structures
International Nuclear Information System (INIS)
Shafei Deh Abad, A.; Malek, F.
1993-09-01
We provide an algebraic definition for Schouten product and give a decomposition for any homogenenous Poisson structure in any n-dimensional vector space. A large class of n-homogeneous Poisson structures in R k is also characterized. (author). 4 refs
International Nuclear Information System (INIS)
Littlejohn, R.G.
1982-01-01
The Hamiltonian structures discovered by Morrison and Greene for various fluid equations were obtained by guessing a Hamiltonian and a suitable Poisson bracket formula, expressed in terms of noncanonical (but physical) coordinates. In general, such a procedure for obtaining a Hamiltonian system does not produce a Hamiltonian phase space in the usual sense (a symplectic manifold), but rather a family of symplectic manifolds. To state the matter in terms of a system with a finite number of degrees of freedom, the family of symplectic manifolds is parametrized by a set of Casimir functions, which are characterized by having vanishing Poisson brackets with all other functions. The number of independent Casimir functions is the corank of the Poisson tensor J/sup ij/, the components of which are the Poisson brackets of the coordinates among themselves. Thus, these Casimir functions exist only when the Poisson tensor is singular
International Nuclear Information System (INIS)
Aly, A M; Christenson, R E
2008-01-01
Smart damping technology has been proposed to protect civil structures from dynamic loads. Each application of smart damping control provides varying levels of performance relative to active and passive control strategies. Currently, researchers compare the relative efficacy of smart damping control to active and passive strategies by running numerous simulations. These simulations can require significant computation time and resources. Because of this, it is desirable to develop an approach to assess the applicability of smart damping technology which requires less computation time. This paper discusses and verifies a probabilistic approach to determine the efficacy of smart damping technology based on clipped optimal state feedback control theory
Schwartz, Joel D.; Lee, Mihye; Kinney, Patrick L.; Yang, Suijia; Mills, David; Sarofim, Marcus C.; Jones, Russell; Streeter, Richard; St. Juliana, Alexis; Peers, Jennifer;
2015-01-01
Background: A warming climate will affect future temperature-attributable premature deaths. This analysis is the first to project these deaths at a near national scale for the United States using city and month-specific temperature-mortality relationships. Methods: We used Poisson regressions to model temperature-attributable premature mortality as a function of daily average temperature in 209 U.S. cities by month. We used climate data to group cities into clusters and applied an Empirical Bayes adjustment to improve model stability and calculate cluster-based month-specific temperature-mortality functions. Using data from two climate models, we calculated future daily average temperatures in each city under Representative Concentration Pathway 6.0. Holding population constant at 2010 levels, we combined the temperature data and cluster-based temperature-mortality functions to project city-specific temperature-attributable premature deaths for multiple future years which correspond to a single reporting year. Results within the reporting periods are then averaged to account for potential climate variability and reported as a change from a 1990 baseline in the future reporting years of 2030, 2050 and 2100. Results: We found temperature-mortality relationships that vary by location and time of year. In general, the largest mortality response during hotter months (April - September) was in July in cities with cooler average conditions. The largest mortality response during colder months (October-March) was at the beginning (October) and end (March) of the period. Using data from two global climate models, we projected a net increase in premature deaths, aggregated across all 209 cities, in all future periods compared to 1990. However, the magnitude and sign of the change varied by cluster and city. Conclusions: We found increasing future premature deaths across the 209 modeled U.S. cities using two climate model projections, based on constant temperature
On poisson-stopped-sums that are mixed poisson
Valero Baya, Jordi; Pérez Casany, Marta; Ginebra Molins, Josep
2013-01-01
Maceda (1948) characterized the mixed Poisson distributions that are Poisson-stopped-sum distributions based on the mixing distribution. In an alternative characterization of the same set of distributions here the Poisson-stopped-sum distributions that are mixed Poisson distributions is proved to be the set of Poisson-stopped-sums of either a mixture of zero-truncated Poisson distributions or a zero-modification of it. Peer Reviewed
Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds
Martínez-Torres, David; Miranda, Eva
2018-01-01
We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.
Learning Probabilistic Logic Models from Probabilistic Examples.
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2008-10-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.
A probabilistic approach for the estimation of earthquake source parameters from spectral inversion
Supino, M.; Festa, G.; Zollo, A.
2017-12-01
The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune's (1970) source model, and direct P- and S-waves propagating in a layered velocity model, characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum depends indeed on three source parameters, the seismic moment (through the low-frequency spectral level), the corner frequency (that is a proxy of the fault length) and the high-frequency decay parameter. These parameters are strongly correlated each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimations.In this work, the uncertainties are characterized adopting a probabilistic approach for the parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function and then we explore the cost-function associated joint a-posteriori probability density function around such a minimum, to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost-function. The numerical integration of the pdf finally provides mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
The Fractional Poisson Process and the Inverse Stable Subordinator
Meerschaert, Mark; Nane, Erkan; Vellaisamy, P.
2011-01-01
The fractional Poisson process is a renewal process with Mittag-Leffler waiting times. Its distributions solve a time-fractional analogue of the Kolmogorov forward equation for a Poisson process. This paper shows that a traditional Poisson process, with the time variable replaced by an independent inverse stable subordinator, is also a fractional Poisson process. This result unifies the two main approaches in the stochastic theory of time-fractional diffusion equations. The equivalence extend...
Cumulative Poisson Distribution Program
Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert
1990-01-01
Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for X (sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
International Nuclear Information System (INIS)
Harwood, L.H.
1981-01-01
At MSU we have used the POISSON family of programs extensively for magnetic field calculations. In the presently super-saturated computer situation, reducing the run time for the program is imperative. Thus, a series of modifications have been made to POISSON to speed up convergence. Two of the modifications aim at having the first guess solution as close as possible to the final solution. The other two aim at increasing the convergence rate. In this discussion, a working knowledge of POISSON is assumed. The amount of new code and expected time saving for each modification is discussed
International Nuclear Information System (INIS)
Godbout, P.; Brais, A.
1975-01-01
The possibilities of an aircraft striking a Canadian nuclear power plant in the vicinity of an airport and of inducing structural failure modes have been evaluated. This evaluation, together with other studies, may enhance decisions in the development of general criteria for the siting of reactors near airports. The study made use, for assessment, of the probabilistic approach and made judicious applications of the finite Canadian, French, German, American and English resources that were available. The tools, techniques and methods used for achieving the above, form what may be called an integrated approach. This method of approach requires that the study be made in six consecutive steps as follows: the qualitative evaluation of having an aircraft strike on a site situated near an airport with the use of the logic model technique; the statistical data gathering on aircraft movements and accidents; evaluating the probability distribution and calculating the basic event probabilities; evaluating the probability of an aircraft strike and the application of the sensitivity approach; generating the probability density distribution versus strike impact energy, that is, the evaluation of the energy envelope; and the probabilistic evaluation of structural failure mode inducements
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson process explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We proved that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson processes. We give a step by step procedure for constructing a (compound) free Poisso...
Probabilistic vs linear blending approaches to shared control for wheelchair driving.
Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom
2017-07-01
Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
Peeters, L. J.; Mallants, D.; Turnadge, C.
2017-12-01
Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management on the other hand typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here: the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second examples goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a
International Nuclear Information System (INIS)
Adamec, P.
2000-12-01
Following a general summary of the issue, an overview of international experience (USA; Belgium, France, Germany, Russia, Spain, Sweden, The Netherlands, and the UK; and probabilistic PTS assessment for the reactor pressure vessel at Loviisa-1, Finland) is presented, and the applicable computer codes (VISA-II, OCA-P, FAVOR, ZERBERUS) are highlighted and their applicability to VVER type reactor pressure vessels is outlined. (P.A.)
International Nuclear Information System (INIS)
Patrik, M.; Babic, P.
2001-06-01
The report responds to the trend where probabilistic safety analyses are attached, on a voluntary basis (as yet), to the mandatory deterministic assessment of modifications of NPP systems or operating procedures, resulting in risk-informed type documents. It contains a nearly complete Czech translation of US NRC Regulatory Guide 1.177 and presents some suggestions for improving a) PSA study applications; b) the development of NPP documents for the regulatory body; and c) the interconnection between PSA and traditional deterministic analyses as contained in the risk-informed approach. (P.A.)
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information-linguistic, probabilistic, and possibilistic-exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, and fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
A note on probabilistic models over strings: the linear algebra approach.
Bouchard-Côté, Alexandre
2013-12-01
Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
On Poisson Nonlinear Transformations
Directory of Open Access Journals (Sweden)
Nasir Ganikhodjaev
2014-01-01
Full Text Available We construct the family of Poisson nonlinear transformations defined on the countable sample space of nonnegative integers and investigate their trajectory behavior. We have proved that these nonlinear transformations are regular.
Scaling the Poisson Distribution
Farnsworth, David L.
2014-01-01
We derive the additive property of Poisson random variables directly from the probability mass function. An important application of the additive property to quality testing of computer chips is presented.
Extended Poisson Exponential Distribution
Directory of Open Access Journals (Sweden)
Anum Fatima
2015-09-01
Full Text Available A new mixture of Modified Exponential (ME and Poisson distribution has been introduced in this paper. Taking the Maximum of Modified Exponential random variable when the sample size follows a zero truncated Poisson distribution we have derived the new distribution, named as Extended Poisson Exponential distribution. This distribution possesses increasing and decreasing failure rates. The Poisson-Exponential, Modified Exponential and Exponential distributions are special cases of this distribution. We have also investigated some mathematical properties of the distribution along with Information entropies and Order statistics of the distribution. The estimation of parameters has been obtained using the Maximum Likelihood Estimation procedure. Finally we have illustrated a real data application of our distribution.
Poisson branching point processes
International Nuclear Information System (INIS)
Matsuo, K.; Teich, M.C.; Saleh, B.E.A.
1984-01-01
We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule--Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
Poisson-Hopf limit of quantum algebras
International Nuclear Information System (INIS)
Ballesteros, A; Celeghini, E; Olmo, M A del
2009-01-01
The Poisson-Hopf analogue of an arbitrary quantum algebra U z (g) is constructed by introducing a one-parameter family of quantizations U z,ℎ (g) depending explicitly on ℎ and by taking the appropriate ℎ → 0 limit. The q-Poisson analogues of the su(2) algebra are discussed and the novel su q P (3) case is introduced. The q-Serre relations are also extended to the Poisson limit. This approach opens the perspective for possible applications of higher rank q-deformed Hopf algebras in semiclassical contexts
An integral approach to the use of probabilistic risk assessment methods
International Nuclear Information System (INIS)
Schwarzblat, M.; Arellano, J.
1987-01-01
In this chapter some of the work developed at the Instituto de Investigaciones Electricas in the area of probabilistic risk analysis are presented. In this area, work has been basically focused in the following directions: development and implementation of methods, and applications to real systems. The first part of this paper describes the area of methods development and implementation, presenting an integrated package of computer programs for fault tree analysis. In the second part some of the most important applications developed for real systems are presented. (author)
Foundation plate on the elastic half-space, deterministic and probabilistic approach
Directory of Open Access Journals (Sweden)
Tvrdá Katarína
2017-01-01
Full Text Available Interaction between the foundation plate and subgrade can be described by different mathematical - physical model. Elastic foundation can be modelled by different types of models, e.g. one-parametric model, two-parametric model and a comprehensive model - Boussinesque (elastic half-space had been used. The article deals with deterministic and probabilistic analysis of deflection of the foundation plate on the elastic half-space. Contact between the foundation plate and subsoil was modelled using contact elements node-node. At the end the obtained results are presented.
A probabilistic approach to cost and duration uncertainties in environmental decisions
International Nuclear Information System (INIS)
Boak, D.M.; Painton, L.
1996-01-01
Sandia National Laboratories has developed a method for analyzing life-cycle costs using probabilistic cost forecasting and utility theory to determine the most cost-effective alternatives for safe interim storage of radioactive materials. The method explicitly incorporates uncertainties in cost and storage duration by (1) treating uncertain component costs as random variables represented by probability distributions, (2) treating uncertain durations as chance nodes in a decision tree, and (3) using stochastic simulation tools to generate life-cycle cost forecasts for each storage alternative. The method applies utility functions to the forecasted costs to incorporate the decision maker's risk preferences, making it possible to compare alternatives on the basis of both cost and cost utility. Finally, the method is used to help identify key contributors to the uncertainty in forecasted costs to focus efforts aimed at reducing cost uncertainties. Where significant cost and duration uncertainties exist, and where programmatic decisions must be made despite these uncertainties, probabilistic forecasting techniques can yield important insights into decision alternatives, especially when used as part of a larger decision analysis framework and when properly balanced with deterministic analyses. Although the method is built around an interim storage example, it is potentially applicable to many other environmental decision problems
International Nuclear Information System (INIS)
Dahiya, Sudhir; Hegde, A.G.; Joshi, M.L.; Verma, P.C.; Kushwaha, H.S.
2006-01-01
This study illustrates use of two approaches namely probabilistic using Monte Carlo simulation (MCS) and possibilistic using fuzzy α-cut (FAC) to estimate the radiological cancer risk to the population from ingestion of organically bound tritium (OBT) and tissue free water tritium (TFWT) from fish consumption from the Rana Pratap Sagar Lake (RPSL), Kota. Using FAC technique, radiological cancer risk rate (year -1 ) at A αl.0 level were 1.15E-08 and 1.50E-09 for OBT and TFWT respectively from fish ingestion pathway. The radiological cancer risk rate (year -1 ) using MCS approach at 50th percentile (median) level is 1.14E-08 and 1.49E-09 for OBT and HTO respectively from ingestion of fresh water fish. (author)
Substation design improvement with a probabilistic reliability approach using the TOPASE program
Energy Technology Data Exchange (ETDEWEB)
Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)
1997-12-31
TOPASE, (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, was described. TOPASE serves a dual objective of assisting in the automation of HV substation studies, as well as enabling electrical systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.
A Probabilistic Approach to Control of Complex Systems and Its Application to Real-Time Pricing
Directory of Open Access Journals (Sweden)
Koichi Kobayashi
2014-01-01
Full Text Available Control of complex systems is one of the fundamental problems in control theory. In this paper, a control method for complex systems modeled by a probabilistic Boolean network (PBN is studied. A PBN is widely used as a model of complex systems such as gene regulatory networks. For a PBN, the structural control problem is newly formulated. In this problem, a discrete probability distribution appeared in a PBN is controlled by the continuous-valued input. For this problem, an approximate solution method using a matrix-based representation for a PBN is proposed. Then, the problem is approximated by a linear programming problem. Furthermore, the proposed method is applied to design of real-time pricing systems of electricity. Electricity conservation is achieved by appropriately determining the electricity price over time. The effectiveness of the proposed method is presented by a numerical example on real-time pricing systems.
Directory of Open Access Journals (Sweden)
Mosbeh R. Kaloop
2017-01-01
Full Text Available This study evaluates the performance of passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for the time and frequency domains evaluation to analyze the behavior of controlling systems. In addition, the autoregressive moving average (ARMA neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, tuned mass damper (TMD, tuned liquid damper (TLD, and tuned liquid column damper (TLCD. The results show that the TMD control system is a more reliable controller than TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and ARMA neural network model are suitable to evaluate and predict the response of coupled building-controller systems.
Fractional Poisson process (II)
International Nuclear Information System (INIS)
Wang Xiaotian; Wen Zhixiong; Zhang Shiying
2006-01-01
In this paper, we propose a stochastic process W H (t)(H-bar (12,1)) which we call fractional Poisson process. The process W H (t) is self-similar in wide sense, displays long range dependence, and has more fatter tail than Gaussian process. In addition, it converges to fractional Brownian motion in distribution
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...
International Nuclear Information System (INIS)
Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik
2005-01-01
Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a ''risk of a dose,'' a risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but that also explores the ''full range of defensible and reasonable parameter distributions''...System performance evaluations should not be unduly influenced by...''extreme physical situations and parameter values''. Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where used such values need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and in selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety
González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.
2017-12-01
Plausible worst-case tsunamigenic scenarios definition plays a relevant role in tsunami hazard assessment focused in emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables over the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extended from 17ºS to 24ºS. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), three-dimensional Slab 1.0 worldwide subduction zones geometry model and published interseismic coupling (ISC) distributions. As result, we find four higher slip deficit zones interpreted as major seismic asperities of the gap, used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes fluctuates between Mw 8.4 to Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as inundation baseline. For the probabilistic approach, we implement a Karhunen - Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range of the deterministic sources. All the scenarios are simulated through a non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five coastal major cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta) obtaining high resolution data of inundation depth, runup, coastal currents and sea level elevation. The
Test of Poisson Process for Earthquakes in and around Korea
International Nuclear Information System (INIS)
Noh, Myunghyun; Choi, Hoseon
2015-01-01
Since Cornell's work on the probabilistic seismic hazard analysis (hereafter, PSHA), majority of PSHA computer codes are assuming that the earthquake occurrence is Poissonian. To the author's knowledge, it is uncertain who first opened the issue of the Poisson process for the earthquake occurrence. The systematic PSHA in Korea, led by the nuclear industry, were carried out for more than 25 year with the assumption of the Poisson process. However, the assumption of the Poisson process has never been tested. Therefore, the test is of significance. We tested whether the Korean earthquakes follow the Poisson process or not. The Chi-square test with the significance level of 5% was applied. The test turned out that the Poisson process could not be rejected for the earthquakes of magnitude 2.9 or larger. However, it was still observed in the graphical comparison that some portion of the observed distribution significantly deviated from the Poisson distribution. We think this is due to the small earthquake data. The earthquakes of magnitude 2.9 or larger occurred only 376 times during 34 years. Therefore, the judgment on the Poisson process derived in the present study is not conclusive
Probabilistic approach to diffusion in shear flows of generalized viscoelastic second-grade fluids
International Nuclear Information System (INIS)
Wafo Soh, C
2010-01-01
We study diffusion in point-source-driven shear flows of generalized second-grade fluids. We start by obtaining exact solutions of shear flows triggered by point sources under various boundary conditions. For unrestricted flows, we demonstrate that the velocity distribution is the probability density function of a coupled or uncoupled continuous-time random walk. In the first instance, the motion is described by a compound Poisson process with an explicit probability density function corresponding to the velocity distribution. The average waiting time in this situation is finite and is identified with the structural relaxation time. In the second case, we obtain an explicit formula for the probability density function in terms of special functions. In both cases, the probability density functions of the associated stochastic processes are leptokurtic at all finite times with variances linear in time. By using the method of images, we infer velocity fields for restricted flows from those of unrestricted flows. Equipped with some exact expressions of the velocity field, we analyze advection–diffusion via the Feynman–Kac formula, which lends itself naturally to Monte Carlo simulation
Gromek, Katherine Emily
A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi agents systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. Contribution of the current work lies in the further development of the agent anatomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as agent's ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements makes the agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
Formal equivalence of Poisson structures around Poisson submanifolds
Marcut, I.T.
2012-01-01
Let (M,π) be a Poisson manifold. A Poisson submanifold P ⊂ M gives rise to a Lie algebroid AP → P. Formal deformations of π around P are controlled by certain cohomology groups associated to AP. Assuming that these groups vanish, we prove that π is formally rigid around P; that is, any other Poisson
International Nuclear Information System (INIS)
Schmidt, T.
1988-01-01
The numerical reliability calculation of cracked construction components under cyclical fatigue stress can be done with the help of models of probabilistic fracture mechanics. An alternative to the Monte Carlo simulation method is examined; the alternative method is based on the description of failure processes with the help of a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, the effects of inspections and repairs also being considered. The probability of failure and expected failure frequency can be determined as time functions with the transition and conditional probabilities of the original or derived Markov process. For concrete calculation, an approximative Markov chain is designed which, under certain conditions, is capable of giving a sufficient approximation of the original Markov process and the reliability characteristics determined by it. The application of the MARKOV program code developed into an algorithm reveals sufficient conformity with the Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (DWR)'), specifically, the reliability of the main coolant line. (orig./HP) [de
Crovelli, R.A.
1988-01-01
The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.
Stable oscillations of a predator-prey probabilistic cellular automaton: a mean-field approach
International Nuclear Information System (INIS)
Tome, Tania; Carvalho, Kelly C de
2007-01-01
We analyze a probabilistic cellular automaton describing the dynamics of coexistence of a predator-prey system. The individuals of each species are localized over the sites of a lattice and the local stochastic updating rules are inspired by the processes of the Lotka-Volterra model. Two levels of mean-field approximations are set up. The simple approximation is equivalent to an extended patch model, a simple metapopulation model with patches colonized by prey, patches colonized by predators and empty patches. This approximation is capable of describing the limited available space for species occupancy. The pair approximation is moreover able to describe two types of coexistence of prey and predators: one where population densities are constant in time and another displaying self-sustained time oscillations of the population densities. The oscillations are associated with limit cycles and arise through a Hopf bifurcation. They are stable against changes in the initial conditions and, in this sense, they differ from the Lotka-Volterra cycles which depend on initial conditions. In this respect, the present model is biologically more realistic than the Lotka-Volterra model
A probabilistic approach to safety/reliability of space nuclear power systems
International Nuclear Information System (INIS)
Medford, G.; Williams, K.; Kolaczkowski, A.
1989-01-01
An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework
Energy Technology Data Exchange (ETDEWEB)
Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)
2017-03-15
The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic-probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing assumptions used in the conventional PSA model so-called the static-PSA model, which are relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.
International Nuclear Information System (INIS)
Lee, Han Sul; Heo, Gyun Young; Kim, Tae Wan
2017-01-01
The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic-probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing assumptions used in the conventional PSA model so-called the static-PSA model, which are relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective
Directory of Open Access Journals (Sweden)
Salvatore Martino
2018-02-01
Full Text Available The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake-Induced slope FAiLures approach was applied in the basin of Alcoy (Alicante, South Spain, to provide a comprehensive scenario of earthquake-induced landslides. The basin of Alcoy is well known for several historical landslides, mainly represented by earth-slides, that involve urban settlement as well as infrastructures (i.e., roads, bridges. The PARSIFAL overcomes several limits existing in other approaches, allowing the concomitant analyses of: (i first-time landslides (due to both rock-slope failures and shallow earth-slides and reactivations of existing landslides; (ii slope stability analyses of different failure mechanisms; (iii comprehensive mapping of earthquake-induced landslide scenarios in terms of exceedance probability of critical threshold values of co-seismic displacements. Geotechnical data were used to constrain the slope stability analysis, while specific field surveys were carried out to measure jointing and strength conditions of rock masses and to inventory already existing landslides. GIS-based susceptibility analyses were performed to assess the proneness to shallow earth-slides as well as to verify kinematic compatibility to planar or wedge rock-slides and to topples. The experienced application of PARSIFAL to the Alcoy basin: (i confirms the suitability of the approach at a municipality scale, (ii outputs the main role of saturation in conditioning slope instabilities in this case study, (iii demonstrates the reliability of the obtained results respect to the historical data.
DEFF Research Database (Denmark)
Jensen, Finn Verner; Lauritzen, Steffen Lilholt
2001-01-01
This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.......This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs....
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these
Poisson brackets of orthogonal polynomials
Cantero, María José; Simon, Barry
2009-01-01
For the standard symplectic forms on Jacobi and CMV matrices, we compute Poisson brackets of OPRL and OPUC, and relate these to other basic Poisson brackets and to Jacobians of basic changes of variable.
Directory of Open Access Journals (Sweden)
Flavio De Martino
2013-10-01
Full Text Available Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass—stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of immediately emptying after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performances. The continuous simulation and semi-probabilistic approach for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach even if this last is certainly very site sensitive.
Branes in Poisson sigma models
International Nuclear Information System (INIS)
Falceto, Fernando
2010-01-01
In this review we discuss possible boundary conditions (branes) for the Poisson sigma model. We show how to carry out the perturbative quantization in the presence of a general pre-Poisson brane and how this is related to the deformation quantization of Poisson structures. We conclude with an open problem: the perturbative quantization of the system when the boundary has several connected components and we use a different pre-Poisson brane in every component.
A test of inflated zeros for Poisson regression models.
He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan
2017-01-01
Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, rendering serious doubts on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that when compared with the Vuong test our approach not only better at controlling type I error rate, but also yield more power.
Normal forms in Poisson geometry
Marcut, I.T.
2013-01-01
The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric
Game meat consumption by hunters and their relatives: A probabilistic approach.
Sevillano Morales, Jesus; Moreno-Ortega, Alicia; Amaro Lopez, Manual Angel; Arenas Casas, Antonio; Cámara-Martos, Fernando; Moreno-Rojas, Rafael
2018-06-18
This study aimed to estimate the consumption of meat and products derived from hunting by the consumer population and, specifically, by hunters and their relatives. For this purpose, a survey was conducted on the frequency of consuming meat from the four most representative game species in Spain, two of big game, wild boar (Sus scrofa) and red deer (Cervus elaphus) and two of small game, rabbit (Oryctolagus cunulucus) and red partridge (Alectoris rufa), as well as of processed meat products (salami-type sausage) made from those big game species. The survey was carried out on 337 habitual consumers of these types of products (hunters and their relatives). The total mean game meat consumption, per capita in this population group, is 6.87 kg/person/year of meat and 8.57 kg/person/year if the processed meat products are also considered. Consumption of rabbit, red partridge, red deer and wild boar, individually, was 1.85, 0.82, 2.28 and 1.92 kg/person/year, respectively. It was observed that hunters generally registered a larger intake of game meat, this being statistically significant in the case of rabbit meat consumption. Using probabilistic methods, the meat consumption frequency distributions for each hunting species studied were estimated, as well as the products made from big game species and the total consumption both of meat by itself and that including the products made from it. The consumption frequency distributions were adjusted to exponential ones, verified by the test suitable for it according to Akaike Information Criterion, Bayesian Information Criterion, the Chi-Squared and Kolmogorov-Smirnov statistics. In addition, the consumption percentiles of the different distributions were obtained. The latter could be a good tool when making nutrition or contaminant studies since they permit the assessment of exposure to the compound in question.
Estimation of Poisson noise in spatial domain
Švihlík, Jan; Fliegel, Karel; Vítek, Stanislav; Kukal, Jaromír.; Krbcová, Zuzana
2017-09-01
This paper deals with modeling of astronomical images in the spatial domain. We consider astronomical light images contaminated by the dark current which is modeled by Poisson random process. Dark frame image maps the thermally generated charge of the CCD sensor. In this paper, we solve the problem of an addition of two Poisson random variables. At first, the noise analysis of images obtained from the astronomical camera is performed. It allows estimating parameters of the Poisson probability mass functions in every pixel of the acquired dark frame. Then the resulting distributions of the light image can be found. If the distributions of the light image pixels are identified, then the denoising algorithm can be applied. The performance of the Bayesian approach in the spatial domain is compared with the direct approach based on the method of moments and the dark frame subtraction.
International Nuclear Information System (INIS)
Zhao Yongxiang; Wang Jinnuo; Gao Qing
2001-01-01
A unified approach, referred to as general maximum likelihood method, is presented for estimating probabilistic design S-N curves and their confidence bounds of the three commonly used fatigue stress-life models, namely three parameter, Langer and Basquin. The curves are described by a general form of mean and standard deviation S-N curves of the logarithm of fatigue life. Different from existent methods, i.e., the conventional method and the classical maximum likelihood method,present approach considers the statistical characteristics of whole test data. The parameters of the mean curve is firstly estimated by least square method and then, the parameters of the standard deviation curve is evaluated by mathematical programming method to be agreement with the maximum likelihood principle. Fit effects of the curves are assessed by fitted relation coefficient, total fitted standard error and the confidence bounds. Application to the virtual stress amplitude-crack initiation life data of a nuclear engineering material, Chinese 1Cr18Ni9Ti stainless steel pipe-weld metal, has indicated the validity of the approach to the S-N data where both S and N show the character of random variable. Practices to the two states of S-N data of Chinese 45 carbon steel notched specimens (k t = 2.0) have indicated the validity of present approach to the test results obtained respectively from group fatigue test and from maximum likelihood fatigue test. At the practices, it was revealed that in general the fit is best for the three-parameter model,slightly inferior for the Langer relation and poor for the Basquin equation. Relative to the existent methods, present approach has better fit. In addition, the possible non-conservative predictions of the existent methods, which are resulted from the influence of local statistical characteristics of the data, are also overcome by present approach
Some probabilistic properties of fractional point processes
Garra, Roberto; Orsingher, Enzo; Scavino, Marco
2017-01-01
P{T-k(alpha) < infinity} are explicitly obtained and analyzed. The processes N-f (t) are time-changed Poisson processes N( H-f (t)) with subordinators H-f (t) and here we study N(Sigma H-n(j= 1)f j (t)) and obtain probabilistic features
Budreski, Katherine; Winchell, Michael; Padilla, Lauren; Bang, JiSu; Brain, Richard A
2016-04-01
A crop footprint refers to the estimated spatial extent of growing areas for a specific crop, and is commonly used to represent the potential "use site" footprint for a pesticide labeled for use on that crop. A methodology for developing probabilistic crop footprints to estimate the likelihood of pesticide use and the potential co-occurrence of pesticide use and listed species locations was tested at the national scale and compared to alternative methods. The probabilistic aspect of the approach accounts for annual crop rotations and the uncertainty in remotely sensed crop and land cover data sets. The crop footprints used historically are derived exclusively from the National Land Cover Database (NLCD) Cultivated Crops and/or Pasture/Hay classes. This approach broadly aggregates agriculture into 2 classes, which grossly overestimates the spatial extent of individual crops that are labeled for pesticide use. The approach also does not use all the available crop data, represents a single point in time, and does not account for the uncertainty in land cover data set classifications. The probabilistic crop footprint approach described herein incorporates best available information at the time of analysis from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) for 5 y (2008-2012 at the time of analysis), the 2006 NLCD, the 2007 NASS Census of Agriculture, and 5 y of NASS Quick Stats (2008-2012). The approach accounts for misclassification of crop classes in the CDL by incorporating accuracy assessment information by state, year, and crop. The NLCD provides additional information to improve the CDL crop probability through an adjustment based on the NLCD accuracy assessment data using the principles of Bayes' Theorem. Finally, crop probabilities are scaled at the state level by comparing against NASS surveys (Census of Agriculture and Quick Stats) of reported planted acres by crop. In an example application of the new method, the probabilistic
Energy Technology Data Exchange (ETDEWEB)
Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India); Gupta, Shikha; Rai, Premanjali [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India)
2013-10-15
Robust global models capable of discriminating positive and non-positive carcinogens; and predicting carcinogenic potency of chemicals in rodents were developed. The dataset of 834 structurally diverse chemicals extracted from Carcinogenic Potency Database (CPDB) was used which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using the internal and external procedures employing a wide series of statistical checks. PNN constructed using five descriptors rendered classification accuracy of 92.09% in complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with mean squared error (MSE) of 0.44 in complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficient and MSE of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest for wide applicability of the inter-species models in predicting carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by optimal PNN model. Figure (b) shows generalization and predictive
Setiawan, R.
2018-05-01
In this paper, Economic Order Quantity (EOQ) of the vendor-buyer supply-chain model under a probabilistic condition with imperfect quality items has been analysed. The analysis is delivered using two concepts in game theory approach, which is Stackelberg equilibrium and Pareto Optimal, under non-cooperative and cooperative games, respectively. Another result is getting acomparison of theoptimal result between integrated scheme and game theory approach based on analytical and numerical result using appropriate simulation data.
Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.
2012-12-01
There is deficiency in the determination of emission reduction potential in the future, especially with consideration of uncertainty. Mitigation measures for some economic sectors have been proposed, but few studies provide an evaluation of the amount of PM emission reduction that can be obtained in future years by different emission reduction strategies. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which result in the inability to relate technological or regulatory intervention to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. In this work, we introduce a probabilistic approach to explore the impacts of retrofit and scrappage on global PM emissions from on-road vehicles in the coming decades. This approach includes scenario analysis, sensitivity analysis and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and make policy evaluations. Three basic questions will be answered in this work: (1) what contribution can these two programs make to improve global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters in describing the dynamic vehicle fleet?
Ifremer
1992-01-01
Vous trouverez dans ce document les 24 poissons les plus courants de Guyane (sur un nombre d'espèces approchant les 200) avec leurs principales caractéristiques, leurs noms scientifiques, français, anglais et espagnol et leurs photographies. Ils sont classés, de l'acoupa au vivaneau ti yeux, par ordre alphabétique. Si vous ne trouvez pas de chiffres sur la production de telle ou telle espèce, c'est parce qu'ils n'existent pas, mais aussi et surtout parce qu'ils ne signifieraient rien, l...
Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds
When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...
International Nuclear Information System (INIS)
Holmberg, J.
1997-04-01
The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant
Energy Technology Data Exchange (ETDEWEB)
Holmberg, J [VTT Automation, Espoo (Finland)
1997-04-01
The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis includes also five previous publications by author.
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte
2007-01-01
In order to assess the performance of distribution system under normal operating conditions with large integration of renewable energy based dispersed generation (DG) units, probabilistic modeling of the distribution system is necessary in order to take into consideration the stochastic behavior...... of load demands and DG units such as wind generation and combined heat and power plant generation. This paper classifies probabilistic models of load demands and DG units into summer and winter period, weekday and weekend as well as in 24 hours a day. The voltage results from the probabilistic load flow...
Selective Contrast Adjustment by Poisson Equation
Directory of Open Access Journals (Sweden)
Ana-Belen Petro
2013-09-01
Full Text Available Poisson Image Editing is a new technique permitting to modify the gradient vector field of an image, and then to recover an image with a gradient approaching this modified gradient field. This amounts to solve a Poisson equation, an operation which can be efficiently performed by Fast Fourier Transform (FFT. This paper describes an algorithm applying this technique, with two different variants. The first variant enhances the contrast by increasing the gradient in the dark regions of the image. This method is well adapted to images with back light or strong shadows, and reveals details in the shadows. The second variant of the same Poisson technique enhances all small gradients in the image, thus also sometimes revealing details and texture.
A probabilistic multi objective CLSC model with Genetic algorithm-ε_Constraint approach
Directory of Open Access Journals (Sweden)
Alireza TaheriMoghadam
2014-05-01
Full Text Available In this paper an uncertain multi objective closed-loop supply chain is developed. The first objective function is maximizing the total profit. The second objective function is minimizing the use of row materials. In the other word, the second objective function is maximizing the amount of remanufacturing and recycling. Genetic algorithm is used for optimization and for finding the pareto optimal line, Epsilon-constraint method is used. Finally a numerical example is solved with proposed approach and performance of the model is evaluated in different sizes. The results show that this approach is effective and useful for managerial decisions.
A probabilistic approach to identify putative drug targets in biochemical networks.
Murabito, E.; Smalbone, K.; Swinton, J.; Westerhoff, H.V.; Steuer, R.
2011-01-01
Network-based drug design holds great promise in clinical research as a way to overcome the limitations of traditional approaches in the development of drugs with high efficacy and low toxicity. This novel strategy aims to study how a biochemical network as a whole, rather than its individual
1989-10-31
fo tmaa OmfuogeM ara Mmi. fal in fM?05V~ ~ ~ ~ ~ ~ A D A 2 4 0409"~ n ugt Psoo,@’ oducbof Proton (07044 136M. WagaWapN. DC 20141 T1 3. REPORT TYPE...Al (circumscription, non- monotonic reasoning, and default reasoning), our approach is based on fuzzy logic and, more specifically, on the theory of
Microscopic and probabilistic approach to thermal steady state based on a dice and coin toy model
International Nuclear Information System (INIS)
Onorato, Pasquale; Moggio, Lorenzo; Oss, Stefano; Malgieri, Massimiliano
2017-01-01
In this article we present an educational approach to thermal equilibrium which was tested on a group of 13 undergraduate students at the University of Trento. The approach is based on a stochastic toy model, in which bodies in thermal contact are represented by rows of squares on a cardboard table, which exchange coins placed on the squares based on the roll of two dice. The discussion of several physical principles, such as the exponential approach to equilibrium, the determination of the equilibrium temperature, and the interpretation of the equilibrium state as the most probable macrostate, proceeds through a continual comparison between the outcomes obtained with the toy model and the results of a real experiment on the thermal contact of two masses of water at different temperatures. At the end of the sequence, a re-analysis of the experimental results in view of both the Boltzmann and Clausius definitions of entropy reveals some limits of the toy model, but also allows for a critical discussion of the concepts of temperature and entropy. In order to provide the reader with a feeling of how the sequence was received by students, and how it helped them understand the topics introduced, we discuss some excerpts from their answers to a conceptual item given at the end of the sequence. (paper)
International Nuclear Information System (INIS)
Meslin, T.; Carnino, A.
1986-01-01
This example shows the thoroughness of EDF's approach in processing the difficult problems of the loss of electrical power supplies. Efforts are continuing in several directions: continued revision and improvement of operating procedures in the event of loss of electrical power supplies, PWR plant operator training courses devoted to the problems of power supply losses, and continued testing on simulators, and particularly testing under real conditions, including tests lasting several hours made possible by the performance of the new EDF simulators (two-phase code and taking all power losses into account)
Nonhomogeneous fractional Poisson processes
Energy Technology Data Exchange (ETDEWEB)
Wang Xiaotian [School of Management, Tianjin University, Tianjin 300072 (China)]. E-mail: swa001@126.com; Zhang Shiying [School of Management, Tianjin University, Tianjin 300072 (China); Fan Shen [Computer and Information School, Zhejiang Wanli University, Ningbo 315100 (China)
2007-01-15
In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W{sub H}{sup (j)}(t), which permit the study of the effects of long-range dependance in a large number of fields including quantum physics and finance. The processes W{sub H}{sup (j)}(t) are self-similar in a wide sense, exhibit more fatter tail than Gaussian processes, and converge to the Gaussian processes in distribution in some cases. In addition, we also show that the intensity function {lambda}(t) strongly influences the existence of the highest finite moment of W{sub H}{sup (j)}(t) and the behaviour of the tail probability of W{sub H}{sup (j)}(t)
Nonhomogeneous fractional Poisson processes
International Nuclear Information System (INIS)
Wang Xiaotian; Zhang Shiying; Fan Shen
2007-01-01
In this paper, we propose a class of non-Gaussian stationary increment processes, named nonhomogeneous fractional Poisson processes W H (j) (t), which permit the study of the effects of long-range dependance in a large number of fields including quantum physics and finance. The processes W H (j) (t) are self-similar in a wide sense, exhibit more fatter tail than Gaussian processes, and converge to the Gaussian processes in distribution in some cases. In addition, we also show that the intensity function λ(t) strongly influences the existence of the highest finite moment of W H (j) (t) and the behaviour of the tail probability of W H (j) (t)
Smolina, Irina Yu.
2015-10-01
Mechanical properties of a cable are of great importance in design and strength calculation of flexible cables. The problem of determination of elastic properties and rigidity characteristics of a cable modeled by anisotropic helical elastic rod is considered. These characteristics are calculated indirectly by means of the parameters received from statistical processing of experimental data. These parameters are considered as random quantities. With taking into account probable nature of these parameters the formulas for estimation of the macroscopic elastic moduli of a cable are obtained. The calculating expressions for macroscopic flexural rigidity, shear rigidity and torsion rigidity using the macroscopic elastic characteristics obtained before are presented. Statistical estimations of the rigidity characteristics of some cable grades are adduced. A comparison with those characteristics received on the basis of deterministic approach is given.
International Nuclear Information System (INIS)
Fritsch, Daniel; Yu Liyun; Johnson, Valen; McAuliffe, Matthew; Pizer, Stephen; Chaney, Edward
1996-01-01
Purpose/Objective : Current clinical methods for defining normal anatomical structures on tomographic images are time consuming and subject to intra- and inter-user variability. With the widespread implementation of 3D RTP, conformal radiotherapy, and dose escalation the implications of imprecise object definition have assumed a much higher level of importance. Object definition and volume-weighted metrics for normal anatomy, such as DVHs and NTCPs, play critical roles in aiming, shaping, and weighting beams. Improvements in object definition, including computer automation, are essential to yield reliable volume-weighted metrics and gains in human efficiency. The purpose of this study was to investigate a probabilistic approach using deformable models to automatically recognize and extract normal anatomy in tomographic images. Materials and Methods: Object models were created from normal organs that were segmented by an interactive method which involved placing a cursor near the center of the object on a slice and clicking a mouse button to initiate computation of structures called cores. Cores describe the skeletal and boundary shape of image objects in a manner that, in 2D, associates a location on the skeleton with the width of the object at that location. A significant advantage of cores is stability against image disturbances such as noise and blur. The model was composed of a relatively small set of extracted points on the skeleton and boundary. The points were carefully chosen to summarize the shape information captured by the cores. Neighborhood relationships between points were represented mathematically by energy functions that penalize, due to warping of the model, the ''goodness'' of match between the model and the image data at any stage during the segmentation process. The model was matched against the image data using a probabilistic approach based on Bayes theorem, which provides a means for computing a posteriori (posterior) probability from 1) a
Probabilistic conditional independence structures
Studeny, Milan
2005-01-01
Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
Saad, Bilal Mohammed
2017-09-18
This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.
Börner, Jan; Marinho, Eduardo; Wunder, Sven
2015-01-01
Annual forest loss in the Brazilian Amazon had in 2012 declined to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix. PMID:25650966
Börner, Jan; Marinho, Eduardo; Wunder, Sven
2015-01-01
Annual forest loss in the Brazilian Amazon had in 2012 declined to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix.
Saad, Bilal Mohammed; Alexanderian, Alen; Prudhomme, Serge; Knio, Omar
2017-01-01
This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.
ChromaSig: a probabilistic approach to finding common chromatin signatures in the human genome.
Directory of Open Access Journals (Sweden)
Gary Hon
2008-10-01
Full Text Available Computational methods to identify functional genomic elements using genetic information have been very successful in determining gene structure and in identifying a handful of cis-regulatory elements. But the vast majority of regulatory elements have yet to be discovered, and it has become increasingly apparent that their discovery will not come from using genetic information alone. Recently, high-throughput technologies have enabled the creation of information-rich epigenetic maps, most notably for histone modifications. However, tools that search for functional elements using this epigenetic information have been lacking. Here, we describe an unsupervised learning method called ChromaSig to find, in an unbiased fashion, commonly occurring chromatin signatures in both tiling microarray and sequencing data. Applying this algorithm to nine chromatin marks across a 1% sampling of the human genome in HeLa cells, we recover eight clusters of distinct chromatin signatures, five of which correspond to known patterns associated with transcriptional promoters and enhancers. Interestingly, we observe that the distinct chromatin signatures found at enhancers mark distinct functional classes of enhancers in terms of transcription factor and coactivator binding. In addition, we identify three clusters of novel chromatin signatures that contain evolutionarily conserved sequences and potential cis-regulatory elements. Applying ChromaSig to a panel of 21 chromatin marks mapped genomewide by ChIP-Seq reveals 16 classes of genomic elements marked by distinct chromatin signatures. Interestingly, four classes containing enrichment for repressive histone modifications appear to be locally heterochromatic sites and are enriched in quickly evolving regions of the genome. The utility of this approach in uncovering novel, functionally significant genomic elements will aid future efforts of genome annotation via chromatin modifications.
Poisson Spot with Magnetic Levitation
Hoover, Matthew; Everhart, Michael; D'Arruda, Jose
2010-01-01
In this paper we describe a unique method for obtaining the famous Poisson spot without adding obstacles to the light path, which could interfere with the effect. A Poisson spot is the interference effect from parallel rays of light diffracting around a solid spherical object, creating a bright spot in the center of the shadow.
Probabilistic approach to rock fall hazard assessment: potential of historical data analysis
Directory of Open Access Journals (Sweden)
C. Dussauge-Peisser
2002-01-01
Full Text Available We study the rock fall volume distribution for three rock fall inventories and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions, and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates on the studied areas. It is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first one concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set is gathered during the 1912–1992 time window in Yosemite Valley, USA, in granite cliffs. The third one covers the 1954–1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3 , or 20 m3 for the Arly gorges. We obtain similar values of the b exponent close to 0.45 for the 3 data sets. In agreement with previous studies, this suggests, that the b value is not dependant on the geological settings. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for the rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can
Probabilistic approach to rock fall hazard assessment: potential of historical data analysis
Dussauge-Peisser, C.; Helmstetter, A.; Grasso, J.-R.; Hantz, D.; Desvarreux, P.; Jeannin, M.; Giraud, A.
We study the rock fall volume distribution for three rock fall inventories and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions, and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates on the studied areas. It is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first one concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set is gathered during the 1912-1992 time window in Yosemite Valley, USA, in granite cliffs. The third one covers the 1954-1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3 , or 20 m3 for the Arly gorges. We obtain similar values of the b exponent close to 0.45 for the 3 data sets. In agreement with previous studies, this suggests, that the b value is not dependant on the geological settings. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for the rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can extrapolate these
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the precisions of the geometric and material properties tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material properties perturbations, different loadings and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code which uses the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.
Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.
2009-04-01
Probabilistic approaches based on available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius, are assembled in a comprehensive assessment of volcanic hazards at the Neapolitan area. This allows to compare the volcanic hazards related to the different types of events, which can be used for evaluating the conditional probability of flows and falls hazard in case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations, produced using field and laboratory data as input parameters relative to a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazard related to pyroclastic fallout and density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by the topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills produce an effective barrier to propagation to the very central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, whereas even VEI>3 eruptions have a moderate fallout hazard there.
Poisson hierarchy of discrete strings
International Nuclear Information System (INIS)
Ioannidou, Theodora; Niemi, Antti J.
2016-01-01
The Poisson geometry of a discrete string in three dimensional Euclidean space is investigated. For this the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra that determines reparametrisation diffeomorphism along a continuous string, is identified as a particular sub-algebra, in the hierarchy of the discrete string Poisson algebra. - Highlights: • Witt (classical Virasoro) algebra is derived in the case of discrete string. • Infinite dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • Spinor representation of discrete Frenet equations is developed.
Poisson hierarchy of discrete strings
Energy Technology Data Exchange (ETDEWEB)
Ioannidou, Theodora, E-mail: ti3@auth.gr [Faculty of Civil Engineering, School of Engineering, Aristotle University of Thessaloniki, 54249, Thessaloniki (Greece); Niemi, Antti J., E-mail: Antti.Niemi@physics.uu.se [Department of Physics and Astronomy, Uppsala University, P.O. Box 803, S-75108, Uppsala (Sweden); Laboratoire de Mathematiques et Physique Theorique CNRS UMR 6083, Fédération Denis Poisson, Université de Tours, Parc de Grandmont, F37200, Tours (France); Department of Physics, Beijing Institute of Technology, Haidian District, Beijing 100081 (China)
2016-01-28
The Poisson geometry of a discrete string in three dimensional Euclidean space is investigated. For this the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra that determines reparametrisation diffeomorphism along a continuous string, is identified as a particular sub-algebra, in the hierarchy of the discrete string Poisson algebra. - Highlights: • Witt (classical Virasoro) algebra is derived in the case of discrete string. • Infinite dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • Spinor representation of discrete Frenet equations is developed.
International Nuclear Information System (INIS)
Purba, Julwan Hendry
2014-01-01
Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative for conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, environment changing or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose failure precise probability distributions of their lifetime to failures are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experiences of the David–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach arises as a suitable alternative for the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of the conventional fault tree analysis for nuclear power plant probabilistic safety assessment
Bod, R.; Heine, B.; Narrog, H.
2010-01-01
Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Burcharth, H. F.
This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...
Probabilistic Structural Analysis Theory Development
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle mian engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Network Traffic Monitoring Using Poisson Dynamic Linear Models
Energy Technology Data Exchange (ETDEWEB)
Merl, D. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2011-05-09
In this article, we discuss an approach for network forensics using a class of nonstationary Poisson processes with embedded dynamic linear models. As a modeling strategy, the Poisson DLM (PoDLM) provides a very flexible framework for specifying structured effects that may influence the evolution of the underlying Poisson rate parameter, including diurnal and weekly usage patterns. We develop a novel particle learning algorithm for online smoothing and prediction for the PoDLM, and demonstrate the suitability of the approach to real-time deployment settings via a new application to computer network traffic monitoring.
Setiawan, R.
2018-03-01
In this paper, Economic Order Quantity (EOQ) of probabilistic two-level supply – chain system for items with imperfect quality has been analyzed under service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of optimal result is delivered using two equilibrium scheme concept in game theory approach. Stackelberg’s equilibrium for cooperative strategy and Stackelberg’s Equilibrium for noncooperative strategy. This is a new approach to game theory result in inventory system whether service level constraint is applied by a firm in his moves.
International Nuclear Information System (INIS)
Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.
1995-01-01
This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on and quantify uncertainties in deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992, which contained poison by inhalation chemicals that represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probabilistic distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions
Polynomial Poisson algebras: Gel'fand-Kirillov problem and Poisson spectra
Lecoutre, César
2014-01-01
We study the fields of fractions and the Poisson spectra of polynomial Poisson algebras.\\ud \\ud First we investigate a Poisson birational equivalence problem for polynomial Poisson algebras over a field of arbitrary characteristic. Namely, the quadratic Poisson Gel'fand-Kirillov problem asks whether the field of fractions of a Poisson algebra is isomorphic to the field of fractions of a Poisson affine space, i.e. a polynomial algebra such that the Poisson bracket of two generators is equal to...
Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.
Hougaard, P; Lee, M L; Whitmore, G A
1997-12-01
Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
Non-equal-time Poisson brackets
Nikolic, H.
1998-01-01
The standard definition of the Poisson brackets is generalized to the non-equal-time Poisson brackets. Their relationship to the equal-time Poisson brackets, as well as to the equal- and non-equal-time commutators, is discussed.
Background stratified Poisson regression analysis of cohort data.
Richardson, David B; Langholz, Bryan
2012-03-01
Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
Background stratified Poisson regression analysis of cohort data
International Nuclear Information System (INIS)
Richardson, David B.; Langholz, Bryan
2012-01-01
Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models. (orig.)
Newton/Poisson-Distribution Program
Bowerman, Paul N.; Scheuer, Ernest M.
1990-01-01
NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines Poisson parameter for given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for X(sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
POISSON SUPERFISH, Poisson Equation Solver for Radio Frequency Cavity
International Nuclear Information System (INIS)
Colman, J.
2001-01-01
1 - Description of program or function: POISSON, SUPERFISH is a group of (1) codes that solve Poisson's equation and are used to compute field quality for both magnets and fixed electric potentials and (2) RF cavity codes that calculate resonant frequencies and field distributions of the fundamental and higher modes. The group includes: POISSON, PANDIRA, SUPERFISH, AUTOMESH, LATTICE, FORCE, MIRT, PAN-T, TEKPLOT, SF01, and SHY. POISSON solves Poisson's (or Laplace's) equation for the vector (scalar) potential with nonlinear isotropic iron (dielectric) and electric current (charge) distributions for two-dimensional Cartesian or three-dimensional cylindrical symmetry. It calculates the derivatives of the potential, the stored energy, and performs harmonic (multipole) analysis of the potential. PANDIRA is similar to POISSON except it allows anisotropic and permanent magnet materials and uses a different numerical method to obtain the potential. SUPERFISH solves for the accelerating (TM) and deflecting (TE) resonant frequencies and field distributions in an RF cavity with two-dimensional Cartesian or three-dimensional cylindrical symmetry. Only the azimuthally symmetric modes are found for cylindrically symmetric cavities. AUTOMESH prepares input for LATTICE from geometrical data describing the problem, (i.e., it constructs the 'logical' mesh and generates (x,y) coordinate data for straight lines, arcs of circles, and segments of hyperbolas). LATTICE generates an irregular triangular (physical) mesh from the input data, calculates the 'point current' terms at each mesh point in regions with distributed current density, and sets up the mesh point relaxation order needed to write the binary problem file for the equation-solving POISSON, PANDIRA, or SUPERFISH. FORCE calculates forces and torques on coils and iron regions from POISSON or PANDIRA solutions for the potential. MIRT optimizes magnet profiles, coil shapes, and current densities from POISSON output based on a
Memristive Probabilistic Computing
Alahmadi, Hamzah
2017-10-01
In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize the energy efficiency while compromising on the accuracy. In this work, we build probabilistic adders based on stochastic memristor. Probabilistic adders are analyzed with respect of the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for a high area saving and design exibility between the performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% of power saving. An image-compression application is investigated using the memristive probabilistic adders with the performance and the energy trade-off.
Coordination of Conditional Poisson Samples
Directory of Open Access Journals (Sweden)
Grafström Anton
2015-12-01
Full Text Available Sample coordination seeks to maximize or to minimize the overlap of two or more samples. The former is known as positive coordination, and the latter as negative coordination. Positive coordination is mainly used for estimation purposes and to reduce data collection costs. Negative coordination is mainly performed to diminish the response burden of the sampled units. Poisson sampling design with permanent random numbers provides an optimum coordination degree of two or more samples. The size of a Poisson sample is, however, random. Conditional Poisson (CP sampling is a modification of the classical Poisson sampling that produces a fixed-size πps sample. We introduce two methods to coordinate Conditional Poisson samples over time or simultaneously. The first one uses permanent random numbers and the list-sequential implementation of CP sampling. The second method uses a CP sample in the first selection and provides an approximate one in the second selection because the prescribed inclusion probabilities are not respected exactly. The methods are evaluated using the size of the expected sample overlap, and are compared with their competitors using Monte Carlo simulation. The new methods provide a good coordination degree of two samples, close to the performance of Poisson sampling with permanent random numbers.
A probabilistic approach for the computation of non-linear vibrations of tubes under cross-flow
International Nuclear Information System (INIS)
Payen, Th.; Langre, E. de.
1996-01-01
For the predictive analysis of flow-induced vibration and wear of tube bundles, a probabilistic method is proposed taking into account the uncertainties of the physical parameters. Monte-Carlo simulations are performed to estimate the density probability function of wear work rate and a sensitivity analysis is done on physical parameters influencing wear on the case of loosely supported tube under cross-flow. (authors). 8 refs., 8 figs
Topological Poisson Sigma models on Poisson-Lie groups
International Nuclear Information System (INIS)
Calvo, Ivan; Falceto, Fernando; Garcia-Alvarez, David
2003-01-01
We solve the topological Poisson Sigma model for a Poisson-Lie group G and its dual G*. We show that the gauge symmetry for each model is given by its dual group that acts by dressing transformations on the target. The resolution of both models in the open geometry reveals that there exists a map from the reduced phase of each model (P and P*) to the main symplectic leaf of the Heisenberg double (D 0 ) such that the symplectic forms on P, P* are obtained as the pull-back by those maps of the symplectic structure on D 0 . This uncovers a duality between P and P* under the exchange of bulk degrees of freedom of one model with boundary degrees of freedom of the other one. We finally solve the Poisson Sigma model for the Poisson structure on G given by a pair of r-matrices that generalizes the Poisson-Lie case. The Hamiltonian analysis of the theory requires the introduction of a deformation of the Heisenberg double. (author)
Probabilistic representation of fermionic lattice systems
International Nuclear Information System (INIS)
Beccaria, Matteo; Presilla, Carlo; De Angelis, Gian Fabrizio; Jona-Lasinio, Giovanni
2000-01-01
We describe an exact Feynman-Kac type formula to represent the dynamics of fermionic lattice systems. In this approach the real time or Euclidean time dynamics is expressed in terms of the stochastic evolution of a collection of Poisson processes. From this formula we derive a family of algorithms for Monte Carlo simulations, parametrized by the jump rates of the Poisson processes
Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M
2016-11-01
To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes with time. The methods are suited to big data and multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Punctual temporal anomalies were noticed due to a punctual increment in the missing data, along with outlying and clustered health departments due to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, or other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be a part of systematic DQ procedures. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Energy Technology Data Exchange (ETDEWEB)
Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia
2005-09-15
This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from deterministic comparison (ratio) of single exposure concentration and threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4- TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risks assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms with the deterministic approach, is associated to the presence of hot spots reaching concentrations able to affect acutely more than 50% of the soil species, while the large majority of the area presents 1,2,4- TCB concentrations below those reported as toxic. [Italian] Scopo del presente studio e fornire una procedura per l'analisi di rischio ecologico di siti contaminati basata su livelli successivi di approfondimento. L'approccio proposto, partendo dal semplice rapporto deterministico tra un livello di esposizione ed un valore di effetto che consenta la salvaguardia del maggior numero di specie dell'ecosistema considerato, procede attraverso il confronto tra le distribuzioni statistiche dei valori di esposizione e di sensitivita delle specie, per determinare infine la distribuzione probabilistica del quoziente di rischio. Ai fini di illustrare la metodologia proposta, le concentrazioni di 1,2,4-triclorobenzene determinate nel suolo di un sito industriale contaminato sono state utilizzate per condurre l'analisi di rischio per le specie terrestri. L'utilizzo delle distribuzioni probabilistiche ha permesso di associare il rischio, inizialmente
Analyzing hospitalization data: potential limitations of Poisson regression.
Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R
2015-08-01
Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)
1990-01-01
The two computer codes COVASTOL and RELIEF, developed for the modeling of cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based respectively on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied for fatigue crack growth predictions using two sets of data of crack propagation curves from specimens. The results are critically analyzed and an extensive discussion follows on the merits and limitations of each code. Their transferability for the reliability assessment of real structures is investigated. (author).
A Robust Optimisation Approach using CVaR for Unit Commitment in a Market with Probabilistic Offers
DEFF Research Database (Denmark)
Bukhsh, W. A.; Papakonstantinou, Athanasios; Pinson, Pierre
2016-01-01
The large scale integration of renewable energy sources (RES) challenges power system planners and operators alike as it can potentially introduce the need for costly investments in infrastructure. Furthermore, traditional market clearing mechanisms are no longer optimal due to the stochastic...... nature of RES. This paper presents a risk-aware market clearing strategy for a network with significant shares of RES. We propose an electricity market that embeds the uncertainty brought by wind power and other stochastic renewable sources by accepting probabilistic offers and use a risk measure defined...
da Paz, I. G.; Soldati, Rodolfo; Cabral, L. A.; de Oliveira, J. G. G.; Sampaio, Marcos
2016-12-01
Recently there have been experimental results on Poisson spot matter-wave interferometry followed by theoretical models describing the relative importance of the wave and particle behaviors for the phenomenon. We propose an analytical theoretical model for Poisson's spot with matter waves based on the Babinet principle, in which we use the results for free propagation and single-slit diffraction. We take into account effects of loss of coherence and finite detection area using the propagator for a quantum particle interacting with an environment. We observe that the matter-wave Gouy phase plays a role in the existence of the central peak and thus corroborates the predominantly wavelike character of the Poisson's spot. Our model shows remarkable agreement with the experimental data for deuterium (D2) molecules.
Comparison of Poisson structures and Poisson-Lie dynamical r-matrices
Enriquez, B.; Etingof, P.; Marshall, I.
2004-01-01
We construct a Poisson isomorphism between the formal Poisson manifolds g^* and G^*, where g is a finite dimensional quasitriangular Lie bialgebra. Here g^* is equipped with its Lie-Poisson (or Kostant-Kirillov-Souriau) structure, and G^* with its Poisson-Lie structure. We also quantize Poisson-Lie dynamical r-matrices of Balog-Feher-Palla.
NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM
Bowerman, P. N.
1994-01-01
The cumulative poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is; the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n) it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
Transforming spatial point processes into Poisson processes using random superposition
DEFF Research Database (Denmark)
Møller, Jesper; Berthelsen, Kasper Klitgaaard
with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions....... In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson...... process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...
Directory of Open Access Journals (Sweden)
Mikaël Cozic
2016-11-01
Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999 at the semantic level, and then axiomatized by Heifetz and Mongin (2001. The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999. Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness theorem that reunites the two strands—modal and probabilistic—of doxastic logic.
Fast and Accurate Poisson Denoising With Trainable Nonlinear Diffusion.
Feng, Wensen; Qiao, Peng; Chen, Yunjin; Wensen Feng; Peng Qiao; Yunjin Chen; Feng, Wensen; Chen, Yunjin; Qiao, Peng
2018-06-01
The degradation of the acquired signal by Poisson noise is a common problem for various imaging applications, such as medical imaging, night vision, and microscopy. Up to now, many state-of-the-art Poisson denoising techniques mainly concentrate on achieving utmost performance, with little consideration for the computation efficiency. Therefore, in this paper we aim to propose an efficient Poisson denoising model with both high computational efficiency and recovery quality. To this end, we exploit the newly developed trainable nonlinear reaction diffusion (TNRD) model which has proven an extremely fast image restoration approach with performance surpassing recent state-of-the-arts. However, the straightforward direct gradient descent employed in the original TNRD-based denoising task is not applicable in this paper. To solve this problem, we resort to the proximal gradient descent method. We retrain the model parameters, including the linear filters and influence functions by taking into account the Poisson noise statistics, and end up with a well-trained nonlinear diffusion model specialized for Poisson denoising. The trained model provides strongly competitive results against state-of-the-art approaches, meanwhile bearing the properties of simple structure and high efficiency. Furthermore, our proposed model comes along with an additional advantage, that the diffusion process is well-suited for parallel computation on graphics processing units (GPUs). For images of size , our GPU implementation takes less than 0.1 s to produce state-of-the-art Poisson denoising performance.
Probabilistic methods for physics
International Nuclear Information System (INIS)
Cirier, G
2013-01-01
We present an asymptotic method giving a probability of presence of the iterated spots of R d by a polynomial function f. We use the well-known Perron Frobenius operator (PF) that lets certain sets and measure invariant by f. Probabilistic solutions can exist for the deterministic iteration. If the theoretical result is already known, here we quantify these probabilities. This approach seems interesting to use for computing situations when the deterministic methods don't run. Among the examined applications, are asymptotic solutions of Lorenz, Navier-Stokes or Hamilton's equations. In this approach, linearity induces many difficult problems, all of whom we have not yet resolved.
Albertson, J. D.
2015-12-01
Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large uncertainties in current approaches. In this paper, we describe results from a series of near-source (< 30 m) controlled methane releases where an instrumented van was used to measure methane concentrations during both fixed location sampling and during mobile traverses immediately downwind of the source. The measurements were used to evaluate the application of EPA Method 33A for estimating methane emissions downwind of a source and also to test the application of a new probabilistic approach for estimating emission rates from mobile traverse data.
International Nuclear Information System (INIS)
Kang, D. I.; Jung, W. D.
2003-01-01
We review the Cause-Based Decision Tree (CBDT) approach to decide whether we incorporate it or not for the development of domestic standard Human Reliability Analysis (HRA) procedure in low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause based decision tree approach, quantify human errors using it, and identify merits and demerits of it in comparision with previously used THERP. The review results show that it is difficult to incorporate the CBDT method for the development of domestic standard HRA procedure in low power/shutdown PSA because the CBDT method need for the subjective judgment of HRA analyst like as THERP. However, it is expected that the incorporation of the CBDT method into the development of domestic standard HRA procedure only for the comparision of quantitative HRA results will relieve the burden of development of detailed HRA procedure and will help maintain consistent quantitative HRA results
Some probabilistic properties of fractional point processes
Garra, Roberto
2017-05-16
In this article, the first hitting times of generalized Poisson processes N-f (t), related to Bernstein functions f are studied. For the spacefractional Poisson processes, N alpha (t), t > 0 ( corresponding to f = x alpha), the hitting probabilities P{T-k(alpha) < infinity} are explicitly obtained and analyzed. The processes N-f (t) are time-changed Poisson processes N( H-f (t)) with subordinators H-f (t) and here we study N(Sigma H-n(j= 1)f j (t)) and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form N( G(H,v) (t)) where G(H,v) (t) are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.
International Nuclear Information System (INIS)
Kawasaki, Nobuchika; Asayama, Tai
2001-09-01
Both reliability and safety have to be further improved for the successful commercialization of FBRs. At the same time, construction and operation costs need to be reduced to a same level of future LWRs. To realize compatibility among reliability, safety and, cost, the Structural Mechanics Research Group in JNC started the development of System Based Code for Integrity of FBR. This code extends the present structural design standard to include the areas of fabrication, installation, plant system design, safety design, operation and maintenance, and so on. A quantitative index is necessary to connect different partial standards in this code. Failure probability is considered as a candidate index. Therefore we decided to make a model calculation using failure probability and judge its applicability. We first investigated other probabilistic standards like ASME Code Case N-578. A probabilistic approach in the structural integrity evaluation was created based on these results, and also an evaluation flow was proposed. According to this flow, a model calculation of creep-fatigue damage was performed. This trial calculation was for a vessel in a sodium-cooled FBR. As the result of this model calculation, a crack initiation probability and a crack penetration probability were found to be effective indices. Last we discussed merits of this System Based Code, which are presented in this report. Furthermore, this report presents future development tasks. (author)
Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G
2014-12-10
Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.
Montero-Martinez, M. J.; Colorado, G.; Diaz-Gutierrez, D. E.; Salinas-Prieto, J. A.
2017-12-01
It is well known the North American Monsoon (NAM) region is already a very dry region which is under a lot of stress due to the lack of water resources on multiple locations of the area. However, it is very interesting that even under those conditions, the Mexican part of the NAM region is certainly the most productive in Mexico from the agricultural point of view. Thus, it is very important to have realistic climate scenarios for climate variables such as temperature, precipitation, relative humidity, radiation, etc. This study tries to tackle that problem by generating probabilistic climate scenarios using a weighted CMIP5-GCM ensemble approach based on the Xu et al. (2010) technique which is on itself an improved method from the better known Reliability Ensemble Averaging algorithm of Giorgi and Mearns (2002). In addition, it is compared the 20-plus GCMs individual performances and the weighted ensemble versus observed data (CRU TS2.1) by using different metrics and Taylor diagrams. This study focuses on probabilistic results reaching a certain threshold given the fact that those types of products could be of potential use for agricultural applications.
Graded geometry and Poisson reduction
Cattaneo, A S; Zambon, M
2009-01-01
The main result of [2] extends the Marsden-Ratiu reduction theorem [4] in Poisson geometry, and is proven by means of graded geometry. In this note we provide the background material about graded geometry necessary for the proof in [2]. Further, we provide an alternative algebraic proof for the main result. ©2009 American Institute of Physics
Efficient maximal Poisson-disk sampling and remeshing on surfaces
Guo, Jianwei; Yan, Dongming; Jia, Xiaohong; Zhang, Xiaopeng
2015-01-01
Poisson-disk sampling is one of the fundamental research problems in computer graphics that has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves the state-of-the-art approach in efficiency, quality and the memory consumption.
Efficient maximal Poisson-disk sampling and remeshing on surfaces
Guo, Jianwei
2015-02-01
Poisson-disk sampling is one of the fundamental research problems in computer graphics that has many applications. In this paper, we study the problem of maximal Poisson-disk sampling on mesh surfaces. We present a simple approach that generalizes the 2D maximal sampling framework to surfaces. The key observation is to use a subdivided mesh as the sampling domain for conflict checking and void detection. Our approach improves the state-of-the-art approach in efficiency, quality and the memory consumption.
Probabilistic liver atlas construction.
Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E
2017-01-13
Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
DEFF Research Database (Denmark)
Selck, Henriette; Drouillard, Ken; Eisenreich, Karen
2012-01-01
was improved by accounting for bioavailability and absorption efficiency limitations, due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability...... in bioaccumulation was mainly driven by sediment exposure, sediment composition and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance...... components. Improvements in the accuracy of aqueous exposure appear to be less relevant when applied to moderate to highly hydrophobic compounds, because this route contributes only marginally to total uptake. The determination of chemical bioavailability and the increase in understanding and qualifying...
Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna
2015-01-01
Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.
Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.
2015-12-01
Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.
Periodic Poisson Solver for Particle Tracking
International Nuclear Information System (INIS)
Dohlus, M.; Henning, C.
2015-05-01
A method is described to solve the Poisson problem for a three dimensional source distribution that is periodic into one direction. Perpendicular to the direction of periodicity a free space (or open) boundary is realized. In beam physics, this approach allows to calculate the space charge field of a continualized charged particle distribution with periodic pattern. The method is based on a particle mesh approach with equidistant grid and fast convolution with a Green's function. The periodic approach uses only one period of the source distribution, but a periodic extension of the Green's function. The approach is numerically efficient and allows the investigation of periodic- and pseudo-periodic structures with period lengths that are small compared to the source dimensions, for instance of laser modulated beams or of the evolution of micro bunch structures. Applications for laser modulated beams are given.
Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.
Frejlich, Pedro; Mărcuț, Ioan
2018-01-01
Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.
Velasco, David; Sempere-Torres, Daniel; Corral, Carles; Llort, Xavier; Velasco, Enrique
2010-05-01
probabilistic component to the FF-EWS. As a first step, we have incorporated the uncertainty in rainfall estimates and forecasts based on an ensemble of equiprobable rainfall scenarios. The presented study has focused on a number of rainfall events and the performance of the FF-EWS evaluated in terms of its ability to produce probabilistic hazard warnings for decision-making support.
Efficient triangulation of Poisson-disk sampled point sets
Guo, Jianwei
2014-05-06
In this paper, we present a simple yet efficient algorithm for triangulating a 2D input domain containing a Poisson-disk sampled point set. The proposed algorithm combines a regular grid and a discrete clustering approach to speedup the triangulation. Moreover, our triangulation algorithm is flexible and performs well on more general point sets such as adaptive, non-maximal Poisson-disk sets. The experimental results demonstrate that our algorithm is robust for a wide range of input domains and achieves significant performance improvement compared to the current state-of-the-art approaches. © 2014 Springer-Verlag Berlin Heidelberg.
Independent production and Poisson distribution
International Nuclear Information System (INIS)
Golokhvastov, A.I.
1994-01-01
The well-known statement of factorization of inclusive cross-sections in case of independent production of particles (or clusters, jets etc.) and the conclusion of Poisson distribution over their multiplicity arising from it do not follow from the probability theory in any way. Using accurately the theorem of the product of independent probabilities, quite different equations are obtained and no consequences relative to multiplicity distributions are obtained. 11 refs
A generalized gyrokinetic Poisson solver
International Nuclear Information System (INIS)
Lin, Z.; Lee, W.W.
1995-03-01
A generalized gyrokinetic Poisson solver has been developed, which employs local operations in the configuration space to compute the polarization density response. The new technique is based on the actual physical process of gyrophase-averaging. It is useful for nonlocal simulations using general geometry equilibrium. Since it utilizes local operations rather than the global ones such as FFT, the new method is most amenable to massively parallel algorithms
Probabilistic molecular dynamics evaluation of the stress-strain behavior of polyethylene
International Nuclear Information System (INIS)
Stowe, J.Q.; Predecki, P.K.; Laz, P.J.; Burks, B.M.; Kumosa, M.
2009-01-01
The primary goal of this study was to utilize molecular dynamics to predict the mechanical behavior of polyethylene. In particular, stress-strain relationships, the Young's modulus and Poisson ratio were predicted for low-density polyethylene at several molecular weights and polymer configurations with the number of united CH 2 atoms ranging between 500 and 5000. Probabilistic Monte Carlo methods were also used to identify the extent of uncertainty in mechanical property predictions. In general, asymptotic behavior was observed for stress and the Young's modulus as the molecular weight of the models increased. At the same time, significant variability, of the order of 1000% of the mean, in the stress-strain relationships and the Young's modulus predictions was observed, especially for low molecular weight models. The variability in the Young's modulus predictions ranged from 17.9 to 3.2 GPa for the models ranging from 100 to 5000 CH 2 atom models. However, it was also found that the mean value of the Young's modulus approached a physically possible value of 194 MPa for the 5000 atom model. Poisson ratio predictions also resulted in significant variability, from 200% to 425% of the mean, and ranged from 0.75 to 1.30. The mean value of the Poisson ratios calculated in this study ranged from 0.32 to 0.44 for the 100 to 5000 atom models, respectively.
Relaxed Poisson cure rate models.
Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N
2016-03-01
The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov, ) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin, ). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos, ) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models allowing for superdispersion. So, the relaxed cure rate model developed here can be considered as a natural and less restrictive extension of the popular Poisson cure rate model at the cost of an additional parameter, but a competitor to negative-binomial cure rate models (Rodrigues et al., ). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Poisson denoising on the sphere
Schmitt, J.; Starck, J. L.; Fadili, J.; Grenier, I.; Casandjian, J. M.
2009-08-01
In the scope of the Fermi mission, Poisson noise removal should improve data quality and make source detection easier. This paper presents a method for Poisson data denoising on sphere, called Multi-Scale Variance Stabilizing Transform on Sphere (MS-VSTS). This method is based on a Variance Stabilizing Transform (VST), a transform which aims to stabilize a Poisson data set such that each stabilized sample has an (asymptotically) constant variance. In addition, for the VST used in the method, the transformed data are asymptotically Gaussian. Thus, MS-VSTS consists in decomposing the data into a sparse multi-scale dictionary (wavelets, curvelets, ridgelets...), and then applying a VST on the coefficients in order to get quasi-Gaussian stabilized coefficients. In this present article, the used multi-scale transform is the Isotropic Undecimated Wavelet Transform. Then, hypothesis tests are made to detect significant coefficients, and the denoised image is reconstructed with an iterative method based on Hybrid Steepest Descent (HST). The method is tested on simulated Fermi data.
Singularities of Poisson structures and Hamiltonian bifurcations
Meer, van der J.C.
2010-01-01
Consider a Poisson structure on C8(R3,R) with bracket {, } and suppose that C is a Casimir function. Then {f, g} =<¿C, (¿g x ¿f) > is a possible Poisson structure. This confirms earlier observations concerning the Poisson structure for Hamiltonian systems that are reduced to a one degree of freedom
A Martingale Characterization of Mixed Poisson Processes.
1985-10-01
03LA A 11. TITLE (Inciuae Security Clanafication, ",A martingale characterization of mixed Poisson processes " ________________ 12. PERSONAL AUTHOR... POISSON PROCESSES Jostification .......... . ... . . Di.;t ib,,jtion by Availability Codes Dietmar Pfeifer* Technical University Aachen Dist Special and...Mixed Poisson processes play an important role in many branches of applied probability, for instance in insurance mathematics and physics (see Albrecht
Pang, Chengfang; Hristozov, Danail; Zabeo, Alex; Pizzol, Lisa; Tsang, Michael P; Sayre, Phil; Marcomini, Antonio
2017-02-01
Silver nanoparticles (n-Ag) are widely used in consumer products and many medical applications because of their unique antibacterial properties. Their use is raising concern about potential human exposures and health effects. Therefore, it is informative to assess the potential human health risks of n-Ag in order to ensure that nanotechnology-based consumer products are deployed in a safe and sustainable way. Even though toxicity studies clearly show the potential hazard of n-Ag, there have been few attempts to integrate hazard and exposure assessments to evaluate risks. The underlying reason for this is the difficulty in characterizing exposure and the lack of toxicity studies essential for human health risk assessment (HHRA). Such data gaps introduce significant uncertainty into the risk assessment process. This study uses probabilistic methods to assess the relative uncertainty and potential risks of n-Ag exposure to infants. In this paper, we estimate the risks for infants potentially exposed to n-Ag through drinking juice or milk from sippy cups or licking baby blankets containing n-Ag. We explicitly evaluate uncertainty and variability contained in available dose-response and exposure data in order to make the risk characterization process transparent. Our results showed that individual margin of exposures for oral exposure to sippy cups and baby blankets containing n-Ag exhibited minimal risk. Copyright © 2016. Published by Elsevier Ltd.
Navarro Rodríguez, José Manuel; Gallego Plazas, Javier; Borrás Rocher, Fernando; Calpena Rico, Rafael; Ruiz Macia, José Antonio; Morcillo Ródenas, Miguel Ángel
2017-10-01
The assessment of the state of immunosurveillance (the ability of the organism to prevent the development of neoplasias) in the blood has prognostic implications of interest in colorectal cancer. We evaluated and quantified a possible predictive character of the disease in a blood test using a mathematical interaction index of several blood parameters. The predictive capacity of the index to detect colorectal cancer was also assessed. We performed a retrospective case-control study of a comparative analysis of the distribution of blood parameters in 266 patients with colorectal cancer and 266 healthy patients during the period from 2009 to 2013. Statistically significant differences (p indexes (neutrophil to lymphocyte ratio and platelet to lymphocyte ratio), hemoglobin, hematocrit and eosinophil levels. These differences allowed the design of a blood analytical profile that calculates the risk of colorectal cancer. This risk profile can be quantified via a mathematical formula with a probabilistic capacity to identify patients with the highest risk of the presence of colorectal cancer (area under the ROC curve = 0.85). We showed that a colorectal cancer predictive character exists in blood which can be quantified by an interaction index of several blood parameters. The design and development of interaction indexes of blood parameters constitutes an interesting research line for the development and improvement of programs for the screening of colorectal cancer.
Directory of Open Access Journals (Sweden)
José Manuel Navarro-Rodríguez
Full Text Available Introduction: The assessment of the state of immunosurveillance (the ability of the organism to prevent the development of neoplasias in the blood has prognostic implications of interest in colorectal cancer. We evaluated and quantified a possible predictive character of the disease in a blood test using a mathematical interaction index of several blood parameters. The predictive capacity of the index to detect colorectal cancer was also assessed. Methods: We performed a retrospective case-control study of a comparative analysis of the distribution of blood parameters in 266 patients with colorectal cancer and 266 healthy patients during the period from 2009 to 2013. Results: Statistically significant differences (p < 0.05 were observed between patients with colorectal cancer and the control group in terms of platelet counts, fibrinogen, total leukocytes, neutrophils, systemic immunovigilance indexes (neutrophil to lymphocyte ratio and platelet to lymphocyte ratio, hemoglobin, hematocrit and eosinophil levels. These differences allowed the design of a blood analytical profile that calculates the risk of colorectal cancer. This risk profile can be quantified via a mathematical formula with a probabilistic capacity to identify patients with the highest risk of the presence of colorectal cancer (area under the ROC curve = 0.85. Conclusions: We showed that a colorectal cancer predictive character exists in blood which can be quantified by an interaction index of several blood parameters. The design and development of interaction indexes of blood parameters constitutes an interesting research line for the development and improvement of programs for the screening of colorectal cancer.
Durand, A I; Ipina, S L; Bermúdez de Castro, J M
2000-06-01
Parameters of a Middle Pleistocene human population such as the expected length of the female reproductive period (E(Y)), the expected interbirth interval (E(X)), the survival rate (tau) for females after the expected reproductive period, the rate (phi(2)) of women who, given that they reach first birth, do not survive to the end of the expected reproductive period, and the female infant plus juvenile mortality rate (phi(1)) have been assessed from a probabilistic standpoint provided that such a population were stationary. The hominid sample studied, the Sima de los Huesos (SH) cave site, Sierra de Atapuerca (Spain), is the most exhaustive human fossil sample currently available. Results suggest that the Atapuerca (SH) sample can derive from a stationary population. Further, in the case that the expected reproductive period ends between 37 and 40 yr of age, then 24 less, similarE(Y) less, similar27 yr, E(X)=3 yr, 0.224
Absorption systems at z ˜ 2 as a probe of the circum galactic medium: a probabilistic approach
Mongardi, C.; Viel, M.; D'Odorico, V.; Kim, T.-S.; Barai, P.; Murante, G.; Monaco, P.
2018-05-01
We characterize the properties of the intergalactic medium (IGM) around a sample of galaxies extracted from state-of-the-art hydrodynamical simulations of structure formation in a cosmological volume of 25 Mpc comoving at z ˜ 2. The simulations are based on two different sub-resolution schemes for star formation and supernova feedback: the MUlti-Phase Particle Integrator (MUPPI) scheme and the Effective Model. We develop a quantitative and probabilistic analysis based on the apparent optical depth method of the properties of the absorbers as a function of impact parameter from their nearby galaxies: in such a way we probe different environments from circumgalactic medium (CGM) to low density filaments. Absorbers' properties are then compared with a spectroscopic observational data set obtained from high resolution quasar spectra. Our main focus is on the NCIV - NHI relation around simulated galaxies: the results obtained with MUPPI and the Effective model are remarkably similar, with small differences only confined to regions at impact parameters b = [1 - 3] × rvir. Using {C IV} as a tracer of the metallicity, we obtain evidence that the observed metal absorption systems have the highest probability to be confined in a region of 150-400 kpc around galaxies. Near-filament environments have instead metallicities too low to be probed by present-day telescopes, but could be probed by future spectroscopical studies. Finally we compute {C IV} covering fractions which are in agreement with observational data.
International Nuclear Information System (INIS)
Neaimeh, Myriam; Wardle, Robin; Jenkins, Andrew M.; Yi, Jialiang; Hill, Graeme; Lyons, Padraig F.; Hübner, Yvonne; Blythe, Phil T.; Taylor, Phil C.
2015-01-01
Highlights: • Working with unique datasets of EV charging and smart meter load demand. • Distribution networks are not a homogenous group with more capabilities to accommodate EVs than previously suggested. • Spatial and temporal diversity of EV charging demand alleviate the impacts on networks. • An extensive recharging infrastructure could enable connection of additional EVs on constrained distribution networks. • Electric utilities could increase the network capability to accommodate EVs by investing in recharging infrastructure. - Abstract: This work uses a probabilistic method to combine two unique datasets of real world electric vehicle charging profiles and residential smart meter load demand. The data was used to study the impact of the uptake of Electric Vehicles (EVs) on electricity distribution networks. Two real networks representing an urban and rural area, and a generic network representative of a heavily loaded UK distribution network were used. The findings show that distribution networks are not a homogeneous group with a variation of capabilities to accommodate EVs and there is a greater capability than previous studies have suggested. Consideration of the spatial and temporal diversity of EV charging demand has been demonstrated to reduce the estimated impacts on the distribution networks. It is suggested that distribution network operators could collaborate with new market players, such as charging infrastructure operators, to support the roll out of an extensive charging infrastructure in a way that makes the network more robust; create more opportunities for demand side management; and reduce planning uncertainties associated with the stochastic nature of EV charging demand.
Badde, Stephanie; Heed, Tobias; Röder, Brigitte
2016-04-01
To act upon a tactile stimulus its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
Arbitrage and Hedging in a non probabilistic framework
Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo
2011-01-01
The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well perfect replication of options under continuous and discontinuous trajectories, the results can then be applied in probabilistic models path by path. The approach is r...
Liao, Chung-Min; Ju, Yun-Ru; Chio, Chia-Pin; Chen, Wei-Yu
2010-02-01
The purpose of this article is to provide a risk-based predictive model to assess the impact of false mussel Mytilopsis sallei invasions on hard clam Meretrix lusoria farms in the southwestern region of Taiwan. The actual spread of invasive false mussel was predicted by using analytical models based on advection-diffusion and gravity models. The proportion of hard clam colonized and infestation by false mussel were used to characterize risk estimates. A mortality model was parameterized to assess hard clam mortality risk characterized by false mussel density and infestation intensity. The published data were reanalyzed to parameterize a predictive threshold model described by a cumulative Weibull distribution function that can be used to estimate the exceeding thresholds of proportion of hard clam colonized and infestation. Results indicated that the infestation thresholds were 2-17 ind clam(-1) for adult hard clams, whereas 4 ind clam(-1) for nursery hard clams. The average colonization thresholds were estimated to be 81-89% for cultivated and nursery hard clam farms, respectively. Our results indicated that false mussel density and infestation, which caused 50% hard clam mortality, were estimated to be 2,812 ind m(-2) and 31 ind clam(-1), respectively. This study further indicated that hard clam farms that are close to the coastal area have at least 50% probability for 43% mortality caused by infestation. This study highlighted that a probabilistic risk-based framework characterized by probability distributions and risk curves is an effective representation of scientific assessments for farmed hard clam in response to the nonnative false mussel invasion.
Yavaş, Gökhan; Koyutürk, Mehmet; Gould, Meetha P; McMahon, Sarah; LaFramboise, Thomas
2014-03-05
With the advent of paired-end high throughput sequencing, it is now possible to identify various types of structural variation on a genome-wide scale. Although many methods have been proposed for structural variation detection, most do not provide precise boundaries for identified variants. In this paper, we propose a new method, Distribution Based detection of Duplication Boundaries (DB2), for accurate detection of tandem duplication breakpoints, an important class of structural variation, with high precision and recall. Our computational experiments on simulated data show that DB2 outperforms state-of-the-art methods in terms of finding breakpoints of tandem duplications, with a higher positive predictive value (precision) in calling the duplications' presence. In particular, DB2's prediction of tandem duplications is correct 99% of the time even for very noisy data, while narrowing down the space of possible breakpoints within a margin of 15 to 20 bps on the average. Most of the existing methods provide boundaries in ranges that extend to hundreds of bases with lower precision values. Our method is also highly robust to varying properties of the sequencing library and to the sizes of the tandem duplications, as shown by its stable precision, recall and mean boundary mismatch performance. We demonstrate our method's efficacy using both simulated paired-end reads, and those generated from a melanoma sample and two ovarian cancer samples. Newly discovered tandem duplications are validated using PCR and Sanger sequencing. Our method, DB2, uses discordantly aligned reads, taking into account the distribution of fragment length to predict tandem duplications along with their breakpoints on a donor genome. The proposed method fine tunes the breakpoint calls by applying a novel probabilistic framework that incorporates the empirical fragment length distribution to score each feasible breakpoint. DB2 is implemented in Java programming language and is freely available
Probabilistic methods used in NUSS
International Nuclear Information System (INIS)
Fischer, J.; Giuliani, P.
1985-01-01
Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the area of seismic hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)
Schweizer, B
2005-01-01
Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
Nonhomogeneous Poisson process with nonparametric frailty
International Nuclear Information System (INIS)
Slimacek, Vaclav; Lindqvist, Bo Henry
2016-01-01
The failure processes of heterogeneous repairable systems are often modeled by non-homogeneous Poisson processes. The common way to describe an unobserved heterogeneity between systems is to multiply the basic rate of occurrence of failures by a random variable (a so-called frailty) having a specified parametric distribution. Since the frailty is unobservable, the choice of its distribution is a problematic part of using these models, as are often the numerical computations needed in the estimation of these models. The main purpose of this paper is to develop a method for estimation of the parameters of a nonhomogeneous Poisson process with unobserved heterogeneity which does not require parametric assumptions about the heterogeneity and which avoids the frequently encountered numerical problems associated with the standard models for unobserved heterogeneity. The introduced method is illustrated on an example involving the power law process, and is compared to the standard gamma frailty model and to the classical model without unobserved heterogeneity. The derived results are confirmed in a simulation study which also reveals several not commonly known properties of the gamma frailty model and the classical model, and on a real life example. - Highlights: • A new method for estimation of a NHPP with frailty is introduced. • Introduced method does not require parametric assumptions about frailty. • The approach is illustrated on an example with the power law process. • The method is compared to the gamma frailty model and to the model without frailty.
NHPoisson: An R Package for Fitting and Validating Nonhomogeneous Poisson Processes
Directory of Open Access Journals (Sweden)
Ana C. Cebrián
2015-03-01
Full Text Available NHPoisson is an R package for the modeling of nonhomogeneous Poisson processes in one dimension. It includes functions for data preparation, maximum likelihood estimation, covariate selection and inference based on asymptotic distributions and simulation methods. It also provides specific methods for the estimation of Poisson processes resulting from a peak over threshold approach. In addition, the package supports a wide range of model validation tools and functions for generating nonhomogenous Poisson process trajectories. This paper is a description of the package and aims to help those interested in modeling data using nonhomogeneous Poisson processes.
Fernandez-Garcia, D.; Sanchez-Vila, X.; Bolster, D.; Tartakovsky, D. M.
2010-12-01
The release of non-aqueous phase liquids (NAPLs) such as petroleum hydrocarbons and chlorinated solvents in the subsurface is a severe source of groundwater and vapor contamination. Because these liquids are essentially immiscible due to low solubility, these contaminants get slowly dissolved in groundwater and/or volatilized in the vadoze zone threatening the environment and public health over a long period. Many remediation technologies and strategies have been developed in the last decades for restoring the water quality properties of these contaminated sites. The failure of an on-site treatment technology application is often due to the unnoticed presence of dissolved NAPL entrapped in low permeability areas (heterogeneity) and/or the remaining of substantial amounts of pure phase after remediation efforts. Full understanding of the impact of remediation efforts is complicated due to the role of many interlink physical and biochemical processes taking place through several potential pathways of exposure to multiple receptors in a highly unknown heterogeneous environment. Due to these difficulties, the design of remediation strategies and definition of remediation endpoints have been traditionally determined without quantifying the risk associated with the failure of such efforts. We conduct a probabilistic risk analysis (PRA) of the likelihood of success of an on-site NAPL treatment technology that easily integrates all aspects of the problem (causes, pathways, and receptors) without doing extensive modeling. Importantly, the method is further capable to incorporate the inherent uncertainty that often exist in the exact location where the dissolved NAPL plume leaves the source zone. This is achieved by describing the failure of the system as a function of this source zone exit location, parameterized in terms of a vector of parameters. Using a Bayesian interpretation of the system and by means of the posterior multivariate distribution, the failure of the
Probabilistic model for sterilization of food
International Nuclear Information System (INIS)
Chepurko, V.V.; Malinovskij, O.V.
1986-01-01
The probabilistic model for radiation sterilization is proposed based on the followng suppositions: (1) initial contamination of a volume unit of the sterilized product m is described by the distribution of the probabilities q(m), (2) inactivation of the population from m of microorganisms is approximated by Bernoulli test scheme, and (3) contamination of unit of the sterilized product is independent. The possibility of approximation q(m) by Poisson distribution is demonstrated. The diagrams are presented permitting to evaluate the dose which provides the defined reliability of sterilization of food for chicken-gnotobionts
Poisson sigma model with branes and hyperelliptic Riemann surfaces
International Nuclear Information System (INIS)
Ferrario, Andrea
2008-01-01
We derive the explicit form of the superpropagators in the presence of general boundary conditions (coisotropic branes) for the Poisson sigma model. This generalizes the results presented by Cattaneo and Felder [''A path integral approach to the Kontsevich quantization formula,'' Commun. Math. Phys. 212, 591 (2000)] and Cattaneo and Felder ['Coisotropic submanifolds in Poisson geometry and branes in the Poisson sigma model', Lett. Math. Phys. 69, 157 (2004)] for Kontsevich's angle function [Kontsevich, M., 'Deformation quantization of Poisson manifolds I', e-print arXiv:hep.th/0101170] used in the deformation quantization program of Poisson manifolds. The relevant superpropagators for n branes are defined as gauge fixed homotopy operators of a complex of differential forms on n sided polygons P n with particular ''alternating'' boundary conditions. In the presence of more than three branes we use first order Riemann theta functions with odd singular characteristics on the Jacobian variety of a hyperelliptic Riemann surface (canonical setting). In genus g the superpropagators present g zero mode contributions
Poisson image reconstruction with Hessian Schatten-norm regularization.
Lefkimmiatis, Stamatios; Unser, Michael
2013-11-01
Poisson inverse problems arise in many modern imaging applications, including biomedical and astronomical ones. The main challenge is to obtain an estimate of the underlying image from a set of measurements degraded by a linear operator and further corrupted by Poisson noise. In this paper, we propose an efficient framework for Poisson image reconstruction, under a regularization approach, which depends on matrix-valued regularization operators. In particular, the employed regularizers involve the Hessian as the regularization operator and Schatten matrix norms as the potential functions. For the solution of the problem, we propose two optimization algorithms that are specifically tailored to the Poisson nature of the noise. These algorithms are based on an augmented-Lagrangian formulation of the problem and correspond to two variants of the alternating direction method of multipliers. Further, we derive a link that relates the proximal map of an l(p) norm with the proximal map of a Schatten matrix norm of order p. This link plays a key role in the development of one of the proposed algorithms. Finally, we provide experimental results on natural and biological images for the task of Poisson image deblurring and demonstrate the practical relevance and effectiveness of the proposed framework.
Optimality of Poisson Processes Intensity Learning with Gaussian Processes
Kirichenko, A.; van Zanten, H.
2015-01-01
In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational
Nonparametric Bayesian inference for multidimensional compound Poisson processes
Gugushvili, S.; van der Meulen, F.; Spreij, P.
2015-01-01
Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context,
Four-dimensional gravity as an almost-Poisson system
Ita, Eyo Eyo
2015-04-01
In this paper, we examine the phase space structure of a noncanonical formulation of four-dimensional gravity referred to as the Instanton representation of Plebanski gravity (IRPG). The typical Hamiltonian (symplectic) approach leads to an obstruction to the definition of a symplectic structure on the full phase space of the IRPG. We circumvent this obstruction, using the Lagrange equations of motion, to find the appropriate generalization of the Poisson bracket. It is shown that the IRPG does not support a Poisson bracket except on the vector constraint surface. Yet there exists a fundamental bilinear operation on its phase space which produces the correct equations of motion and induces the correct transformation properties of the basic fields. This bilinear operation is known as the almost-Poisson bracket, which fails to satisfy the Jacobi identity and in this case also the condition of antisymmetry. We place these results into the overall context of nonsymplectic systems.
Probabilistic risk assessment methodology
International Nuclear Information System (INIS)
Shinaishin, M.A.
1988-06-01
The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and baysian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bedding stage, and performing grid reliability and human performance analysis using national specific data. (author)
Probabilistic risk assessment methodology
Energy Technology Data Exchange (ETDEWEB)
Shinaishin, M A
1988-06-15
The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and baysian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bedding stage, and performing grid reliability and human performance analysis using national specific data. (author)
International Nuclear Information System (INIS)
Kocer, C.; McKenzie, D.R.; Bilek, M.M.
2009-01-01
The theory of elasticity predicts a variety of phenomena associated with solids that possess a negative Poisson's ratio. The fabrication of metamaterials with a 'designed' microstructure that exhibit a Poisson's ratio approaching the thermodynamic limits of 1/2 and -1 increases the likelihood of realising these phenomena for applications. In this work, we investigate the properties of a layered composite, with alternating layers of materials with negative and positive Poisson's ratio approaching the thermodynamic limits. Using the finite element method to simulate uniaxial loading and indentation of a free standing composite, we observed an increase in the resistance to mechanical deformation above the average value of the two materials. Even though the greatest increase in stiffness is gained as the thermodynamic limits are approached, a significant amount of added stiffness can be attained, provided that the Young's modulus of the negative Poisson's ratio material is not less than that of the positive Poisson's ratio material
Nonlinear poisson brackets geometry and quantization
Karasev, M V
2012-01-01
This book deals with two old mathematical problems. The first is the problem of constructing an analog of a Lie group for general nonlinear Poisson brackets. The second is the quantization problem for such brackets in the semiclassical approximation (which is the problem of exact quantization for the simplest classes of brackets). These problems are progressively coming to the fore in the modern theory of differential equations and quantum theory, since the approach based on constructions of algebras and Lie groups seems, in a certain sense, to be exhausted. The authors' main goal is to describe in detail the new objects that appear in the solution of these problems. Many ideas of algebra, modern differential geometry, algebraic topology, and operator theory are synthesized here. The authors prove all statements in detail, thus making the book accessible to graduate students.
Poisson's ratio of fiber-reinforced composites
Christiansson, Henrik; Helsing, Johan
1996-05-01
Poisson's ratio flow diagrams, that is, the Poisson's ratio versus the fiber fraction, are obtained numerically for hexagonal arrays of elastic circular fibers in an elastic matrix. High numerical accuracy is achieved through the use of an interface integral equation method. Questions concerning fixed point theorems and the validity of existing asymptotic relations are investigated and partially resolved. Our findings for the transverse effective Poisson's ratio, together with earlier results for random systems by other authors, make it possible to formulate a general statement for Poisson's ratio flow diagrams: For composites with circular fibers and where the phase Poisson's ratios are equal to 1/3, the system with the lowest stiffness ratio has the highest Poisson's ratio. For other choices of the elastic moduli for the phases, no simple statement can be made.
Singular reduction of Nambu-Poisson manifolds
Das, Apurba
The version of Marsden-Ratiu Poisson reduction theorem for Nambu-Poisson manifolds by a regular foliation have been studied by Ibáñez et al. In this paper, we show that this reduction procedure can be extended to the singular case. Under a suitable notion of Hamiltonian flow on the reduced space, we show that a set of Hamiltonians on a Nambu-Poisson manifold can also be reduced.
Probabilistic logic networks a comprehensive framework for uncertain inference
Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari
2008-01-01
This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.
Nonlinear Poisson equation for heterogeneous media.
Hu, Langhua; Wei, Guo-Wei
2012-08-22
The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration of hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. Variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation for the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using the geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for a comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical prediction, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Non-holonomic dynamics and Poisson geometry
International Nuclear Information System (INIS)
Borisov, A V; Mamaev, I S; Tsiganov, A V
2014-01-01
This is a survey of basic facts presently known about non-linear Poisson structures in the analysis of integrable systems in non-holonomic mechanics. It is shown that by using the theory of Poisson deformations it is possible to reduce various non-holonomic systems to dynamical systems on well-understood phase spaces equipped with linear Lie-Poisson brackets. As a result, not only can different non-holonomic systems be compared, but also fairly advanced methods of Poisson geometry and topology can be used for investigating them. Bibliography: 95 titles
Solano-Flores, Guillermo
2014-01-01
This article addresses validity and fairness in the testing of English language learners (ELLs)--students in the United States who are developing English as a second language. It discusses limitations of current approaches to examining the linguistic features of items and their effect on the performance of ELL students. The article submits that…
Verhagen, H.J.
2003-01-01
In a classical design approach to breakwaters a design wave height is determined, and filled in into a design formula. Some undefined safety is added. In the method using partial safety coefficients (as developed by PIANC [1992] and recently also adopted by the Coastal Engineering Manual of the US
DEFF Research Database (Denmark)
Löwe, Roland; Del Giudice, Dario; Mikkelsen, Peter Steen
of input uncertainties observed in the models. The explicit inclusion of such variations in the modelling process will lead to a better fulfilment of the assumptions made in formal statistical frameworks, thus reducing the need to resolve to informal methods. The two approaches presented here...
Wang, Fengwen
2018-05-01
This paper presents a systematic approach for designing 3D auxetic lattice materials, which exhibit constant negative Poisson's ratios over large strain intervals. A unit cell model mimicking tensile tests is established and based on the proposed model, the secant Poisson's ratio is defined as the negative ratio between the lateral and the longitudinal engineering strains. The optimization problem for designing a material unit cell with a target Poisson's ratio is formulated to minimize the average lateral engineering stresses under the prescribed deformations. Numerical results demonstrate that 3D auxetic lattice materials with constant Poisson's ratios can be achieved by the proposed optimization formulation and that two sets of material architectures are obtained by imposing different symmetry on the unit cell. Moreover, inspired by the topology-optimized material architecture, a subsequent shape optimization is proposed by parametrizing material architectures using super-ellipsoids. By designing two geometrical parameters, simple optimized material microstructures with different target Poisson's ratios are obtained. By interpolating these two parameters as polynomial functions of Poisson's ratios, material architectures for any Poisson's ratio in the interval of ν ∈ [ - 0.78 , 0.00 ] are explicitly presented. Numerical evaluations show that interpolated auxetic lattice materials exhibit constant Poisson's ratios in the target strain interval of [0.00, 0.20] and that 3D auxetic lattice material architectures with programmable Poisson's ratio are achievable.
Probabilistic costing of transmission services
International Nuclear Information System (INIS)
Wijayatunga, P.D.C.
1992-01-01
Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC which emerges as a consequence of the uncertainties in a power system is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and multi-area representation of the transmission network. (UK)
Information content of poisson images
International Nuclear Information System (INIS)
Cederlund, J.
1979-04-01
One major problem when producing images with the aid of Poisson distributed quanta is how best to compromise between spatial and contrast resolution. Increasing the number of image elements improves spatial resolution, but at the cost of fewer quanta per image element, which reduces contrast resolution. Information theory arguments are used to analyse this problem. It is argued that information capacity is a useful concept to describe an important property of the imaging device, but that in order to compute the information content of an image produced by this device some statistical properties (such as the a priori probability of the densities) of the object to be depicted must be taken into account. If these statistical properties are not known one cannot make a correct choice between spatial and contrast resolution. (author)
Perrot, Nathalie; Baudrit, Cédric; Brousset, Jean Marie; Abbal, Philippe; Guillemin, Hervé; Perret, Bruno; Goulet, Etienne; Guerin, Laurence; Barbeau, Gérard; Picque, Daniel
2015-01-01
Agri-food is one of the most important sectors of the industry and a major contributor to the global warming potential in Europe. Sustainability issues pose a huge challenge for this sector. In this context, a big issue is to be able to predict the multiscale dynamics of those systems using computing science. A robust predictive mathematical tool is implemented for this sector and applied to the wine industry being easily able to be generalized to other applications. Grape berry maturation relies on complex and coupled physicochemical and biochemical reactions which are climate dependent. Moreover one experiment represents one year and the climate variability could not be covered exclusively by the experiments. Consequently, harvest mostly relies on expert predictions. A big challenge for the wine industry is nevertheless to be able to anticipate the reactions for sustainability purposes. We propose to implement a decision support system so called FGRAPEDBN able to (1) capitalize the heterogeneous fragmented knowledge available including data and expertise and (2) predict the sugar (resp. the acidity) concentrations with a relevant RMSE of 7 g/l (resp. 0.44 g/l and 0.11 g/kg). FGRAPEDBN is based on a coupling between a probabilistic graphical approach and a fuzzy expert system.
Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C
2017-08-01
The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) is a document dedicated to ensuring the safety of European consumers. This contains useful data for risk assessment such as default values for Skin Surface Area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variations. The default SSA value was derived from a study on the Dutch population, which is known to be one of the tallest nations in the World. This value could be inadequate for shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to SCCS, suggest that the default value for SSA used in the quantitative risk assessment might not be relevant for a significant share of the French female population. Others female populations of Southern Europe may also be excluded. This is of importance given that some studies show an increasing risk of developping skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen
2014-05-01
Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely-populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists communicate with decision makers. Here we present a new model BADEMO (Bayesian Decision Model) that applies a general and flexible, probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.
Douglas, E. M.; Kirshen, P. H.; Bosma, K.; Watson, C.; Miller, S.; McArthur, K.
2015-12-01
There now exists a plethora of information attesting to the reality of our changing climate and its impacts on both human and natural systems. There also exists a growing literature linking climate change impacts and transportation infrastructure (highways, bridges, tunnels, railway, shipping ports, etc.) which largely agrees that the nation's transportation systems are vulnerable. To assess this vulnerability along the coast, flooding due to sea level rise and storm surge has most commonly been evaluated by simply increasing the water surface elevation and then estimating flood depth by comparing the new water surface elevation with the topographic elevations of the land surface. While this rudimentary "bathtub" approach may provide a first order identification of potential areas of vulnerability, accurate assessment requires a high resolution, physically-based hydrodynamic model that can simulate inundation due to the combined effects of sea level rise, storm surge, tides and wave action for site-specific locations. Furthermore, neither the "bathtub" approach nor other scenario-based approaches can quantify the probability of flooding due to these impacts. We developed a high resolution coupled ocean circulation-wave model (ADCIRC/SWAN) that utilizes a Monte Carlo approach for predicting the depths and associated exceedance probabilities of flooding due to both tropical (hurricanes) and extra-tropical storms under current and future climate conditions. This required the development of an entirely new database of meteorological forcing (e.g. pressure, wind speed, etc.) for historical Nor'easters in the North Atlantic basin. Flooding due to hurricanes and Nor'easters was simulated separately and then composite flood probability distributions were developed. Model results were used to assess the vulnerability of the Central Artery/Tunnel system in Boston, Massachusetts to coastal flooding now and in the future. Local and regional adaptation strategies were
Some thoughts on the future of probabilistic structural design of nuclear components
International Nuclear Information System (INIS)
Stancampiano, P.A.
1978-01-01
This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach suficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic casual failure mode models development, and statistical data models development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)
Probabilistic escalation modelling
Energy Technology Data Exchange (ETDEWEB)
Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)
1997-12-31
This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)
Probabilistic fracture finite elements
Liu, W. K.; Belytschko, T.; Lua, Y. J.
1991-05-01
The Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handle problems with uncertainties. As the PFEM provides a powerful computational tool to determine first and second moment of random parameters, the second moment reliability method can be easily combined with PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed from experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
Probabilistic retinal vessel segmentation
Wu, Chang-Hua; Agam, Gady
2007-03-01
Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.
Energy Technology Data Exchange (ETDEWEB)
Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I. [Hanyang University, Seoul (Korea, Republic of); Lee, J. S.; Lee, D. S.; Kwon, J. S. [Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, J. J. [Yonsei University College of Medicine, Seoul (Korea, Republic of)
2003-06-01
The probabilistic anatomical maps are used to localize the functional neuro-images and morphological variability. The quantitative indicator is very important to inquire the anatomical position of an activated region because functional image data has the low-resolution nature and no inherent anatomical information. Although previously developed MNI probabilistic anatomical map was enough to localize the data, it was not suitable for the Korean brains because of the morphological difference between Occidental and Oriental. In this study, we develop a probabilistic anatomical map for Korean normal brain. Normal 75 brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GESIGNA scanner. Then, a standard brain is selected in the group through a clinician searches a brain of the average property in the Talairach coordinate system. With the standard brain, an anatomist delineates 89 regions of interest (ROI) parcellating cortical and subcortical areas. The parcellated ROIs of the standard are warped and overlapped into each brain by maximizing intensity similarity. And every brain is automatically labeled with the registered ROIs. Each of the same-labeled region is linearly normalize to the standard brain, and the occurrence of each region is counted. Finally, 89 probabilistic ROI volumes are generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of Korean normal brain. In the future, we'll develop the group specific probabilistic anatomical maps of OCD and schizophrenia disease.
International Nuclear Information System (INIS)
Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I.; Lee, J. S.; Lee, D. S.; Kwon, J. S.; Kim, J. J.
2003-01-01
The probabilistic anatomical maps are used to localize the functional neuro-images and morphological variability. The quantitative indicator is very important to inquire the anatomical position of an activated region because functional image data has the low-resolution nature and no inherent anatomical information. Although previously developed MNI probabilistic anatomical map was enough to localize the data, it was not suitable for the Korean brains because of the morphological difference between Occidental and Oriental. In this study, we develop a probabilistic anatomical map for Korean normal brain. Normal 75 brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GESIGNA scanner. Then, a standard brain is selected in the group through a clinician searches a brain of the average property in the Talairach coordinate system. With the standard brain, an anatomist delineates 89 regions of interest (ROI) parcellating cortical and subcortical areas. The parcellated ROIs of the standard are warped and overlapped into each brain by maximizing intensity similarity. And every brain is automatically labeled with the registered ROIs. Each of the same-labeled region is linearly normalize to the standard brain, and the occurrence of each region is counted. Finally, 89 probabilistic ROI volumes are generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of Korean normal brain. In the future, we'll develop the group specific probabilistic anatomical maps of OCD and schizophrenia disease
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Probabilistic machine learning and artificial intelligence
Ghahramani, Zoubin
2015-05-01
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Probabilistic assessment of nuclear safety and safeguards
International Nuclear Information System (INIS)
Higson, D.J.
1987-01-01
Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic method have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)
Integrated Deterministic-Probabilistic Safety Assessment Methodologies
Energy Technology Data Exchange (ETDEWEB)
Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.
2014-02-01
IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling Risk informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)
On (co)homology of Frobenius Poisson algebras
Zhu, Can; Van Oystaeyen, Fred; ZHANG, Yinhuo
2014-01-01
In this paper, we study Poisson (co)homology of a Frobenius Poisson algebra. More precisely, we show that there exists a duality between Poisson homology and Poisson cohomology of Frobenius Poisson algebras, similar to that between Hochschild homology and Hochschild cohomology of Frobenius algebras. Then we use the non-degenerate bilinear form on a unimodular Frobenius Poisson algebra to construct a Batalin-Vilkovisky structure on the Poisson cohomology ring making it into a Batalin-Vilkovisk...
Chen, Yi-Shin
2018-01-01
Conventional decision theory suggests that under risk, people choose option(s) by maximizing the expected utility. However, theories deal ambiguously with different options that have the same expected utility. A network approach is proposed by introducing ‘goal’ and ‘time’ factors to reduce the ambiguity in strategies for calculating the time-dependent probability of reaching a goal. As such, a mathematical foundation that explains the irrational behavior of choosing an option with a lower expected utility is revealed, which could imply that humans possess rationality in foresight. PMID:29702665
Pan, Wei; Chen, Yi-Shin
2018-01-01
Conventional decision theory suggests that under risk, people choose option(s) by maximizing the expected utility. However, theories deal ambiguously with different options that have the same expected utility. A network approach is proposed by introducing 'goal' and 'time' factors to reduce the ambiguity in strategies for calculating the time-dependent probability of reaching a goal. As such, a mathematical foundation that explains the irrational behavior of choosing an option with a lower expected utility is revealed, which could imply that humans possess rationality in foresight.
Directory of Open Access Journals (Sweden)
Wei Pan
Full Text Available Conventional decision theory suggests that under risk, people choose option(s by maximizing the expected utility. However, theories deal ambiguously with different options that have the same expected utility. A network approach is proposed by introducing 'goal' and 'time' factors to reduce the ambiguity in strategies for calculating the time-dependent probability of reaching a goal. As such, a mathematical foundation that explains the irrational behavior of choosing an option with a lower expected utility is revealed, which could imply that humans possess rationality in foresight.
Womack, James C; Anton, Lucian; Dziedzic, Jacek; Hasnip, Phil J; Probert, Matt I J; Skylaris, Chris-Kriton
2018-03-13
The solution of the Poisson equation is a crucial step in electronic structure calculations, yielding the electrostatic potential-a key component of the quantum mechanical Hamiltonian. In recent decades, theoretical advances and increases in computer performance have made it possible to simulate the electronic structure of extended systems in complex environments. This requires the solution of more complicated variants of the Poisson equation, featuring nonhomogeneous dielectric permittivities, ionic concentrations with nonlinear dependencies, and diverse boundary conditions. The analytic solutions generally used to solve the Poisson equation in vacuum (or with homogeneous permittivity) are not applicable in these circumstances, and numerical methods must be used. In this work, we present DL_MG, a flexible, scalable, and accurate solver library, developed specifically to tackle the challenges of solving the Poisson equation in modern large-scale electronic structure calculations on parallel computers. Our solver is based on the multigrid approach and uses an iterative high-order defect correction method to improve the accuracy of solutions. Using two chemically relevant model systems, we tested the accuracy and computational performance of DL_MG when solving the generalized Poisson and Poisson-Boltzmann equations, demonstrating excellent agreement with analytic solutions and efficient scaling to ∼10 9 unknowns and 100s of CPU cores. We also applied DL_MG in actual large-scale electronic structure calculations, using the ONETEP linear-scaling electronic structure package to study a 2615 atom protein-ligand complex with routinely available computational resources. In these calculations, the overall execution time with DL_MG was not significantly greater than the time required for calculations using a conventional FFT-based solver.
Probabilistic Logical Characterization
DEFF Research Database (Denmark)
Hermanns, Holger; Parma, Augusto; Segala, Roberto
2011-01-01
Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance...... modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....
Conditional Probabilistic Population Forecasting
Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.
2003-01-01
Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers it allows them to answer "what if"...
Conditional probabilistic population forecasting
Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang
2003-01-01
Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...
Conditional Probabilistic Population Forecasting
Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang
2004-01-01
Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...
Square root approximation to the poisson channel
Tsiatmas, A.; Willems, F.M.J.; Baggen, C.P.M.J.
2013-01-01
Starting from the Poisson model we present a channel model for optical communications, called the Square Root (SR) Channel, in which the noise is additive Gaussian with constant variance. Initially, we prove that for large peak or average power, the transmission rate of a Poisson Channel when coding
A Seemingly Unrelated Poisson Regression Model
King, Gary
1989-01-01
This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.
Poisson geometry from a Dirac perspective
Meinrenken, Eckhard
2018-03-01
We present proofs of classical results in Poisson geometry using techniques from Dirac geometry. This article is based on mini-courses at the Poisson summer school in Geneva, June 2016, and at the workshop Quantum Groups and Gravity at the University of Waterloo, April 2016.
Associative and Lie deformations of Poisson algebras
Remm, Elisabeth
2011-01-01
Considering a Poisson algebra as a non associative algebra satisfying the Markl-Remm identity, we study deformations of Poisson algebras as deformations of this non associative algebra. This gives a natural interpretation of deformations which preserves the underlying associative structure and we study deformations which preserve the underlying Lie algebra.
Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J
2014-01-01
The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm(-2). The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg(-1) body weight day(-1) for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg(-1) body weight day(-1). These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the
Probabilistic Analysis Methods for Hybrid Ventilation
DEFF Research Database (Denmark)
Brohus, Henrik; Frier, Christian; Heiselberg, Per
This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...... of stochastic differential equations is presented comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions....
Xie, Jianping; Marano, Kristin M; Wilson, Cody L; Liu, Huimin; Gan, Huamin; Xie, Fuwei; Naufal, Ziad S
2012-03-01
The chemical and physical complexity of cigarette mainstream smoke (MSS) presents a challenge in the understanding of risk for smoking-related diseases. Quantitative risk assessment is a useful tool for assessing the toxicological risks that may be presented by smoking currently available commercial cigarettes. In this study, yields of a selected group of chemical constituents were quantified in machine-generated MSS from 30 brands of cigarettes sold in China. Using constituent yields, exposure estimates specific to and representative of the Chinese population, and available dose-response data, a Monte Carlo method was applied to simulate probability distributions for incremental lifetime cancer risk (ILCR), hazard quotient (HQ), and margin of exposure (MOE) values for each constituent as appropriate. Measures of central tendency were extracted from the outcome distributions and constituents were ranked according to these three risk assessment indices. The constituents for which ILCR >10(-4), HQ >1, and MOE risk contributed by each MSS constituent, this approach provides a plausible and objective framework for the prioritization of toxicants in cigarette smoke and is valuable in guiding tobacco risk management. Copyright Â© 2011 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Lusi Eka Afri
2017-03-01
Full Text Available Regresi Binomial Negatif dan regresi Conway-Maxwell-Poisson merupakan solusi untuk mengatasi overdispersi pada regresi Poisson. Kedua model tersebut merupakan perluasan dari model regresi Poisson. Menurut Hinde dan Demetrio (2007, terdapat beberapa kemungkinan terjadi overdispersi pada regresi Poisson yaitu keragaman hasil pengamatan keragaman individu sebagai komponen yang tidak dijelaskan oleh model, korelasi antar respon individu, terjadinya pengelompokan dalam populasi dan peubah teramati yang dihilangkan. Akibatnya dapat menyebabkan pendugaan galat baku yang terlalu rendah dan akan menghasilkan pendugaan parameter yang bias ke bawah (underestimate. Penelitian ini bertujuan untuk membandingan model Regresi Binomial Negatif dan model regresi Conway-Maxwell-Poisson (COM-Poisson dalam mengatasi overdispersi pada data distribusi Poisson berdasarkan statistik uji devians. Data yang digunakan dalam penelitian ini terdiri dari dua sumber data yaitu data simulasi dan data kasus terapan. Data simulasi yang digunakan diperoleh dengan membangkitkan data berdistribusi Poisson yang mengandung overdispersi dengan menggunakan bahasa pemrograman R berdasarkan karakteristik data berupa , peluang munculnya nilai nol (p serta ukuran sampel (n. Data dibangkitkan berguna untuk mendapatkan estimasi koefisien parameter pada regresi binomial negatif dan COM-Poisson. Kata Kunci: overdispersi, regresi binomial negatif, regresi Conway-Maxwell-Poisson Negative binomial regression and Conway-Maxwell-Poisson regression could be used to overcome over dispersion on Poisson regression. Both models are the extension of Poisson regression model. According to Hinde and Demetrio (2007, there will be some over dispersion on Poisson regression: observed variance in individual variance cannot be described by a model, correlation among individual response, and the population group and the observed variables are eliminated. Consequently, this can lead to low standard error
A twisted generalization of Novikov-Poisson algebras
Yau, Donald
2010-01-01
Hom-Novikov-Poisson algebras, which are twisted generalizations of Novikov-Poisson algebras, are studied. Hom-Novikov-Poisson algebras are shown to be closed under tensor products and several kinds of twistings. Necessary and sufficient conditions are given under which Hom-Novikov-Poisson algebras give rise to Hom-Poisson algebras.
Speech parts as Poisson processes.
Badalamenti, A F
2001-09-01
This paper presents evidence that six of the seven parts of speech occur in written text as Poisson processes, simple or recurring. The six major parts are nouns, verbs, adjectives, adverbs, prepositions, and conjunctions, with the interjection occurring too infrequently to support a model. The data consist of more than the first 5000 words of works by four major authors coded to label the parts of speech, as well as periods (sentence terminators). Sentence length is measured via the period and found to be normally distributed with no stochastic model identified for its occurrence. The models for all six speech parts but the noun significantly distinguish some pairs of authors and likewise for the joint use of all words types. Any one author is significantly distinguished from any other by at least one word type and sentence length very significantly distinguishes each from all others. The variety of word type use, measured by Shannon entropy, builds to about 90% of its maximum possible value. The rate constants for nouns are close to the fractions of maximum entropy achieved. This finding together with the stochastic models and the relations among them suggest that the noun may be a primitive organizer of written text.
On covariant Poisson brackets in classical field theory
International Nuclear Information System (INIS)
Forger, Michael; Salles, Mário O.
2015-01-01
How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra
On covariant Poisson brackets in classical field theory
Energy Technology Data Exchange (ETDEWEB)
Forger, Michael [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Salles, Mário O. [Instituto de Matemática e Estatística, Universidade de São Paulo, Caixa Postal 66281, BR–05315-970 São Paulo, SP (Brazil); Centro de Ciências Exatas e da Terra, Universidade Federal do Rio Grande do Norte, Campus Universitário – Lagoa Nova, BR–59078-970 Natal, RN (Brazil)
2015-10-15
How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on “multisymplectic Poisson brackets,” together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls–De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic “multisymplectic Poisson bracket” already proposed in the 1970s can be derived from the Peierls–De Witt bracket, applied to a special class of functionals. This relation allows to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.
Poisson's ratio and Young's modulus of lipid bilayers in different phases
Directory of Open Access Journals (Sweden)
Tayebeh eJadidi
2014-04-01
Full Text Available A general computational method is introduced to estimate the Poisson's ratio for membranes with small thickness.In this method, the Poisson's ratio is calculated by utilizing a rescaling of inter-particle distancesin one lateral direction under periodic boundary conditions. As an example for the coarse grained lipid model introduced by Lenz and Schmid, we calculate the Poisson's ratio in the gel, fluid, and interdigitated phases. Having the Poisson's ratio, enable us to obtain the Young's modulus for the membranes in different phases. The approach may be applied to other membranes such as graphene and tethered membranes in orderto predict the temperature dependence of its Poisson's ratio and Young's modulus.
International Nuclear Information System (INIS)
Gachon, P.; Radojevic, M.; Harding, A.; Saad, C.; Nguyen, V.T.V.
2008-01-01
downscaled results in this latter case, in future runs. Finally, we conclude about the need to generate ensemble runs to produce probabilistic climate information, and with the quantification of the cascade of uncertainties from the various sources of GCM and different downscaling approaches. (author)
Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.
Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah
2012-01-01
Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
Constructions and classifications of projective Poisson varieties
Pym, Brent
2018-03-01
This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.
Constructions and classifications of projective Poisson varieties.
Pym, Brent
2018-01-01
This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.
Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic
Directory of Open Access Journals (Sweden)
Gannouni Asmae
2017-01-01
closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (iiProvide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, The behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs is empirically investigated on the multi-objective stochastic VRP problem.
Probabilistic composition of preferences, theory and applications
Parracho Sant'Anna, Annibal
2015-01-01
Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into the evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes, effects analysis and productivity analysis – together with explanations about the application of the concepts involved –this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.
Probabilistic seismic hazard assessment of southern part of Ghana
Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.
2018-05-01
This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
Probabilistic seismic hazard assessment of southern part of Ghana
Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.
2017-12-01
This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Toft, H.S.
2010-01-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....
On terminating Poisson processes in some shock models
Energy Technology Data Exchange (ETDEWEB)
Finkelstein, Maxim, E-mail: FinkelMI@ufs.ac.z [Department of Mathematical Statistics, University of the Free State, Bloemfontein (South Africa); Max Planck Institute for Demographic Research, Rostock (Germany); Marais, Francois, E-mail: fmarais@csc.co [CSC, Cape Town (South Africa)
2010-08-15
A system subject to a point process of shocks is considered. Shocks occur in accordance with the homogeneous Poisson process. Different criteria of system failure (termination) are discussed and the corresponding probabilities of failure (accident)-free performance are derived. The described analytical approach is based on deriving integral equations for each setting and solving these equations through the Laplace transform. Some approximations are analyzed and further generalizations and applications are discussed.
On terminating Poisson processes in some shock models
International Nuclear Information System (INIS)
Finkelstein, Maxim; Marais, Francois
2010-01-01
A system subject to a point process of shocks is considered. Shocks occur in accordance with the homogeneous Poisson process. Different criteria of system failure (termination) are discussed and the corresponding probabilities of failure (accident)-free performance are derived. The described analytical approach is based on deriving integral equations for each setting and solving these equations through the Laplace transform. Some approximations are analyzed and further generalizations and applications are discussed.
Probabilistic approaches to robotic perception
Ferreira, João Filipe
2014-01-01
This book tries to address the following questions: How should the uncertainty and incompleteness inherent to sensing the environment be represented and modelled in a way that will increase the autonomy of a robot? How should a robotic system perceive, infer, decide and act efficiently? These are two of the challenging questions robotics community and robotic researchers have been facing. The development of robotic domain by the 1980s spurred the convergence of automation to autonomy, and the field of robotics has consequently converged towards the field of artificial intelligence (AI). Since the end of that decade, the general public’s imagination has been stimulated by high expectations on autonomy, where AI and robotics try to solve difficult cognitive problems through algorithms developed from either philosophical and anthropological conjectures or incomplete notions of cognitive reasoning. Many of these developments do not unveil even a few of the processes through which biological organisms solve thes...
Probabilistic Approaches to Energy Systems
DEFF Research Database (Denmark)
Iversen, Jan Emil Banning
of renewable energy generation. Particularly we focus on producing forecasting models that can predict renewable energy generation, single user demand, and provide advanced forecast products that are needed for an efficient integration of renewable energy into the power generation mix. Such forecasts can...... integration of renewable energy.Thus forecast products should be developed in unison with the decision making tool as they are two sides of the same overall challenge.......Energy generation from wind and sun is increasing rapidly in many parts of the world. This presents new challenges on how to integrate this uncertain, intermittent and non-dispatchable energy source. This thesis deals with forecasting and decision making in energy systems with a large proportion...
The Poisson equation on Klein surfaces
Directory of Open Access Journals (Sweden)
Monica Rosiu
2016-04-01
Full Text Available We obtain a formula for the solution of the Poisson equation with Dirichlet boundary condition on a region of a Klein surface. This formula reveals the symmetric character of the solution.
Poisson point processes imaging, tracking, and sensing
Streit, Roy L
2010-01-01
This overview of non-homogeneous and multidimensional Poisson point processes and their applications features mathematical tools and applications from emission- and transmission-computed tomography to multiple target tracking and distributed sensor detection.
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
Noncommutative gauge theory for Poisson manifolds
Energy Technology Data Exchange (ETDEWEB)
Jurco, Branislav E-mail: jurco@mpim-bonn.mpg.de; Schupp, Peter E-mail: schupp@theorie.physik.uni-muenchen.de; Wess, Julius E-mail: wess@theorie.physik.uni-muenchen.de
2000-09-25
A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem.
Noncommutative gauge theory for Poisson manifolds
International Nuclear Information System (INIS)
Jurco, Branislav; Schupp, Peter; Wess, Julius
2000-01-01
A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem
Principles of applying Poisson units in radiology
International Nuclear Information System (INIS)
Benyumovich, M.S.
2000-01-01
The probability that radioactive particles hit particular space patterns (e.g. cells in the squares of a count chamber net) and time intervals (e.g. radioactive particles hit a given area per time unit) follows the Poisson distribution. The mean is the only parameter from which all this distribution depends. A metrological base of counting the cells and radioactive particles is a property of the Poisson distribution assuming equality of a standard deviation to a root square of mean (property 1). The application of Poisson units in counting of blood formed elements and cultured cells was proposed by us (Russian Federation Patent No. 2126230). Poisson units relate to the means which make the property 1 valid. In a case of cells counting, the square of these units is equal to 1/10 of one of count chamber net where they count the cells. Thus one finds the means from the single cell count rate divided by 10. Finding the Poisson units when counting the radioactive particles should assume determination of a number of these particles sufficient to make equality 1 valid. To this end one should subdivide a time interval used in counting a single particle count rate into different number of equal portions (count numbers). Next one should pick out the count number ensuring the satisfaction of equality 1. Such a portion is taken as a Poisson unit in the radioactive particles count. If the flux of particles is controllable one should set up a count rate sufficient to make equality 1 valid. Operations with means obtained by with the use of Poisson units are performed on the base of approximation of the Poisson distribution by a normal one. (author)
Multivariate fractional Poisson processes and compound sums
Beghin, Luisa; Macci, Claudio
2015-01-01
In this paper we present multivariate space-time fractional Poisson processes by considering common random time-changes of a (finite-dimensional) vector of independent classical (non-fractional) Poisson processes. In some cases we also consider compound processes. We obtain some equations in terms of some suitable fractional derivatives and fractional difference operators, which provides the extension of known equations for the univariate processes.
Probabilistic Open Set Recognition
Jain, Lalit Prithviraj
Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak adhoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
Probabilistic programmable quantum processors
International Nuclear Information System (INIS)
Buzek, V.; Ziman, M.; Hillery, M.
2004-01-01
We analyze how to improve performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that an arbitrary SU(2) transformations of qubits can be encoded in program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
Quantization of the Poisson SU(2) and its Poisson homogeneous space - the 2-sphere
International Nuclear Information System (INIS)
Sheu, A.J.L.
1991-01-01
We show that deformation quantizations of the Poisson structures on the Poisson Lie group SU(2) and its homogeneous space, the 2-sphere, are compatible with Woronowicz's deformation quantization of SU(2)'s group structure and Podles' deformation quantization of 2-sphere's homogeneous structure, respectively. So in a certain sense the multiplicativity of the Lie Poisson structure on SU(2) at the classical level is preserved under quantization. (orig.)
International Nuclear Information System (INIS)
Christian, Robby; Kang, Hyun Gook
2017-01-01
This paper proposes a methodology to assess and reduce risks of maritime spent nuclear fuel transportation with a probabilistic approach. Event trees detailing the progression of collisions leading to transport casks’ damage were constructed. Parallel and crossing collision probabilities were formulated based on the Poisson distribution. Automatic Identification System (AIS) data were processed with the Hough Transform algorithm to estimate possible intersections between the shipment route and the marine traffic. Monte Carlo simulations were done to compute collision probabilities and impact energies at each intersection. Possible safety improvement measures through a proper selection of operational transport parameters were investigated. These parameters include shipment routes, ship's cruise velocity, number of transport casks carried in a shipment, the casks’ stowage configuration and loading order on board the ship. A shipment case study is presented. Waters with high collision probabilities were identified. Effective range of cruising velocity to reduce collision risks were discovered. The number of casks in a shipment and their stowage method which gave low cask damage frequencies were obtained. The proposed methodology was successful in quantifying ship collision and cask damage frequency. It was effective in assisting decision making processes to minimize risks in maritime spent nuclear fuel transportation. - Highlights: • Proposes a probabilistic framework on the safety of spent nuclear fuel transportation by sea. • Developed a marine traffic simulation model using Generalized Hough Transform (GHT) algorithm. • A transportation case study on South Korean waters is presented. • Single-vessel risk reduction method is outlined by optimizing transport parameters.
Energy Technology Data Exchange (ETDEWEB)
Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.; Clifton Johnson, L.; Beerman, Lori C.; Williams, Benjamin F. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Hogg, David W.; Foreman-Mackey, Daniel T. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Rix, Hans-Walter; Gouliermis, Dimitrios [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Dolphin, Andrew E. [Raytheon Company, 1151 East Hermans Road, Tucson, AZ 85756 (United States); Lang, Dustin [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Bell, Eric F. [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48109 (United States); Gordon, Karl D.; Kalirai, Jason S. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Skillman, Evan D., E-mail: dweisz@astro.washington.edu [Minnesota Institute for Astrophysics, University of Minnesota, 116 Church Street SE, Minneapolis, MN 55455 (United States)
2013-01-10
We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M {approx}> 1 M {sub Sun }). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, {alpha}, are unbiased and that the uncertainty, {Delta}{alpha}, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on {alpha}, and provide an analytic approximation for {Delta}{alpha} as a function of the observed number of stars and mass range. Comparison with literature studies shows that {approx}3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield ({alpha}) = 2.46, with a 1{sigma} dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the
Weisz, Daniel R.; Fouesneau, Morgan; Hogg, David W.; Rix, Hans-Walter; Dolphin, Andrew E.; Dalcanton, Julianne J.; Foreman-Mackey, Daniel T.; Lang, Dustin; Johnson, L. Clifton; Beerman, Lori C.; Bell, Eric F.; Gordon, Karl D.; Gouliermis, Dimitrios; Kalirai, Jason S.; Skillman, Evan D.; Williams, Benjamin F.
2013-01-01
We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M >~ 1 M ⊙). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ~3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield langαrang = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the completeness for stars of a given mass. The precision on MF
International Nuclear Information System (INIS)
Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.; Clifton Johnson, L.; Beerman, Lori C.; Williams, Benjamin F.; Hogg, David W.; Foreman-Mackey, Daniel T.; Rix, Hans-Walter; Gouliermis, Dimitrios; Dolphin, Andrew E.; Lang, Dustin; Bell, Eric F.; Gordon, Karl D.; Kalirai, Jason S.; Skillman, Evan D.
2013-01-01
We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M ∼> 1 M ☉ ). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ∼3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield (α) = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the completeness for stars of a given mass. The precision on MF
Differential expression analysis for RNAseq using Poisson mixed models.
Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang
2017-06-20
Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n <15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three data compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Bouleau, Nicolas
2015-01-01
A simplified approach to Malliavin calculus adapted to Poisson random measures is developed and applied in this book. Called the “lent particle method” it is based on perturbation of the position of particles. Poisson random measures describe phenomena involving random jumps (for instance in mathematical finance) or the random distribution of particles (as in statistical physics). Thanks to the theory of Dirichlet forms, the authors develop a mathematical tool for a quite general class of random Poisson measures and significantly simplify computations of Malliavin matrices of Poisson functionals. The method gives rise to a new explicit calculus that they illustrate on various examples: it consists in adding a particle and then removing it after computing the gradient. Using this method, one can establish absolute continuity of Poisson functionals such as Lévy areas, solutions of SDEs driven by Poisson measure and, by iteration, obtain regularity of laws. The authors also give applications to error calcul...
Probabilistic Infinite Secret Sharing
Csirmaz, László
2013-01-01
The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...
Probabilistic Programming (Invited Talk)
Yang, Hongseok
2017-01-01
Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...
Application of Poisson random effect models for highway network screening.
Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer
2014-02-01
In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data became popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) were adopted as a measure of the crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of different approaches. These tests include the previously developed site consistence test, method consistence test, total rank difference test, and the modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model that with only temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in the fitting of crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.
Cellular solutions for the Poisson equation in extended systems
International Nuclear Information System (INIS)
Zhang, X.; Butler, W.H.; MacLaren, J.M.; van Ek, J.
1994-01-01
The Poisson equation for the electrostatic potential in a solid is solved using three different cellular techniques. The relative merits of these different approaches are discussed for two test charge densities for which an analytic solution to the Poisson equation is known. The first approach uses full-cell multiple-scattering theory and results in the famililar structure constant and multipole moment expansion. This solution is shown to be valid everywhere inside the cell, although for points outside the muffin-tin sphere but inside the cell the sums must be performed in the correct order to yield meaningful results. A modification of the multiple-scattering-theory approach yields a second method, a Green-function cellular method, which only requires the solution of a nearest-neighbor linear system of equations. A third approach, a related variational cellular method, is also derived. The variational cellular approach is shown to be the most accurate and reliable, and to have the best convergence in angular momentum of the three methods. Coulomb energies accurate to within 10 -6 hartree are easily achieved with the variational cellular approach, demonstrating the practicality of the approach in electronic structure calculations
Hyperbolically Patterned 3D Graphene Metamaterial with Negative Poisson's Ratio and Superelasticity.
Zhang, Qiangqiang; Xu, Xiang; Lin, Dong; Chen, Wenli; Xiong, Guoping; Yu, Yikang; Fisher, Timothy S; Li, Hui
2016-03-16
A hyperbolically patterned 3D graphene metamaterial (GM) with negative Poisson's ratio and superelasticity is highlighted. It is synthesized by a modified hydrothermal approach and subsequent oriented freeze-casting strategy. GM presents a tunable Poisson's ratio by adjusting the structural porosity, macroscopic aspect ratio (L/D), and freeze-casting conditions. Such a GM suggests promising applications as soft actuators, sensors, robust shock absorbers, and environmental remediation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Caglar, Mehmet Umut; Pal, Ranadip
2011-03-01
Central dogma of molecular biology states that ``information cannot be transferred back from protein to either protein or nucleic acid''. However, this assumption is not exactly correct in most of the cases. There are a lot of feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of cell level data and probabilistic - nonlinear nature of interactions. Several models widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models give probabilistic nature of the interactions in a detailed manner, with a high calculation cost. On the other hand Probabilistic Boolean Network (PBN) models give a coarse scale picture of the stochastic processes, with a less calculation cost. Differential Equation (DE) models give the time evolution of mean values of processes in a highly cost effective way. The understanding of the relations between the predictions of these models is important to understand the reliability of the simulations of genetic regulatory networks. In this work the success of the mapping between SME, PBN and DE models is analyzed and the accuracy and affectivity of the control policies generated by using PBN and DE models is compared.
Probabilistic Modeling and Visualization for Bankruptcy Prediction
DEFF Research Database (Denmark)
Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara
2017-01-01
In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, specially under scenarios of financial crisis, is known to be complicated. Although there have been many successful...... studies on bankruptcy detection, seldom probabilistic approaches were carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR......). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.
Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio
2014-11-24
The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying to some real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than are conditional logistic analyses and can be fitted to larger data sets than possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine
High order Poisson Solver for unbounded flows
DEFF Research Database (Denmark)
Hejlesen, Mads Mølholm; Rasmussen, Johannes Tophøj; Chatelain, Philippe
2015-01-01
This paper presents a high order method for solving the unbounded Poisson equation on a regular mesh using a Green’s function solution. The high order convergence was achieved by formulating mollified integration kernels, that were derived from a filter regularisation of the solution field....... The method was implemented on a rectangular domain using fast Fourier transforms (FFT) to increase computational efficiency. The Poisson solver was extended to directly solve the derivatives of the solution. This is achieved either by including the differential operator in the integration kernel...... the equations of fluid mechanics as an example, but can be used in many physical problems to solve the Poisson equation on a rectangular unbounded domain. For the two-dimensional case we propose an infinitely smooth test function which allows for arbitrary high order convergence. Using Gaussian smoothing...
Poisson-Jacobi reduction of homogeneous tensors
International Nuclear Information System (INIS)
Grabowski, J; Iglesias, D; Marrero, J C; Padron, E; Urbanski, P
2004-01-01
The notion of homogeneous tensors is discussed. We show that there is a one-to-one correspondence between multivector fields on a manifold M, homogeneous with respect to a vector field Δ on M, and first-order polydifferential operators on a closed submanifold N of codimension 1 such that Δ is transversal to N. This correspondence relates the Schouten-Nijenhuis bracket of multivector fields on M to the Schouten-Jacobi bracket of first-order polydifferential operators on N and generalizes the Poissonization of Jacobi manifolds. Actually, it can be viewed as a super-Poissonization. This procedure of passing from a homogeneous multivector field to a first-order polydifferential operator can also be understood as a sort of reduction; in the standard case-a half of a Poisson reduction. A dual version of the above correspondence yields in particular the correspondence between Δ-homogeneous symplectic structures on M and contact structures on N
The BRST complex of homological Poisson reduction
Müller-Lennert, Martin
2017-02-01
BRST complexes are differential graded Poisson algebras. They are associated with a coisotropic ideal J of a Poisson algebra P and provide a description of the Poisson algebra (P/J)^J as their cohomology in degree zero. Using the notion of stable equivalence introduced in Felder and Kazhdan (Contemporary Mathematics 610, Perspectives in representation theory, 2014), we prove that any two BRST complexes associated with the same coisotropic ideal are quasi-isomorphic in the case P = R[V] where V is a finite-dimensional symplectic vector space and the bracket on P is induced by the symplectic structure on V. As a corollary, the cohomology of the BRST complexes is canonically associated with the coisotropic ideal J in the symplectic case. We do not require any regularity assumptions on the constraints generating the ideal J. We finally quantize the BRST complex rigorously in the presence of infinitely many ghost variables and discuss the uniqueness of the quantization procedure.
Evaluating the double Poisson generalized linear model.
Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique
2013-10-01
The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant) which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similar to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that it is simpler to interpret coefficients, it offers a flexible and efficient alternative for researchers to model count data. Copyright © 2013 Elsevier Ltd. All rights reserved.
Bayesian regression of piecewise homogeneous Poisson processes
Directory of Open Access Journals (Sweden)
Diego Sevilla
2015-12-01
Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting processes data distributed as Poisson. A numerical code in Mathematica is developed and tested analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia.; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015
Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona
2016-06-01
Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
A generalized right truncated bivariate Poisson regression model with applications to health data.
Islam, M Ataharul; Chowdhury, Rafiqul I
2017-01-01
A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over or under dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.
An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.
Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying
2013-09-01
Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing. Although many promising algorithms have been proposed for multidimensional sampling in euclidean space, very few studies have been reported with regard to the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in IR(n). To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
International Nuclear Information System (INIS)
Maddahi, J.; Prigent, F.; Staniloff, H.; Garcia, E.; Becerra, A.; Van Train, K.; Swan, H.J.C.; Waxman, A.; Berman, D.
1985-01-01
Probabilistic criteria for abnormality would enhance application of stress-redistribution Tl-201 rotational tomography (tomo) for evaluation of coronary artery disease (CAD). Thus, 91 pts were studied, of whom 45 had angiographic CAD (≥ 50% coronary narrowing) and 46 were normal (nl). The validity of this model was prospectively tested in the remaining 51 pts (26 nls and 25 with CAD) by comparing the predicted and observed likelihood of CAD in four subgroups (I-IV). In this paper a logistic model is developed and validated that assigns a CAD likelihood to the quantified size of tomograhic myocardial perfusion defects
Formalizing Probabilistic Safety Claims
Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.
2011-01-01
A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...
A Probabilistic Framework for Curve Evolution
DEFF Research Database (Denmark)
Dahl, Vedrana Andersen
2017-01-01
approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...
Probabilistic Durability Analysis in Advanced Engineering Design
Directory of Open Access Journals (Sweden)
A. Kudzys
2000-01-01
Full Text Available Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. Analysis can be used for probabilistic durability assessment of carrying and enclosure metal, reinforced concrete, wood, plastic, masonry both homogeneous and sandwich or composite structures and some kinds of equipments. Analysis models can be applied in other engineering fields.
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian
2016-01-01
We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisﬁability checking by establishing the small model property. An algorithm for deciding the satisﬁability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...
Poisson processes and a Bessel function integral
Steutel, F.W.
1985-01-01
The probability of winning a simple game of competing Poisson processes turns out to be equal to the well-known Bessel function integral J(x, y) (cf. Y. L. Luke, Integrals of Bessel Functions, McGraw-Hill, New York, 1962). Several properties of J, some of which seem to be new, follow quite easily
Almost Poisson integration of rigid body systems
International Nuclear Information System (INIS)
Austin, M.A.; Krishnaprasad, P.S.; Li-Sheng Wang
1993-01-01
In this paper we discuss the numerical integration of Lie-Poisson systems using the mid-point rule. Since such systems result from the reduction of hamiltonian systems with symmetry by lie group actions, we also present examples of reconstruction rules for the full dynamics. A primary motivation is to preserve in the integration process, various conserved quantities of the original dynamics. A main result of this paper is an O(h 3 ) error estimate for the Lie-Poisson structure, where h is the integration step-size. We note that Lie-Poisson systems appear naturally in many areas of physical science and engineering, including theoretical mechanics of fluids and plasmas, satellite dynamics, and polarization dynamics. In the present paper we consider a series of progressively complicated examples related to rigid body systems. We also consider a dissipative example associated to a Lie-Poisson system. The behavior of the mid-point rule and an associated reconstruction rule is numerically explored. 24 refs., 9 figs
Measuring Poisson Ratios at Low Temperatures
Boozon, R. S.; Shepic, J. A.
1987-01-01
Simple extensometer ring measures bulges of specimens in compression. New method of measuring Poisson's ratio used on brittle ceramic materials at cryogenic temperatures. Extensometer ring encircles cylindrical specimen. Four strain gauges connected in fully active Wheatstone bridge self-temperature-compensating. Used at temperatures as low as liquid helium.
Affine Poisson Groups and WZW Model
Directory of Open Access Journals (Sweden)
Ctirad Klimcík
2008-01-01
Full Text Available We give a detailed description of a dynamical system which enjoys a Poisson-Lie symmetry with two non-isomorphic dual groups. The system is obtained by taking the q → ∞ limit of the q-deformed WZW model and the understanding of its symmetry structure results in uncovering an interesting duality of its exchange relations.
Easy Demonstration of the Poisson Spot
Gluck, Paul
2010-01-01
Many physics teachers have a set of slides of single, double and multiple slits to show their students the phenomena of interference and diffraction. Thomas Young's historic experiments with double slits were indeed a milestone in proving the wave nature of light. But another experiment, namely the Poisson spot, was also important historically and…
Quantum fields and Poisson processes. Pt. 2
International Nuclear Information System (INIS)
Bertrand, J.; Gaveau, B.; Rideau, G.
1985-01-01
Quantum field evolutions are written as expectation values with respect to Poisson processes in two simple models; interaction of two boson fields (with conservation of the number of particles in one field) and interaction of a boson with a fermion field. The introduction of a cutt-off ensures that the expectation values are well-defined. (orig.)
Evolutionary inference via the Poisson Indel Process.
Bouchard-Côté, Alexandre; Jordan, Michael I
2013-01-22
We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
Natural Poisson structures of nonlinear plasma dynamics
International Nuclear Information System (INIS)
Kaufman, A.N.
1982-01-01
Hamiltonian field theories, for models of nonlinear plasma dynamics, require a Poisson bracket structure for functionals of the field variables. These are presented, applied, and derived for several sets of field variables: coherent waves, incoherent waves, particle distributions, and multifluid electrodynamics. Parametric coupling of waves and plasma yields concise expressions for ponderomotive effects (in kinetic and fluid models) and for induced scattering. (Auth.)
Poisson brackets for fluids and plasmas
International Nuclear Information System (INIS)
Morrison, P.J.
1982-01-01
Noncanonical yet Hamiltonian descriptions are presented of many of the non-dissipative field equations that govern fluids and plasmas. The dynamical variables are the usually encountered physical variables. These descriptions have the advantage that gauge conditions are absent, but at the expense of introducing peculiar Poisson brackets. Clebsch-like potential descriptions that reverse this situations are also introduced
Natural Poisson structures of nonlinear plasma dynamics
International Nuclear Information System (INIS)
Kaufman, A.N.
1982-06-01
Hamiltonian field theories, for models of nonlinear plasma dynamics, require a Poisson bracket structure for functionals of the field variables. These are presented, applied, and derived for several sets of field variables: coherent waves, incoherent waves, particle distributions, and multifluid electrodynamics. Parametric coupling of waves and plasma yields concise expressions for ponderomotive effects (in kinetic and fluid models) and for induced scattering
Coherent transform, quantization, and Poisson geometry
Novikova, E; Itskov, V; Karasev, M V
1998-01-01
This volume contains three extensive articles written by Karasev and his pupils. Topics covered include the following: coherent states and irreducible representations for algebras with non-Lie permutation relations, Hamilton dynamics and quantization over stable isotropic submanifolds, and infinitesimal tensor complexes over degenerate symplectic leaves in Poisson manifolds. The articles contain many examples (including from physics) and complete proofs.
Efficient information transfer by Poisson neurons
Czech Academy of Sciences Publication Activity Database
Košťál, Lubomír; Shinomoto, S.
2016-01-01
Roč. 13, č. 3 (2016), s. 509-520 ISSN 1547-1063 R&D Projects: GA ČR(CZ) GA15-08066S Institutional support: RVO:67985823 Keywords : information capacity * Poisson neuron * metabolic cost * decoding error Subject RIV: BD - Theory of Information Impact factor: 1.035, year: 2016
Collision prediction models using multivariate Poisson-lognormal regression.
El-Basyouny, Karim; Sayed, Tarek
2009-07-01
This paper advocates the use of multivariate Poisson-lognormal (MVPLN) regression to develop models for collision count data. The MVPLN approach presents an opportunity to incorporate the correlations across collision severity levels and their influence on safety analyses. The paper introduces a new multivariate hazardous location identification technique, which generalizes the univariate posterior probability of excess that has been commonly proposed and applied in the literature. In addition, the paper presents an alternative approach for quantifying the effect of the multivariate structure on the precision of expected collision frequency. The MVPLN approach is compared with the independent (separate) univariate Poisson-lognormal (PLN) models with respect to model inference, goodness-of-fit, identification of hot spots and precision of expected collision frequency. The MVPLN is modeled using the WinBUGS platform which facilitates computation of posterior distributions as well as providing a goodness-of-fit measure for model comparisons. The results indicate that the estimates of the extra Poisson variation parameters were considerably smaller under MVPLN leading to higher precision. The improvement in precision is due mainly to the fact that MVPLN accounts for the correlation between the latent variables representing property damage only (PDO) and injuries plus fatalities (I+F). This correlation was estimated at 0.758, which is highly significant, suggesting that higher PDO rates are associated with higher I+F rates, as the collision likelihood for both types is likely to rise due to similar deficiencies in roadway design and/or other unobserved factors. In terms of goodness-of-fit, the MVPLN model provided a superior fit than the independent univariate models. The multivariate hazardous location identification results demonstrated that some hazardous locations could be overlooked if the analysis was restricted to the univariate models.
Scalable group level probabilistic sparse factor analysis
DEFF Research Database (Denmark)
Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard
2017-01-01
Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
Probabilistic Design of Wave Energy Devices
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.
2011-01-01
Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost competitive and reliable solutions. But the sector has already demonstrated several proofs of concepts. The design of wave energy...... devices is a new and expanding technical area where there is no tradition for probabilistic design—in fact very little full scale devices has been build to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...
Probabilistic Flood Defence Assessment Tools
Directory of Open Access Journals (Sweden)
Slomp Robert
2016-01-01
institutions managing flood the defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much its adds to the level of flood protection and to evaluate if it is really worthwhile. Please note: The Netherlands
Probabilistic systems coalgebraically: A survey
Sokolova, Ana
2011-01-01
We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490
Identifying traffic accident black spots with Poisson-Tweedie models
DEFF Research Database (Denmark)
Debrabant, Birgit; Halekoh, Ulrich; Bonat, Wagner Hugo
2018-01-01
This paper aims at the identification of black spots for traffic accidents, i.e. locations with accident counts beyond what is usual for similar locations, using spatially and temporally aggregated hospital records from Funen, Denmark. Specifically, we apply an autoregressive Poisson-Tweedie model...... considered calendar years and calculated by simulations a probability of p=0.03 for these to be chance findings. Altogether, our results recommend these sites for further investigation and suggest that our simple approach could play a role in future area based traffic accident prevention planning....
Improving EWMA Plans for Detecting Unusual Increases in Poisson Counts
Directory of Open Access Journals (Sweden)
R. S. Sparks
2009-01-01
adaptive exponentially weighted moving average (EWMA plan is developed for signalling unusually high incidence when monitoring a time series of nonhomogeneous daily disease counts. A Poisson transitional regression model is used to fit background/expected trend in counts and provides “one-day-ahead” forecasts of the next day's count. Departures of counts from their forecasts are monitored. The paper outlines an approach for improving early outbreak data signals by dynamically adjusting the exponential weights to be efficient at signalling local persistent high side changes. We emphasise outbreak signals in steady-state situations; that is, changes that occur after the EWMA statistic had run through several in-control counts.
Confluence reduction for probabilistic systems
Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette
In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To
Tetrahedral meshing via maximal Poisson-disk sampling
Guo, Jianwei
2016-02-15
In this paper, we propose a simple yet effective method to generate 3D-conforming tetrahedral meshes from closed 2-manifold surfaces. Our approach is inspired by recent work on maximal Poisson-disk sampling (MPS), which can generate well-distributed point sets in arbitrary domains. We first perform MPS on the boundary of the input domain, we then sample the interior of the domain, and we finally extract the tetrahedral mesh from the samples by using 3D Delaunay or regular triangulation for uniform or adaptive sampling, respectively. We also propose an efficient optimization strategy to protect the domain boundaries and to remove slivers to improve the meshing quality. We present various experimental results to illustrate the efficiency and the robustness of our proposed approach. We demonstrate that the performance and quality (e.g., minimal dihedral angle) of our approach are superior to current state-of-the-art optimization-based approaches.
Bergstra, J.A.; Middelburg, C.A.
2015-01-01
We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution
Probabilistic simple sticker systems
Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod
2017-04-01
A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled DNA computing, sticker systems and universality from the journal of Acta Informatica vol. 35, pp. 401-420 in the year 1998. A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from the incomplete double stranded sequences, and iteratively using sticking operations until a complete double stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, the probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
Visualizing Probabilistic Proof
Guerra-Pujol, Enrique
2015-01-01
The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte
2008-01-01
This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
Transitive probabilistic CLIR models.
Kraaij, W.; de Jong, Franciska M.G.
2004-01-01
Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The
Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.
Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai
2011-01-01
Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.