Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2017-01-01
Subjective beliefs are elicited routinely in economics experiments. However, such elicitation often suffers from two possible disadvantages. First, beliefs are recovered in the form of a summary statistic, usually the mean, of the underlying latent distribution. Second, recovered beliefs are bias...
Eliciting Subjective Probability Distributions with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2015-01-01
We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment....
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...
Univariate Probability Distributions
Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.
2012-01-01
We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…
Superpositions of probability distributions.
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
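The variance-superposition idea in this abstract can be illustrated numerically. In the sketch below (an assumption for illustration, not the paper's derivation) the variance v is drawn from an exponential density, which is known to smear a Gaussian into a Laplace law; the heavy tail shows up as excess kurtosis (≈6 for Laplace, versus 3 for a pure Gaussian).

```python
import math
import random

# Superposing Gaussians of random variance v produces heavy-tailed laws.
# Here v ~ Exponential(1), an illustrative choice of smearing density,
# which yields a Laplace distribution for the smeared variable.
rng = random.Random(1)

def smeared_sample():
    v = rng.expovariate(1.0)            # random variance drawn from the smearing density
    return rng.gauss(0.0, math.sqrt(v))  # Gaussian sample with that variance

xs = [smeared_sample() for _ in range(50000)]
m2 = sum(x * x for x in xs) / len(xs)
m4 = sum(x**4 for x in xs) / len(xs)
kurtosis = m4 / m2**2
print(kurtosis)  # close to 6 (Laplace), well above the Gaussian value 3
```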
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...
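The Quadratic Scoring Rule used in this elicitation task can be sketched numerically. The 0-to-1 payoff scale and the reporting grid below are illustrative assumptions, not the experiment's actual stakes; the point is that a risk-neutral agent maximizes expected payoff by reporting the true belief.

```python
# Binary Quadratic Scoring Rule (QSR): payoff falls with the squared
# distance between the reported probability and the realized outcome.

def qsr_payoff(report, event_occurred):
    """Quadratic score for a reported probability of a binary event."""
    outcome = 1.0 if event_occurred else 0.0
    return 1.0 - (outcome - report) ** 2

def expected_payoff(report, true_p):
    """Expected QSR payoff for a risk-neutral agent with belief true_p."""
    return true_p * qsr_payoff(report, True) + (1 - true_p) * qsr_payoff(report, False)

# Truthful reporting maximizes the expected score for a risk-neutral agent.
true_p = 0.3
grid = [i / 100 for i in range(101)]
best = max(grid, key=lambda r: expected_payoff(r, true_p))
print(best)  # 0.3
```

Risk aversion distorts this argmax away from the true belief, which is exactly why the binary lottery procedure (paying in lottery tickets rather than money) is used to restore linear utility.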
Five-Parameter Bivariate Probability Distribution
Tubbs, J.; Brewer, D.; Smith, O. W.
1986-01-01
This NASA technical memorandum presents four papers on a five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of phenomena such as wind gusts. The class provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
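For small alphabets the guessing calculation in this abstract needs no approximation at all; it can be done by brute force. The three-letter alphabet and its probabilities below are invented for the demo, using the paper's first-order (independent-letter) approximation.

```python
import itertools
import math

# Words are guessed in decreasing order of probability; the average
# number of guesses is sum over ranks of rank * P(word at that rank).
# Alphabet and probabilities are made up for this toy illustration.
letter_p = {"a": 0.5, "b": 0.3, "c": 0.2}
word_len = 3

# First-order approximation: letters are independent, so a word's
# probability is the product of its letter probabilities.
word_probs = sorted(
    (math.prod(letter_p[c] for c in w)
     for w in itertools.product(letter_p, repeat=word_len)),
    reverse=True,
)

avg_guesses = sum((rank + 1) * p for rank, p in enumerate(word_probs))
print(avg_guesses)
```

For realistic alphabet and word sizes the number of candidate words explodes, which is precisely why the paper develops approximations.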
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...
A practical overview on probability distributions
Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca
2015-03-01
The aim of this paper is a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we wish to predict; this link is called a probability distribution. Given the characteristics of a phenomenon (which we can also call a variable), a corresponding probability distribution is defined. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Distributions of probability are briefly described together with some examples of their possible applications.
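The three distributions named in this overview (binomial, Poisson, normal) can be sketched with stdlib formulas; the parameter values below are illustrative only. The sketch also shows the classical rare-event link: for large n and small p, Binomial(n, p) is well approximated by Poisson(np).

```python
import math

# Probability mass/density functions for the three distributions
# discussed in the overview, written from their textbook definitions.

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Rare-event approximation: Binomial(1000, 0.002) vs Poisson(lambda = 2).
print(binom_pmf(3, 1000, 0.002))  # ≈ 0.181
print(poisson_pmf(3, 2.0))        # ≈ 0.180
```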
The Multivariate Gaussian Probability Distribution
DEFF Research Database (Denmark)
Ahrendt, Peter
2005-01-01
This technical report intends to gather information about the multivariate gaussian distribution, that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical ...
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate ... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
Multidimensional stationary probability distribution for interacting active particles
National Research Council Canada - National Science Library
Maggi, Claudio; Marconi, Umberto Marini Bettolo; Gnan, Nicoletta; Di Leonardo, Roberto
2015-01-01
We derive the stationary probability distribution for a non-equilibrium system composed by an arbitrary number of degrees of freedom that are subject to Gaussian colored noise and a conservative potential...
Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?
Directory of Open Access Journals (Sweden)
David Howden
2009-10-01
Frequency probability theorists define an event's probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events which we have deficient knowledge about on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions, a process far removed from any application of probability theory.
Inferring Beliefs as Subjectively Imprecise Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2012-01-01
The experimental task consists of a series of standard lottery choices in which the subject is assumed to use conventional risk attitudes to select one lottery or the other, and then a series of betting choices in which the subject is presented with a range of bookies offering odds on the outcome of some event...
On $\\varphi$-families of probability distributions
Rui F. Vigelis; Cavalcante, Charles C.
2011-01-01
We generalize the exponential family of probability distributions. In our approach, the exponential function is replaced by a $\\varphi$-function, resulting in a $\\varphi$-family of probability distributions. We show how $\\varphi$-families are constructed. In a $\\varphi$-family, the analogue of the cumulant-generating function is a normalizing function. We define the $\\varphi$-divergence as the Bregman divergence associated to the normalizing function, providing a generalization of the Kullbac...
How Can Histograms Be Useful for Introducing Continuous Probability Distributions?
Derouet, Charlotte; Parzysz, Bernard
2016-01-01
The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
Learning a Probability Distribution Efficiently and Reliably
Laird, Philip; Gamble, Evan
1988-01-01
A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
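The sampling core of CDF inversion over a finite set can be sketched as follows. The support, probabilities, and sample size are invented for the demo, and the paper's accuracy/confidence (sample-complexity) bounds are not reproduced here; the sketch only shows that inverting the cumulative distribution recovers the target probabilities empirically.

```python
import random
from bisect import bisect_right
from collections import Counter
from itertools import accumulate

# CDF inversion over a finite set: draw u ~ Uniform(0,1) and return the
# first support point whose cumulative probability exceeds u.
support = ["a", "b", "c", "d"]
pmf = [0.1, 0.4, 0.2, 0.3]   # illustrative target distribution
cdf = list(accumulate(pmf))

def draw(rng):
    i = bisect_right(cdf, rng.random())
    return support[min(i, len(support) - 1)]  # guard against float round-off at 1.0

rng = random.Random(0)
n = 20000
counts = Counter(draw(rng) for _ in range(n))
est = [counts[s] / n for s in support]
err = max(abs(e - p) for e, p in zip(est, pmf))
print(err)  # shrinks as n grows
```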
A probability distribution model for rain rate
Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.
1994-01-01
A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits as compared with other members of the family and also with the gamma distribution.
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St...".
An all-timescales rainfall probability distribution
Papalexiou, S. M.; Koutsoyiannis, D.
2009-04-01
The selection of a probability distribution for rainfall intensity at many different timescales simultaneously is of primary interest and importance, as typically the hydraulic design strongly depends on the rainfall model choice. It is well known that the rainfall distribution may have a long tail, is highly skewed at fine timescales, and tends to normality as the timescale increases. This behaviour, explained by the maximum entropy principle (and for large timescales also by the central limit theorem), indicates that the construction of a "universal" probability distribution, capable of adequately describing the rainfall at all timescales, is a difficult task. A search in the hydrological literature confirms this argument, as many different distributions have been proposed as appropriate models for different timescales or even for the same timescale, such as Normal, Skew-Normal, two- and three-parameter Log-Normal, Log-Normal mixtures, Generalized Logistic, Pearson Type III, Log-Pearson Type III, Wakeby, Generalized Pareto, Weibull, three- and four-parameter Kappa distribution, and many more. Here we study a single flexible four-parameter distribution for rainfall intensity (the JH distribution) and derive its basic statistics. This distribution incorporates as special cases many other well known distributions, and is capable of describing rainfall in a great range of timescales. Furthermore, we demonstrate the excellent fitting performance of the distribution in various rainfall samples from different areas and for timescales varying from sub-hourly to annual.
Probability distribution functions in turbulent convection
Balachandar, S.; Sirovich, L.
1991-01-01
Results of an extensive investigation of probability distribution functions (pdf's) for Rayleigh-Benard convection, in the hard turbulence regime, are presented. It is seen that the pdf's exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to universality is presented.
Constraints on probability distributions of grammatical forms
Directory of Open Access Journals (Sweden)
Kostić Aleksandar
2007-01-01
In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm could be extended to other criteria as well; hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of homogeneity of the probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of Serbian, and for each paradigm the relative entropy was calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75 - 0.9. The nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
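The homogeneity measure described here, relative entropy in the sense of Shannon entropy normalized by its maximum (log of the number of forms), is easy to sketch; the frequency counts below are hypothetical, not the study's Serbian corpus data.

```python
import math

def relative_entropy(freqs):
    """Shannon entropy of a frequency vector, normalized so that a
    perfectly uniform distribution over the forms scores 1.0."""
    total = sum(freqs)
    ps = [f / total for f in freqs if f > 0]
    h = -sum(p * math.log2(p) for p in ps)
    return h / math.log2(len(freqs))

# Hypothetical frequency counts of the forms within one paradigm.
print(relative_entropy([10, 10, 10, 10]))  # 1.0: forms equally frequent
print(relative_entropy([70, 15, 10, 5]))   # < 1.0: distribution skewed
```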
Parametric probability distributions for anomalous change detection
Energy Technology Data Exchange (ETDEWEB)
Theiler, James P [Los Alamos National Laboratory; Foy, Bernard R [Los Alamos National Laboratory; Wohlberg, Brendt E [Los Alamos National Laboratory; Scovel, James C [Los Alamos National Laboratory
2010-01-01
The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!
Directory of Open Access Journals (Sweden)
Mark R. Crovelli
2009-06-01
Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.
Probability Distributions for Random Quantum Operations
Schultz, Kevin
Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.
Mixture probability distribution functions to model wind speed distributions
Energy Technology Data Exchange (ETDEWEB)
Kollu, Ravindra; Rayapudi, Srinivasa Rao; Pakkurthi, Krishna Mohan [J.N.T. Univ., Kakinada (India). Dept. of Electrical and Electronics Engineering; Narasimham, S.V.L. [J.N.T. Univ., Andhra Pradesh (India). Computer Science and Engineering Dept.
2012-11-01
Accurate wind speed modeling is critical in estimating wind energy potential for harnessing wind power effectively. The quality of wind speed assessment depends on the capability of the chosen probability density function (PDF) to describe the measured wind speed frequency distribution. The objective of this study is to describe (model) wind speed characteristics using three mixture probability density functions, Weibull-generalized extreme value (GEV), Weibull-lognormal, and GEV-lognormal, which had not been tried before. Statistical parameters such as maximum error in the Kolmogorov-Smirnov test, root mean square error, Chi-square error, coefficient of determination, and power density error are considered as judgment criteria to assess the fitness of the probability density functions. Results indicate that the Weibull-GEV PDF is able to describe unimodal as well as bimodal wind distributions accurately, whereas the GEV-lognormal PDF is able to describe the familiar bell-shaped unimodal distribution well. Results show that mixture probability functions are better alternatives to conventional Weibull, two-component mixture Weibull, gamma, and lognormal PDFs to describe wind speed characteristics.
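A two-component mixture density of the kind fitted in this study can be sketched directly from the component formulas. The weight and parameter values below are illustrative assumptions (not the paper's fitted values), and a Weibull-lognormal pair is used because both have simple closed-form densities; the sanity check verifies the mixture integrates to ~1.

```python
import math

# w * Weibull(k, lam) + (1 - w) * Lognormal(mu, sigma), for x > 0.

def weibull_pdf(x, k, lam):
    return (k / lam) * (x / lam) ** (k - 1) * math.exp(-((x / lam) ** k))

def lognormal_pdf(x, mu, sigma):
    return math.exp(-((math.log(x) - mu) ** 2) / (2 * sigma**2)) / (
        x * sigma * math.sqrt(2 * math.pi)
    )

def mixture_pdf(x, w, k, lam, mu, sigma):
    return w * weibull_pdf(x, k, lam) + (1 - w) * lognormal_pdf(x, mu, sigma)

# Sanity check with a simple Riemann sum over (0, 40] m/s:
# a valid mixture of valid densities must itself integrate to 1.
step = 0.01
xs = [0.01 + i * step for i in range(4000)]
area = sum(mixture_pdf(x, 0.6, 2.0, 6.0, 1.5, 0.4) for x in xs) * step
print(area)  # ≈ 1
```

Fitting would then adjust (w, k, lam, mu, sigma) against a measured wind speed histogram using criteria like those the paper lists (RMSE, Chi-square, R²).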
Audio feature extraction using probability distribution function
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently in biometric and multimedia information retrieval systems. This technology is attained from successive research on audio feature extraction analysis. The probability distribution function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed in which the PDF alone is used as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
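A per-frame PDF feature of the kind this abstract describes can be sketched with a simple amplitude histogram per frame. The frame length, bin count, and synthetic sine signal below are assumptions for illustration; the paper's actual pre-processing is not reproduced.

```python
import math

def frame_pdf(signal, frame_len, n_bins=8):
    """Split a signal into non-overlapping frames and return, for each
    frame, a normalized amplitude histogram (an empirical PDF)."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        lo, hi = min(frame), max(frame)
        width = (hi - lo) / n_bins or 1.0  # guard for constant frames
        hist = [0] * n_bins
        for x in frame:
            idx = min(int((x - lo) / width), n_bins - 1)
            hist[idx] += 1
        feats.append([h / frame_len for h in hist])
    return feats

# Synthetic stand-in for a voiced signal: one sine period per frame.
signal = [math.sin(2 * math.pi * t / 100) for t in range(400)]
feats = frame_pdf(signal, 100)
print(len(feats))     # 4 frames
print(sum(feats[0]))  # each frame's PDF sums to 1
```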
Kinnear, John; Jackson, Ruth
2017-07-01
Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p...). One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.
Support Theory: A Nonextensional Representation of Subjective Probability.
Tversky, Amos; Koehler, Derek J.
1994-01-01
A new theory of subjective probability is presented. According to this theory, different descriptions of the same event can give rise to different judgments. Experimental evidence supporting this theory is summarized, demonstrating that the theory provides a unified treatment of a wide range of empirical findings.
Continuous subjective expected utility with non-additive probabilities
P.P. Wakker (Peter)
1989-01-01
A well-known theorem of Debreu about additive representations of preferences is applied in a non-additive context, to characterize continuous subjective expected utility maximization for the case where the probability measures may be non-additive. The approach of this paper does not need...
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Finite n Largest Eigenvalue Probability Distribution Function of Gaussian Ensembles
Choup, Leonard N.
2011-01-01
In this paper we focus on the finite n probability distribution function of the largest eigenvalue in the classical Gaussian Ensemble of n by n matrices (GEn). We derive the finite n largest eigenvalue probability distribution function for the Gaussian Orthogonal and Symplectic Ensembles and also prove an Edgeworth type Theorem for the largest eigenvalue probability distribution function of Gaussian Symplectic Ensemble. The correction terms to the limiting probability distribution are express...
Matrix-exponential distributions in applied probability
Bladt, Mogens
2017-01-01
This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distribu...
Probability distribution of flood flows in Tunisia
Abida, H.; Ellouze, M.
2008-05-01
L-moments (linear moments) are used in identifying regional flood frequency distributions for different zones across Tunisia. 1134 site-years of annual maximum stream flow data from a total of 42 stations with an average record length of 27 years are considered. The country is divided into two homogeneous regions (northern and central/southern Tunisia) using a heterogeneity measure based on the spread of the sample L-moments among the sites in a given region. Then, selection of the corresponding distribution is achieved through goodness-of-fit comparisons in L-moment diagrams and verified using an L-moment based regional test that compares observed to theoretical values of L-skewness and L-kurtosis for various candidate distributions. The distributions used, which represent five of the most frequently used distributions in the analysis of hydrologic extreme variables, are: (i) Generalized Extreme Value (GEV), (ii) Pearson Type III (P3), (iii) Generalized Logistic (GLO), (iv) Generalized Normal (GNO), and (v) Generalized Pareto (GPA) distributions. Spatial trends with respect to the best-fit flood frequency distribution are distinguished: northern Tunisia was shown to be represented by the GNO distribution, while the GNO and GEV distributions give the best fit in central/southern Tunisia.
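The sample L-moments underlying this kind of regional analysis can be sketched via probability-weighted moments; the flow values below are invented for the demo, not Tunisian data.

```python
def l_moments(data):
    """First three sample L-moments via probability-weighted moments:
    returns (l1, l2, t3) = (mean, L-scale, L-skewness)."""
    x = sorted(data)
    n = len(x)
    # Unbiased probability-weighted moments b0, b1, b2 (x sorted ascending,
    # i below is the 0-based rank).
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * xi for i, xi in enumerate(x)) / (n * (n - 1) * (n - 2))
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

# Hypothetical annual maximum flows for one station.
flows = [120, 95, 310, 150, 210, 88, 400, 132, 175, 260]
l1, l2, t3 = l_moments(flows)
print(l1, l2, t3)
```

Plotting (t3, t4) pairs for many stations on an L-moment diagram is what drives the goodness-of-fit comparison the abstract describes.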
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
Probability distribution of drawdowns in risky investments
Maslov, Sergei; Zhang, Yi-Cheng
We study the risk criterion for investments based on the drawdown from the maximal value of the capital in the past. Depending on the investor's risk attitude, and thus risk exposure, we find that the distribution of these drawdowns follows a general power law. In particular, if the risk exposure is Kelly-optimal, the exponent of this power law takes the borderline value of 2, i.e. the average drawdown is just on the verge of diverging.
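The drawdown statistic described above is easy to compute empirically. The sketch below simulates a multiplicative capital process and records the relative drawdown from the running maximum; the return parameters are invented illustration values, not calibrated to the paper.

```python
import random

def drawdowns(returns):
    """Relative drawdown from the past maximum of a multiplicative
    capital process, one value per time step."""
    capital, peak, out = 1.0, 1.0, []
    for r in returns:
        capital *= (1.0 + r)
        peak = max(peak, capital)
        out.append((peak - capital) / peak)   # fraction lost from the past peak
    return out

random.seed(0)
rets = [random.gauss(0.001, 0.02) for _ in range(10_000)]
dd = drawdowns(rets)
```

A histogram of `dd` on log-log axes would be the natural next step for eyeballing power-law behaviour in the tail.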
Incorporating Skew into RMS Surface Roughness Probability Distribution
Stahl, Mark T.; Stahl, H. Philip
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
Probability distribution of vertical longitudinal shear fluctuations.
Fichtl, G. H.
1972-01-01
This paper discusses some recent measurements of third and fourth moments of vertical differences (shears) of longitudinal velocity fluctuations obtained in unstable air at the NASA 150 m meteorological tower site at Cape Kennedy, Fla. Each set of measurements consisted of longitudinal velocity fluctuation time histories obtained at the 18, 30, 60, 90, 120 and 150 m levels, so that 15 wind-shear time histories were obtained from each set of measurements. It appears that the distribution function of the longitudinal wind fluctuations at two levels is not bivariate Gaussian. The implications of the results relative to the design and operation of aerospace vehicles are discussed.
A Priori Probability Distribution of the Cosmological Constant
Weinberg, Steven
2000-01-01
In calculations of the probability distribution for the cosmological constant, it has been previously assumed that the a priori probability distribution is essentially constant in the very narrow range that is anthropically allowed. This assumption has recently been challenged. Here we identify large classes of theories in which this assumption is justified.
Probability plots based on Student’s t-distribution
Hooft, R.W.W.|info:eu-repo/dai/nl/109722213; Straver, L.H.; Spek, A.L.|info:eu-repo/dai/nl/156517566
2009-01-01
The validity of the normal distribution as an error model is commonly tested with a (half) normal probability plot. Real data often contain outliers. The use of t-distributions in a probability plot to model such data more realistically is described. It is shown how a suitable value of the parameter nu of the t-distribution can be determined from the data.
Probability distributions in risk management operations
Artikis, Constantinos
2015-01-01
This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...
Methods for fitting a parametric probability distribution to most probable number data.
Williams, Michael S; Ebel, Eric D
2012-07-02
Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
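The first of the two proposed fitting methods treats concentration as the likelihood parameter behind the tube outcomes. As a minimal sketch of the underlying MPN likelihood (not the paper's full method), the code below grid-maximises the binomial log-likelihood of a dilution series; the tube counts and volumes are hypothetical illustration values.

```python
import math

def mpn_loglik(c, dilutions):
    """Binomial log-likelihood of tube outcomes at concentration c
    (organisms per ml); dilutions = [(volume_ml, n_tubes, n_positive)]."""
    ll = 0.0
    for v, n, k in dilutions:
        p = 1.0 - math.exp(-c * v)            # P(a tube tests positive)
        p = min(max(p, 1e-12), 1.0 - 1e-12)   # guard the logarithms
        ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
    return ll

def mpn_estimate(dilutions):
    grid = [10 ** (i / 100) for i in range(-300, 300)]  # 0.001 .. ~1000 per ml
    return max(grid, key=lambda c: mpn_loglik(c, dilutions))

# Hypothetical 3-dilution series: 5 tubes each at 10, 1 and 0.1 ml.
series = [(10.0, 5, 5), (1.0, 5, 3), (0.1, 5, 1)]
c_hat = mpn_estimate(series)
```

The paper's second, Bayesian latent-variable method would place a contamination distribution over `c` and use these same tube counts as data.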
Negative Binomial and Multinomial States: probability distributions and coherent states
Fu, Hong-Chen; Sasaki, Ryu
1996-01-01
Following the relationship between probability distributions and coherent states, for example the well-known Poisson distribution with the ordinary coherent states and the relatively less known binomial distribution with the $su(2)$ coherent states, we propose an interpretation of $su(1,1)$ and $su(r,1)$ coherent states in terms of probability theory. They will be called the "negative binomial" ("multinomial") states, which correspond to the negative binomial (multinomial)...
Probability plots based on Student's t-distribution.
Hooft, Rob W W; Straver, Leo H; Spek, Anthony L
2009-07-01
The validity of the normal distribution as an error model is commonly tested with a (half) normal probability plot. Real data often contain outliers. The use of t-distributions in a probability plot to model such data more realistically is described. It is shown how a suitable value of the parameter nu of the t-distribution can be determined from the data. The results suggest that even data that seem to be modeled well using a normal distribution can be better modeled using a t-distribution.
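A common way to pick the degrees-of-freedom parameter nu in such a plot is to maximise the probability-plot correlation coefficient. The sketch below does this with `scipy.stats.probplot` on synthetic outlier-contaminated data; the data and the search range for nu are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy import stats

# Synthetic data with heavy-tailed contamination; in practice these
# would be scaled residuals from a model fit.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 200), 5.0 * rng.standard_t(3, 20)])

def ppcc(sample, nu):
    """Correlation coefficient of the Student-t probability plot."""
    (_, _), (_, _, r) = stats.probplot(sample, sparams=(nu,), dist="t")
    return r

# Choose nu by maximising the probability-plot correlation coefficient.
best_nu = max(range(2, 31), key=lambda nu: ppcc(data, nu))
```

A straighter plot (r closer to 1) indicates that the chosen t-distribution models the tails better than the normal error model.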
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas
2015-01-01
Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas
2016-06-01
Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2012-01-01
This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.
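The p-box idea above can be illustrated without the Bernstein-expansion machinery: sweep the free parameter of the distribution family over its interval and record the smallest and largest failure probability. The requirement g(x) = x^2 - 4 > 0 and the parameter ranges below are invented for illustration.

```python
import math

def failure_prob(mu, sigma=1.0):
    """P(x^2 - 4 > 0) = P(x > 2) + P(x < -2) for x ~ N(mu, sigma^2)."""
    def Phi(z):
        # standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (1.0 - Phi((2.0 - mu) / sigma)) + Phi((-2.0 - mu) / sigma)

# P-box: a normal with interval-valued mean anywhere in [-0.5, 0.5].
mus = [i / 100 for i in range(-50, 51)]
lo = min(failure_prob(m) for m in mus)   # best-case failure probability
hi = max(failure_prob(m) for m in mus)   # worst-case failure probability
```

The interval [lo, hi] bounds the failure probability of every distribution admitted by this (toy) p-box, which is the kind of guarantee the paper's optimization-based approach provides for polynomial requirements.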
Hybrid computer technique yields random signal probability distributions
Cameron, W. D.
1965-01-01
Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.
Information-theoretic methods for estimating complicated probability distributions
Zong, Zhi
2006-01-01
Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theoretic methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in many fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur...
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Energy Technology Data Exchange (ETDEWEB)
Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)
2014-06-19
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are developed: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen’s method is employed to find a compromise solution, supported by an illustrative numerical example.
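The α-cut technique mentioned above turns a fuzzy probability into a crisp interval. As a minimal sketch under the common triangular-membership assumption (the membership shape and the numbers are illustrative, not from the paper):

```python
def alpha_cut(l, m, r, alpha):
    """Alpha-cut of a triangular fuzzy number (l, m, r): the interval of
    values whose membership degree is at least alpha (0 < alpha <= 1)."""
    return (l + alpha * (m - l), r - alpha * (r - m))

# A fuzzy probability "around 0.5" asserted by an expert.
lo, hi = alpha_cut(0.2, 0.5, 0.9, alpha=0.5)
```

At alpha = 1 the cut collapses to the single modal value, and lowering alpha widens the interval, which is how the stochastic transformation trades off confidence against precision.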
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
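For the Onemax case the exact distribution is easy to state: if the parent has k ones out of n bits and each bit flips independently with probability p, the offspring fitness is k - X + Y with X ~ Bin(k, p) ones flipped off and Y ~ Bin(n - k, p) zeros flipped on. The sketch below computes that pmf directly by convolution (a brute-force check, not the paper's Krawtchouk-polynomial derivation); the n, k, p values are illustrative.

```python
import math

def binom_pmf(n, p, x):
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def offspring_fitness_pmf(n, k, p):
    """Exact distribution of Onemax fitness after uniform bit-flip
    mutation of a parent with fitness k (number of ones) out of n bits."""
    pmf = [0.0] * (n + 1)
    for x in range(k + 1):           # ones flipped to zero
        for y in range(n - k + 1):   # zeros flipped to one
            pmf[k - x + y] += binom_pmf(k, p, x) * binom_pmf(n - k, p, y)
    return pmf

dist = offspring_fitness_pmf(n=10, k=6, p=0.1)
```

The expected offspring fitness k - kp + (n - k)p follows directly from the same decomposition, which is a handy sanity check on the pmf.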
Some explicit expressions for the probability distribution of force ...
Indian Academy of Sciences (India)
Abstract. Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note ...
Some explicit expressions for the probability distribution of force ...
Indian Academy of Sciences (India)
Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the ...
Probability distribution functions of echo signals from meteorological targets
Vasilyev, G. V.
1975-01-01
Simple expressions are obtained for the laws and moments of the probability distributions of averaged echo signals from meteorological targets at the output of a logarithmic radar receiver. Here, the distribution function is assumed to be represented in the form of an Edgeworth series.
Measuring Robustness of Timetables at Stations using a Probability Distribution
DEFF Research Database (Denmark)
Jensen, Lars Wittrup; Landex, Alex
… infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time. … This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay …
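A minimal sketch of the exponential-delay idea above: if primary delays D are exponential with rate lam (in the paper the parameters would be estimated from observed delay data at the station), the buffer time b absorbs delays up to b, and the expected knock-on delay has a closed form. The rate and buffer values here are invented.

```python
import math

def p_delay_transfer(b, lam):
    """Probability that a delay D ~ Exp(lam) exceeds the buffer time b,
    i.e. that any delay at all is transferred to the following train."""
    return math.exp(-lam * b)

def expected_knock_on(b, lam):
    """E[max(D - b, 0)] for D ~ Exp(lam); by memorylessness the excess
    over b is again Exp(lam), giving exp(-lam * b) / lam."""
    return math.exp(-lam * b) / lam
```

With mean primary delay 2 minutes (lam = 0.5), a 3-minute buffer cuts the expected knock-on delay from 2 minutes to about 0.45 minutes, which is the kind of robustness trade-off the timetable evaluation quantifies.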
Gendist: An R Package for Generated Probability Distribution Models.
Directory of Open Access Journals (Sweden)
Shaiful Anuar Abu Bakar
In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
Gendist: An R Package for Generated Probability Distribution Models.
Abu Bakar, Shaiful Anuar; Nadarajah, Saralees; Absl Kamarul Adzhar, Zahrul Azmir; Mohamed, Ibrahim
2016-01-01
In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
Evidence for Truncated Exponential Probability Distribution of Earthquake Slip
Thingbaijam, Kiran K. S.
2016-07-13
Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
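The truncated exponential law above has a simple closed form: density proportional to exp(-s/a) on [0, s_max], with the normalization absorbing the truncation. The sketch below implements the pdf and inverse-CDF sampling; the scale a and cutoff s_max are illustrative values, not fitted SRCMOD parameters.

```python
import math
import random

def trunc_exp_pdf(s, a, s_max):
    """Density of the exponential law with scale a truncated to [0, s_max]."""
    if not 0.0 <= s <= s_max:
        return 0.0
    return math.exp(-s / a) / (a * (1.0 - math.exp(-s_max / a)))

def trunc_exp_sample(a, s_max, rng):
    """Inverse-CDF sampling: F(s) = (1 - e^{-s/a}) / (1 - e^{-s_max/a})."""
    u = rng.random()
    return -a * math.log(1.0 - u * (1.0 - math.exp(-s_max / a)))

rng = random.Random(0)
slips = [trunc_exp_sample(0.8, 3.0, rng) for _ in range(5)]
```

Sampling slip values this way is the building block for the "more realistic rupture scenarios" the abstract refers to, with a tied to the average slip of the target event.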
Applications of the Dirichlet distribution to forensic match probabilities.
Lange, K
1995-01-01
The Dirichlet distribution provides a convenient conjugate prior for Bayesian analyses involving multinomial proportions. In particular, allele frequency estimation can be carried out with a Dirichlet prior. If data from several distinct populations are available, then the parameters characterizing the Dirichlet prior can be estimated by maximum likelihood and then used for allele frequency estimation in each of the separate populations. This empirical Bayes procedure tends to moderate extreme multinomial estimates based on sample proportions. The Dirichlet distribution can also be employed to model the contributions from different ancestral populations in computing forensic match probabilities. If the ancestral populations are in genetic equilibrium, then the product rule for computing match probabilities is valid conditional on the ancestral contributions to a typical person of the reference population. This fact facilitates computation of match probabilities and tight upper bounds to match probabilities.
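Because the Dirichlet is conjugate to the multinomial, the empirical-Bayes moderation described above reduces to adding the prior parameters to the observed counts. The sketch below shows the posterior-mean update; the counts and prior parameters are hypothetical illustration values, not forensic data.

```python
def posterior_mean_freqs(counts, alpha):
    """Posterior-mean allele frequencies under a Dirichlet(alpha) prior:
    the Dirichlet-multinomial posterior is Dirichlet(counts + alpha),
    whose mean moderates extreme sample proportions."""
    total = sum(counts) + sum(alpha)
    return [(c + a) / total for c, a in zip(counts, alpha)]

# An allele unseen in the sample still gets a nonzero estimated frequency.
freqs = posterior_mean_freqs(counts=[0, 3, 97], alpha=[1.0, 1.0, 1.0])
```

The moderation matters for match probabilities: a zero sample proportion would make any profile containing that allele look impossible, while the posterior mean keeps the product-rule computation finite and conservative.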
Probability distributions for the magnification of quasars due to microlensing
Wambsganss, Joachim
1992-01-01
Gravitational microlensing can magnify the flux of a lensed quasar considerably and therefore possibly influence quasar source counts or the observed quasar luminosity function. A large number of distributions of magnification probabilities due to gravitational microlensing for finite sources are presented, with a reasonable coverage of microlensing parameter space (i.e., surface mass density, external shear, mass spectrum of lensing objects). These probability distributions were obtained from smoothing two-dimensional magnification patterns with Gaussian source profiles. Different source sizes ranging from 10^14 cm to 5 × 10^16 cm were explored. The probability distributions show a large variety of shapes. Coefficients of fitted slopes for large magnifications are presented.
On the probability distribution for the cosmological constant
Elizalde, E.; Gaztañaga, E.
1990-01-01
The behaviour in Coleman's approach of the probability distribution for the cosmological constant Λ is shown to depend rather strongly on the corrections to the effective action. In particular, when one includes terms proportional to Λ2, the infinite peak in the probability density at Λ=0 smoothly disappears (provided that the coefficient of Λ2 is positive). A random distribution for Λ can then be obtained (as a limiting case) in a domain around Λ=0. This is in accordance with the results of an approach recently proposed by Fischler, Klebanov, Polchinski and Susskind.
Full counting statistics for noninteracting fermions: joint probability distributions.
Inhester, L; Schönhammer, K
2009-11-25
The joint probability distribution in the full counting statistics (FCS) for noninteracting electrons is discussed for an arbitrary number of initially separate subsystems which are connected at t = 0 and separated again at a later time. A simple method to obtain the leading-order long-time contribution to the logarithm of the characteristic function is presented which simplifies earlier approaches. New explicit results for the determinant involving the scattering matrices are found. The joint probability distribution for the charges in two leads is discussed for Y junctions and dots connected to four leads.
Full counting statistics for noninteracting fermions: joint probability distributions
Energy Technology Data Exchange (ETDEWEB)
Inhester, L; Schoenhammer, K [Institut fuer Theoretische Physik, Universitaet Goettingen, Friedrich-Hund-Platz 1, D-37077 Goettingen (Germany)
2009-11-25
The joint probability distribution in the full counting statistics (FCS) for noninteracting electrons is discussed for an arbitrary number of initially separate subsystems which are connected at t = 0 and separated again at a later time. A simple method to obtain the leading-order long-time contribution to the logarithm of the characteristic function is presented which simplifies earlier approaches. New explicit results for the determinant involving the scattering matrices are found. The joint probability distribution for the charges in two leads is discussed for Y junctions and dots connected to four leads.
Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum.
Sepehrband, Farshid; Alexander, Daniel C; Clark, Kristi A; Kurniawan, Nyoman D; Yang, Zhengyi; Reutens, David C
2016-01-01
Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as location and scale of the mode and the profile of distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log normal, log logistic and Birnbaum-Saunders distributions.
Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum
Sepehrband, Farshid; Alexander, Daniel C.; Clark, Kristi A.; Kurniawan, Nyoman D.; Yang, Zhengyi; Reutens, David C.
2016-01-01
Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters which has been consistently shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as location and scale of the mode and the profile of distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log normal, log logistic and Birnbaum-Saunders distributions. PMID:27303273
A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.
Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A
2016-07-27
The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors.
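The "statistically optimal solution" invoked above is a one-line Bayes update. The sketch below computes a posterior over four latent causes from a single observation; the prior and likelihood table are made-up illustration values, not the task used in the study.

```python
def posterior_over_causes(prior, likelihood, obs):
    """Bayes' rule: posterior[c] proportional to prior[c] * P(obs | c),
    where likelihood[c][obs] gives P(obs | cause c)."""
    unnorm = [p * likelihood[c][obs] for c, p in enumerate(prior)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Four equally likely latent causes, two possible observations.
prior = [0.25, 0.25, 0.25, 0.25]
like = [[0.8, 0.2], [0.5, 0.5], [0.3, 0.7], [0.1, 0.9]]
post = posterior_over_causes(prior, like, obs=0)
```

Repeated observations are handled by feeding the posterior back in as the next prior, and the study's fMRI analysis compared OFC activity patterns against the (log-transformed) values of such a distribution.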
Probability distributions of the electroencephalogram envelope of preterm infants.
Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro
2015-06-01
To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Stochastic invertible mappings between power law and Gaussian probability distributions
Vignat, C.; Plastino, A.
2005-01-01
We construct "stochastic mappings" between power law probability distributions (PDs) and Gaussian ones. To a given vector $N$, Gaussian distributed (respectively $Z$, exponentially distributed), one can associate a vector $X$, "power law distributed", through multiplication by a random scalar variable $a$: $N = aX$. This mapping is "invertible": one can go from $N$ to $X$ (resp. from $Z$ to $X$) via multiplication by another random variable $b$, i.e., $X = bN$ (resp. $X = bZ$). Note that all the a...
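The mapping described above can be sketched numerically. The snippet below is an illustrative Python sketch, not code from the paper: a Gaussian vector is rescaled by a random chi-square-based scalar, the classical Gaussian scale mixture that produces Student-t (power-law-tailed) variates. The parameter choices and variable names are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
nu, n = 3.0, 100_000

# Gaussian vector N; scaling by the random variable a = sqrt(nu / chi^2_nu)
# yields X = a*N with Student-t (power-law tailed) marginals -- a standard
# Gaussian scale mixture realizing a stochastic Gaussian <-> power-law map.
N = rng.standard_normal(n)
a = np.sqrt(nu / rng.chisquare(nu, size=n))
X = a * N
```

The heavier tails of `X` relative to `N` are easy to verify empirically on the sample.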
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.
1983-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I_0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
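As an illustration of the last point, a uniform generator can feed an inverse-CDF transform to produce samples from another distribution. This is a generic sketch in Python (the report's routines are Fortran); the Weibull parameters are arbitrary.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(size=50_000)

# Weibull(shape k, scale lam) via the inverse CDF:
# F^{-1}(u) = lam * (-ln(1 - u))**(1/k)
k, lam = 2.0, 3.0
x = lam * (-np.log1p(-u)) ** (1.0 / k)

# theoretical mean of Weibull(k, lam) is lam * Gamma(1 + 1/k)
expected_mean = lam * math.gamma(1.0 + 1.0 / k)
```

The same pattern works for any distribution with an invertible CDF.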
Equivalent absorption coefficients generated from frequency probability distributions
Ackerman, T. P.
1983-01-01
A flexible and computationally accurate method of treating aerosol scattering in spectral regions in which gaseous absorption is important is described. In the method, line-by-line absorption coefficients are computed as a function of pressure, temperature, and absorber gas for the spectral region of interest. The coefficients are sorted into a probability distribution which is converted into a cumulative probability distribution, which in turn can be inverted due to its monotonic nature. The inverted distribution is a smooth curve giving the absorption coefficient as a function of an independent variable on the domain. The frequency integration of the radiative transfer equation can then be performed by a quadrature technique with values of the absorption coefficient determined from the inverted distribution curve. The method is illustrated by applying it to the 9.6 micron band of ozone.
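The sorting-and-inversion idea can be sketched in a few lines. The spectrum below is a made-up, line-like toy function, not data from the paper; the point is that reordering the absorption coefficients into a cumulative distribution leaves band-averaged quantities unchanged while producing a smooth, monotonic integrand.

```python
import numpy as np

# hypothetical spiky, line-like absorption spectrum k(nu) over a band
nu_grid = np.linspace(0.0, 1.0, 10_000)
k_nu = 1e-3 + np.abs(np.sin(40 * np.pi * nu_grid)) ** 8

# sort coefficients into a cumulative-probability coordinate g in [0, 1];
# k(g) is smooth and monotonic and can be inverted / integrated cheaply
k_g = np.sort(k_nu)

# band-averaged transmission for absorber amount u: frequency integration
# and the reordered k-distribution integration give the same result
u = 100.0
T_freq = np.mean(np.exp(-k_nu * u))
T_kdist = np.mean(np.exp(-k_g * u))  # uniform quadrature weights in g
```

In practice the smooth k(g) curve is evaluated at only a handful of quadrature points, which is the computational payoff of the method.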
Probability distribution of extreme share returns in Malaysia
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to identify a suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson Type III (PE3) distributions, is evaluated. The method of L-moments is used for parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions for the weekly and monthly maximum share returns, respectively, in the Malaysian stock market during the studied period.
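A minimal sketch of method-of-L-moments fitting, shown for the Gumbel member of the candidate set (the paper's GEV/GPA fits involve a third L-moment and are omitted here); the function names, synthetic data, and tolerances are my own assumptions.

```python
import numpy as np

def sample_lmoments(x):
    """First two sample L-moments (l1, l2) via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    b0 = x.mean()
    b1 = np.sum(np.arange(n) / (n - 1) * x) / n  # b1 = E[(i-1)/(n-1) * x_(i)]
    return b0, 2.0 * b1 - b0

def fit_gumbel_lmoments(x):
    """Gumbel(xi, alpha) by L-moments: l1 = xi + gamma*alpha, l2 = alpha*ln 2."""
    l1, l2 = sample_lmoments(x)
    alpha = l2 / np.log(2.0)
    xi = l1 - 0.5772156649 * alpha  # Euler-Mascheroni constant
    return xi, alpha

rng = np.random.default_rng(2)
data = rng.gumbel(loc=5.0, scale=2.0, size=20_000)
xi, alpha = fit_gumbel_lmoments(data)
```

On this synthetic Gumbel sample the estimates land close to the true location 5 and scale 2.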
Modeling highway travel time distribution with conditional probability models
Energy Technology Data Exchange (ETDEWEB)
Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)
2014-01-01
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telematic data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telematic operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions of route travel time. The major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
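The convolution step can be illustrated with discrete travel time distributions. This toy Python sketch assumes independent links, which is the baseline the study improves upon with link-to-link conditional probabilities; the PMFs are invented for illustration.

```python
import numpy as np

# hypothetical discrete travel-time PMFs (in minutes) for two successive links
t1 = np.array([0.2, 0.5, 0.3])  # P(link-1 time = 10, 11, 12 min)
t2 = np.array([0.6, 0.4])       # P(link-2 time = 5, 6 min)

# under an independence assumption, the route travel-time PMF is the
# convolution of the link PMFs (support here: 15..18 minutes)
route = np.convolve(t1, t2)
```

With correlated links, each entry of `route` would instead be built from the conditional PMF of the downstream link given the upstream travel time.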
Caustic-induced features in microlensing magnification probability distributions
Rauch, Kevin P.; Mao, Shude; Wambsganss, Joachim; Paczynski, Bohdan
1992-01-01
Numerical simulations have uncovered a previously unrecognized 'bump' in the macroimage magnification probabilities produced by a planar distribution of point masses. The result could be relevant to cases of microlensing by star fields in single galaxies, for which this lensing geometry is an excellent approximation. The bump is produced by bright pairs of microimages formed by sources lying near the caustics of the lens. The numerically calculated probabilities for the magnifications in the range between 3 and 30 are significantly higher than those given by the asymptotic relation derived by Schneider. The bump present in the two-dimensional lenses appears not to exist in the magnification probability distribution produced by a fully three-dimensional lens.
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...
Monsoonal differences and probability distribution of PM(10) concentration.
Md Yusof, Noor Faizah Fitri; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Sansuddin, Nurulilyana; Ghazali, Nurul Adyani; Al Madhoun, Wesam
2010-04-01
There are many factors that influence PM(10) concentration in the atmosphere. This paper looks at PM(10) concentration in relation to the wet season (northeast monsoon) and dry season (southwest monsoon) in Seberang Perai, Malaysia from 2000 to 2004. PM(10) is expected to peak during the southwest monsoon, as the weather during this season becomes dry, and this study confirms that the highest PM(10) concentrations in 2000 to 2004 were recorded during this monsoon. Two probability distributions, the Weibull and the lognormal, were used to model PM(10) concentration. The best model for prediction was selected based on performance indicators. The lognormal distribution represents the data better than the Weibull distribution for 2000, 2001, and 2002. However, for 2003 and 2004, the Weibull distribution represents the data better than the lognormal distribution. The proposed distributions were successfully used for estimating exceedances and predicting return periods in subsequent years.
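The distribution-selection step can be sketched with SciPy, here using `lognorm` and `weibull_min` as stand-ins for the authors' fitting machinery and comparing by maximized log-likelihood as one possible performance indicator. The data are synthetic, not the Seberang Perai measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic "concentration" data; the true generator here is lognormal
pm10 = rng.lognormal(mean=4.0, sigma=0.5, size=2_000)

# fit both candidate distributions with the location fixed at zero
ln_shape, _, ln_scale = stats.lognorm.fit(pm10, floc=0)
wb_shape, _, wb_scale = stats.weibull_min.fit(pm10, floc=0)

# compare the fits by maximized log-likelihood
ll_ln = stats.lognorm.logpdf(pm10, ln_shape, 0, ln_scale).sum()
ll_wb = stats.weibull_min.logpdf(pm10, wb_shape, 0, wb_scale).sum()
best = "lognormal" if ll_ln > ll_wb else "weibull"
```

Swapping the generator for a Weibull sample would flip the selection, mirroring the year-to-year switches reported in the abstract.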
The probability distributions of S/+/ gyrospeeds in the Io torus
Brown, R. A.
1982-01-01
The first detailed thermal speed probability distribution for a Jovian plasma ion is presented. The distribution of heavy ion thermal speeds appears to be highly significant for an evaluation of the processes by which the particle and energy budgets of the Io plasma torus are maintained. Attention is given to the measurement, reduction, and analysis procedures which yield the reported probability distribution. A kinetic energy inventory of the Jovian plasma heavy ion component can be obtained from high-resolution, high-precision spectrophotometry of emission lines. The current S(+) study finds that detection of the line wings results in a mean energy (approximately 60 eV) which is higher by a factor of approximately 10 than is implied by the line core. This illustrates that a dispersive measurement may provide only a lower limit to the kinetic energy content.
Subjective Probability of Receiving Harm as a Function of Attraction and Harm Delivered.
Schlenker, Barry R.; And Others
It was hypothesized that subjects who liked a source of potential harm would estimate the probability of receiving harm mediated by him as lower than would subjects who disliked the source. To test the hypothesis, subjects were asked to estimate the probability that a liked or disliked confederate would deliver an electric shock on each of 10…
Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko
2014-01-01
Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters of the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from among the Weibull, log-normal, and log-logistic distributions was determined using the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-olds at the 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 k
Polynomial probability distribution estimation using the method of moments.
Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper
2017-01-01
We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, and Weibull, as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular, this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
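One way to realize such a moment-matching polynomial PDF is to solve the linear system that equates the polynomial's raw moments on an interval [a, b] to the given moments. This is a sketch under my own formulation, not necessarily the authors' exact algorithm.

```python
import numpy as np

def polynomial_pdf_from_moments(moments, a, b):
    """Coefficients c_j of p(x) = sum_j c_j x^j on [a, b] matching the raw
    moments m_k = E[X^k], k = 0..N-1 (with m_0 = 1). Matching requires
    sum_j c_j * (b**(k+j+1) - a**(k+j+1)) / (k+j+1) = m_k for each k."""
    m = np.asarray(moments, dtype=float)
    n = m.size
    A = np.empty((n, n))
    for k in range(n):
        for j in range(n):
            p = k + j + 1
            A[k, j] = (b**p - a**p) / p
    return np.linalg.solve(A, m)

# sanity check: the uniform distribution on [0, 1] has m_k = 1/(k+1),
# and its exact polynomial PDF is the constant p(x) = 1
c = polynomial_pdf_from_moments([1.0, 1.0 / 2.0, 1.0 / 3.0], 0.0, 1.0)
```

Because moments of a polynomial density are themselves polynomial integrals, convolutions reduce to the same kind of closed-form algebra the abstract highlights.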
Probability of lensing magnification by cosmologically distributed galaxies
Pei, Yichuan C.
1993-01-01
We present analytical formulae for computing the magnification probability caused by cosmologically distributed galaxies. The galaxies are assumed to be singular, truncated isothermal spheres without evolution or clustering in redshift. We find that, for a fixed total mass, extended galaxies produce a broader shape in the magnification probability distribution and hence are less efficient as gravitational lenses than compact galaxies. The high-magnification tail caused by large galaxies is well approximated by an $A^{-3}$ form, while the tail caused by small galaxies is slightly shallower. The mean magnification as a function of redshift is, however, found to be independent of the size of the lensing galaxies. In terms of flux conservation, our formulae for the isothermal galaxy model predict a mean magnification that agrees to within a few percent with the Dyer-Roeder model of a clumpy universe.
Outage probability of distributed beamforming with co-channel interference
Yang, Liang
2012-03-01
In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.
Partial Generalized Probability Weighted Moments for Exponentiated Exponential Distribution
Directory of Open Access Journals (Sweden)
Neema Mohamed El Haroun
2015-09-01
The generalized probability weighted moments are widely used in hydrology for estimating parameters of flood distributions from complete samples. The method of partial generalized probability weighted moments was used to estimate the parameters of distributions from censored samples. This article offers a new method, called partial generalized probability weighted moments (PGPWMs), for the analysis of censored data. The method of PGPWMs is an extended class of partial generalized probability weighted moments. To illustrate the new method, estimation of the unknown parameters of the exponentiated exponential distribution based on a doubly censored sample is considered. PGPWM estimators for right- and left-censored samples are obtained as special cases. A simulation study is conducted to investigate the performance of the estimates for the exponentiated exponential distribution. Estimators are compared through simulation via their biases and mean square errors. An illustration with real data is provided.
Estimating probable flaw distributions in PWR steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)
1997-02-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
Unitary equilibrations: Probability distribution of the Loschmidt echo
Campos Venuti, Lorenzo; Zanardi, Paolo
2010-02-01
Closed quantum systems evolve unitarily and therefore cannot converge in a strong sense to an equilibrium state starting out from a generic pure state. Nevertheless for large system size one observes temporal typicality. Namely, for the overwhelming majority of the time instants, the statistics of observables is practically indistinguishable from an effective equilibrium one. In this paper we consider the Loschmidt echo (LE) to study this sort of unitary equilibration after a quench. We draw several conclusions on general grounds and on the basis of an exactly solvable example of a quasifree system. In particular we focus on the whole probability distribution of observing a given value of the LE after waiting a long time. Depending on the interplay between the initial state and the quench Hamiltonian, we find different regimes reflecting different equilibration dynamics. When the perturbation is small and the system is away from criticality the probability distribution is Gaussian. However close to criticality the distribution function approaches a double-peaked, universal form.
Assessing magnitude probability distribution through physics-based rupture scenarios
Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona
2016-04-01
When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what the probability is that an earthquake simultaneously ruptures a series of neighboring segments. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g., hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of a full network break to be 15%, while single-segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
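The logic-tree weighting itself is simple to sketch. The branches, weights, and scenario outcomes below are invented for illustration; the study's actual tree spans four parameter classes and 64 combinations.

```python
# hypothetical logic-tree branches with weights (each node's weights sum to 1)
hypocenter_w = {"west": 0.5, "east": 0.5}
energy_w = {"low": 0.3, "high": 0.7}

# assumed outcome of each dynamic-rupture scenario: does it break the
# full network? (all combinations not listed: no full break)
full_break = {("west", "high"): True}

# probability of a full network break = sum of branch-weight products
# over the scenarios whose simulated rupture spans the whole network
p_full = sum(hw * ew
             for h, hw in hypocenter_w.items()
             for e, ew in energy_w.items()
             if full_break.get((h, e), False))
```

With deeper trees the same product-of-branch-weights rule applies, one factor per parameter class.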
Maximum-entropy probability distributions under Lp-norm constraints
Dolinar, S.
1991-01-01
Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
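The closed-form continuous result can be stated explicitly. For a real-valued random variable with fixed pth absolute moment $\mu_p = E|X|^p$ (equivalently, fixed $\|X\|_p = \mu_p^{1/p}$), the standard maximum-entropy derivation (notation mine) gives the generalized Gaussian:

```latex
% Maximize h(f) = -\int f \ln f subject to \int f = 1 and \int |x|^p f = \mu_p.
% The Lagrange conditions give f(x) \propto \exp(-\lambda |x|^p); normalizing:
f(x) = \frac{p}{2\,(p\mu_p)^{1/p}\,\Gamma(1/p)}
       \exp\!\left(-\frac{|x|^p}{p\,\mu_p}\right),
\qquad
h(f) = \frac{1}{p} + \ln\frac{2\,\Gamma(1/p)}{p} + \frac{1}{p}\ln p + \ln\|X\|_p .
```

Setting $p = 2$ recovers the Gaussian with $h = \tfrac{1}{2}\ln(2\pi e\sigma^2)$, and for every $p$ the entropy is linear in $\ln\|X\|_p$ with unit slope, consistent with the straight-line relationship noted in the abstract.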
Diachronic changes in word probability distributions in daily press
Directory of Open Access Journals (Sweden)
Stanković Jelena
2006-01-01
Changes in the probability distributions of individual words and word types were investigated within two samples of daily press spanning fifty years. One sample was derived from the Corpus of Serbian Language (CSL; Kostić, Đ., 2001), which covers the period between 1945 and 1957, and the other from the Ebart Media Documentation (EBR), which was compiled from seven daily newspapers and five weekly magazines from 2002 and 2003. Each sample consisted of about 1 million words. The obtained results indicate that nouns and adjectives were more frequent in the CSL, while verbs and prepositions are more frequent in the EBR sample, suggesting a decrease in sentence length over the last five decades. Conspicuous changes in the probability distribution of individual words were observed for nouns and adjectives, while minimal or no changes were observed for verbs and prepositions. Such an outcome suggests that nouns and adjectives are most susceptible to diachronic changes, while verbs and prepositions appear to be resistant to such changes.
PROBABILITY DISTRIBUTION OVER THE SET OF CLASSES IN ARABIC DIALECT CLASSIFICATION TASK
Directory of Open Access Journals (Sweden)
O. V. Durandin
2017-01-01
Subject of Research. We propose an approach for solving a machine learning classification problem that uses information about the probability distribution on the training data class label set. The algorithm is illustrated on a complex natural language processing task: classification of Arabic dialects. Method. Each object in the training set is associated with a probability distribution over the class label set instead of a particular class label. The proposed approach solves the classification problem taking into account the probability distribution over the class label set to improve the quality of the built classifier. Main Results. The suggested approach is illustrated on the example of automatic Arabic dialect classification. Mined from the Twitter social network, the analyzed data contain word marks and belong to the following six Arabic dialects: Saudi, Levantine, Algerian, Egyptian, Iraqi, Jordanian, and to Modern Standard Arabic (MSA). The results demonstrate an increase in the quality of the built classifier achieved by taking into account probability distributions over the set of classes. Experiments carried out show that even relatively naive accounting of the probability distributions improves the precision of the classifier from 44% to 67%. Practical Relevance. Our approach and the corresponding algorithm could be effectively used in situations when a manual annotation process performed by experts requires significant financial and time resources, but it is possible to create a system of heuristic rules. The implementation of the proposed algorithm makes it possible to significantly decrease data preparation expenses without substantial losses in classification precision.
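The core idea, training against a distribution over labels rather than a single hard label, can be sketched with a soft-label cross-entropy in NumPy. The numbers are synthetic; the actual classifier and Arabic-dialect features are not reproduced here.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# each training object carries a probability distribution over 3 classes
soft_targets = np.array([[0.7, 0.2, 0.1],
                         [0.1, 0.8, 0.1]])
logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 1.5, 0.3]])

# cross-entropy against the full label distribution instead of a hard label
probs = softmax(logits)
loss = -np.mean(np.sum(soft_targets * np.log(probs), axis=1))
```

With one-hot rows in `soft_targets` this reduces to ordinary cross-entropy, so the soft-label formulation strictly generalizes the hard-label one.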
Non-Gaussian probability distributions of solar wind fluctuations
Directory of Open Access Journals (Sweden)
E. Marsch
The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scales τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.
Subjective Probability and Information Retrieval: A Review of the Psychological Literature.
Thompson, Paul
1988-01-01
Reviews the subjective probability estimation literature of six schools of human judgement and decision making: decision theory, behavioral decision theory, psychological decision theory, social judgement theory, information integration theory, and attribution theory. Implications for probabilistic information retrieval are discussed, including…
Molecular clouds have power-law probability distribution functions
Lombardi, Marco; Alves, João; Lada, Charles J.
2015-04-01
In this Letter we investigate the shape of the probability distribution of column densities (PDF) in molecular clouds. Through the use of low-noise, extinction-calibrated Herschel/Planck emission data for eight molecular clouds, we demonstrate that, contrary to common belief, the PDFs of molecular clouds are not described well by log-normal functions, but are instead power laws with exponents close to two and with breaks between AK ≃ 0.1 and 0.2 mag, so close to the CO self-shielding limit and not far from the transition between molecular and atomic gas. Additionally, we argue that the intrinsic functional form of the PDF cannot be securely determined below AK ≃ 0.1 mag, limiting our ability to investigate more complex models for the shape of the cloud PDF.
Epileptic seizure detection using probability distribution based on equal frequency discretization.
Orhan, Umut; Hekim, Mahmut; Ozer, Mahmut
2012-08-01
In this study, we offer a new feature extraction approach, probability distribution based on equal frequency discretization (EFD), to be used in the detection of epileptic seizures from electroencephalogram (EEG) signals. After the EEG signals were discretized by the EFD method, the probability densities of the signals were computed according to the number of data points in each interval. Two different probability density functions were defined by means of polynomial curve fitting for the subjects without epileptic seizure and the subjects with epileptic seizure; when using the mean square error criterion for these two functions, the success of epileptic seizure detection was 96.72%. In addition, when the probability densities of EEG segments were used as the inputs of a multilayer perceptron neural network (MLPNN) model, the success of epileptic seizure detection was 99.23%. These results show that non-linear classifiers can easily detect epileptic seizures from EEG signals by means of probability distribution based on EFD.
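A minimal sketch of the EFD-based density estimate described above (Python with NumPy; the Gaussian test signal and the bin count are placeholders, not the paper's EEG data):

```python
import numpy as np

def efd_density(signal, n_bins=20):
    """Probability density via equal frequency discretization (EFD):
    bin edges are sample quantiles, so every bin holds (about) the same
    number of points; density per bin = count / (n * bin_width)."""
    edges = np.quantile(signal, np.linspace(0.0, 1.0, n_bins + 1))
    counts, edges = np.histogram(signal, bins=edges)
    widths = np.diff(edges)
    density = counts / (len(signal) * widths)
    return density, edges

rng = np.random.default_rng(0)
eeg_like = rng.normal(0.0, 1.0, 5000)     # stand-in for an EEG segment
density, edges = efd_density(eeg_like)

# The piecewise-constant density integrates to one over the binned support.
print(np.sum(density * np.diff(edges)))
```

The density vector (or a polynomial fitted to it, as in the paper) can then serve as a feature vector for a classifier.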
Wallsten, T S; Forsyth, B H
1983-06-01
As part of a method for assessing health risks associated with primary National Ambient Air Quality Standards, T. B. Feagans and W. F. Biller (Research Triangle Park, North Carolina: EPA Office of Air Quality Planning and Standards, May 1981) developed a technique for encoding experts' subjective probabilities regarding dose-response functions. The encoding technique is based on B. O. Koopman's (Bulletin of the American Mathematical Society, 1940, 46, 763-764; Annals of Mathematics, 1940, 41, 269-292) probability theory, which does not require probabilities to be sharp, but rather allows lower and upper probabilities to be associated with an event. Uncertainty about a dose-response function can be expressed either in terms of the response rate expected at a given concentration or, conversely, in terms of the concentration expected to support a given response rate. Feagans and Biller (1981, cited above) derive the relation between the two conditional probabilities, which is easily extended to upper and lower conditional probabilities. These relations were treated as coherence requirements in an experiment utilizing four ozone and four lead experts as subjects, each providing judgments on two separate occasions. Four subjects strongly satisfied the coherence requirements in both conditions, and three more did so in the second session only. The eighth subject also improved in Session 2. Encoded probabilities were highly correlated between the two sessions, but changed from the first to the second in a manner that improved coherence and reflected greater attention to certain parameters of the dose-response function.
Energy Technology Data Exchange (ETDEWEB)
Aoki, Shigeru; Suzuki, Kohei (Tokyo Metropolitan Univ. (Japan))
1984-06-01
An estimation technique whereby the first excursion probability of a mechanical appendage system subjected to nonstationary seismic excitation can be conveniently calculated is proposed. The first excursion probability of the appendage system is estimated by using this method and the following results are obtained. (1) The probability from this technique is more conservative than that from a simulation technique taking artificial time histories compatible with the design spectrum as input excitation. (2) The first excursion probability is practically independent of the natural period of the appendage system when the tolerable barrier level is normalized by the response amplification factor given by the design spectrum. (3) The first excursion probability decreases as the damping ratio of the appendage system increases. It also decreases as the mass ratio of the appendage system to the supporting system increases. (4) For an inelastic appendage system, the first excursion probability is reduced if an appropriate elongation is permitted.
Probability density distribution of velocity differences at high Reynolds numbers
Praskovsky, Alexander A.
1993-01-01
Recent understanding of fine-scale turbulence structure in high Reynolds number flows is mostly based on Kolmogorov's original and revised models. The main finding of these models is that intrinsic characteristics of fine-scale fluctuations are universal at high Reynolds numbers, i.e., the functional behavior of any small-scale parameter is the same in all flows if the Reynolds number is high enough. The only large-scale quantity that directly affects small-scale fluctuations is the energy flux through the cascade. In dynamical equilibrium between large- and small-scale motions, this flux is equal to the mean rate of energy dissipation epsilon. The probability density distribution (pdd) of velocity differences is a very important characteristic for both the basic understanding of fully developed turbulence and engineering problems. Hence, it is important to test the findings: (1) the functional behavior of the tails of the pdd is P(delta(u)) proportional to exp(-b(r) absolute value of delta(u)/sigma(sub delta(u))), and (2) the logarithmic decrement scales as b(r) proportional to r(sup 0.15) when the separation r lies in the inertial subrange in high Reynolds number laboratory shear flows.
Study of the SEMG probability distribution of the paretic tibialis anterior muscle
Cherniz, Analía S.; Bonell, Claudia E.; Tabernig, Carolina B.
2007-11-01
The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process, with a zero mean. It has been experimentally proved that this probability distribution can be adjusted with less error to a Laplacian type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions to the superior motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.
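The Gaussian-versus-Laplacian comparison described above can be sketched by fitting both amplitude models and comparing total log-likelihoods (Python with SciPy; the synthetic "SEMG" segment is a stand-in for real recordings, and the scale 0.8 is arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Stand-in for a zero-mean SEMG segment; real data would be loaded here.
semg = stats.laplace.rvs(loc=0.0, scale=0.8, size=5000, random_state=rng)

# Fit both candidate amplitude models and compare by total log-likelihood.
mu, sigma = stats.norm.fit(semg)
loc, b = stats.laplace.fit(semg)
ll_gauss = stats.norm.logpdf(semg, mu, sigma).sum()
ll_laplace = stats.laplace.logpdf(semg, loc, b).sum()

best = "Laplacian" if ll_laplace > ll_gauss else "Gaussian"
print(best)
```

On a genuinely Laplacian segment the Laplacian log-likelihood wins; on real SEMG the outcome can vary by subject and gait segment, exactly as the abstract reports.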
Energy probability distribution zeros: A route to study phase transitions
Costa, B. V.; Mól, L. A. S.; Rocha, J. C. S.
2017-07-01
In the study of phase transitions very few models are accessible to exact solution. In most cases analytical simplifications have to be made, or numerical techniques have to be used, to gain insight into their critical properties. Numerically, the most common approaches are those based on Monte Carlo simulations together with finite-size scaling analysis. The use of Monte Carlo techniques requires the estimation of quantities like the specific heat or susceptibilities over a wide range of temperatures, or the construction of the density of states over large intervals of energy. Although many of these techniques are well developed, they may be very time consuming when the system size becomes large. It would therefore be desirable to have a method that sidesteps those difficulties. In this work we present an iterative method to study the critical behavior of a system based on partial knowledge of the set of complex Fisher zeros of the partition function. The method is general, with advantages over most conventional techniques, since it does not need to identify any order parameter a priori. The critical temperature and exponents can be obtained with great precision even in the most unamenable cases, like the two-dimensional XY model. To test the method and show how it works, we applied it to selected models where the transitions are well known: the 2D Ising, Potts and XY models, and a homopolymer system. Our choices cover systems with first-order, continuous and Berezinskii-Kosterlitz-Thouless transitions, as well as the homopolymer, which has two pseudo-transitions. The strategy can easily be adapted to any model, classical or quantum, once we are able to build the corresponding energy probability distribution.
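For a toy model with a known density of states on integer energy levels, the Fisher zeros are simply the roots of the partition function viewed as a polynomial in z = e^(-β). A minimal sketch (the density-of-states values below are made up for illustration; a real application would use, e.g., a Wang-Landau estimate):

```python
import numpy as np

# Toy density of states g(E) on integer energy levels E = 0..6
# (hypothetical values, chosen only to illustrate the construction).
g = np.array([1.0, 4.0, 10.0, 16.0, 10.0, 4.0, 1.0])

# Z(beta) = sum_E g(E) exp(-beta*E) becomes a polynomial in z = exp(-beta):
# Z(z) = sum_E g(E) z^E.  Its complex roots are the Fisher zeros.
zeros = np.roots(g[::-1])            # np.roots expects highest power first

# Sanity check: each root annihilates the partition function polynomial.
residuals = [abs(np.polyval(g[::-1], z)) for z in zeros]
print(zeros)
```

In the paper's method, the zero closest to the positive real z axis locates the (pseudo-)critical temperature; here the point is only the mechanical step from an energy probability distribution to a set of complex zeros.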
Soury, Hamza
2012-06-01
This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of Fox's H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.
Soury, Hamza
2013-07-01
This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters.
Tools for Bramwell-Holdsworth-Pinton Probability Distribution
Directory of Open Access Journals (Sweden)
Mirela Danubianu
2009-01-01
This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.
Concepts of microdosimetry II. Probability distributions of the microdosimetric variables.
Kellerer, A M; Chmelevsky, D
1975-10-02
This is the second part of an investigation of microdosimetric concepts relevant to numerical calculations. Two different types of distributions of the microdosimetric quantities are discussed. The sampling procedures are considered which lead from the initial pattern of energy transfers, the so-called inchoate distribution, to the distributions of specific energy and their mean values. The dependence of the distributions of specific energy on absorbed dose is related to the sampling procedures.
Fitting probability distributions to component water quality data from ...
African Journals Online (AJOL)
Continuous statistical distributions are usually applied to engineering situations. A goodness-of-fit test is usually necessary to determine the fitness of a distribution to specific data. The Kolmogorov-Smirnov test which is a widely used goodness-of-fit measure was used in the work. A total of four continuous distributions ...
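A hedged sketch of the fit-then-test workflow described above (Python with SciPy; the lognormal sample merely stands in for the paper's water-quality data). Note that estimating parameters from the same data biases the K-S p-value upward (the Lilliefors problem), so the result is indicative rather than exact:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in for a component water-quality series (e.g. concentrations);
# the actual data from the paper are not available here.
data = stats.lognorm.rvs(s=0.5, scale=10.0, size=500, random_state=rng)

# Fit a candidate continuous distribution, then test the fit with K-S.
shape, loc, scale = stats.lognorm.fit(data, floc=0)
stat, pvalue = stats.kstest(data, "lognorm", args=(shape, loc, scale))
print(f"K-S statistic = {stat:.3f}, p = {pvalue:.3f}")
```

Repeating this over several candidate families (gamma, Weibull, normal, lognormal) and keeping the smallest K-S statistic reproduces the selection logic the abstract describes.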
How effective is incidental learning of the shape of probability distributions?
Tran, Randy; Vul, Edward; Pashler, Harold
2017-08-01
The idea that people learn detailed probabilistic generative models of the environments they interact with is intuitively appealing, and has received support from recent studies of implicit knowledge acquired in daily life. The goal of this study was to see whether people efficiently induce a probability distribution based upon incidental exposure to an unknown generative process. Subjects played a 'whack-a-mole' game in which they attempted to click on objects appearing briefly, one at a time on the screen. Horizontal positions of the objects were generated from a bimodal distribution. After 180 plays of the game, subjects were unexpectedly asked to generate another 180 target positions of their own from the same distribution. Their responses did not even show a bimodal distribution, much less an accurate one (Experiment 1). The same was true for a pre-announced test (Experiment 2). On the other hand, a more extreme bimodality with zero density in a middle region did produce some distributional learning (Experiment 3), perhaps reflecting conscious hypothesis testing. We discuss the challenge this poses to the idea of efficient accurate distributional learning.
Investigation of Probability Distributions Using Dice Rolling Simulation
Lukac, Stanislav; Engel, Radovan
2010-01-01
Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…
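Dice-rolling simulation of the kind recommended above takes only a few lines (Python with NumPy; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
rolls = rng.integers(1, 7, size=60_000)   # simulate 60,000 fair-die rolls

faces, counts = np.unique(rolls, return_counts=True)
freqs = counts / rolls.size
for face, f in zip(faces, freqs):
    print(f"P({face}) is approximately {f:.3f}")   # each near 1/6
```

Varying the sample size makes the convergence of relative frequencies toward the theoretical probability 1/6 visible, which is the classroom point of such simulations.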
Measuring Robustness of Timetables in Stations using a Probability Distribution
DEFF Research Database (Denmark)
Jensen, Lars Wittrup; Landex, Alex
delays caused by interdependencies, and result in a more robust operation. Currently, three methods to calculate the complexity of a station exist: 1. complexity of a station based on the track layout; 2. complexity of a station based on the probability of a conflict using a plan of operation; 3. complexity…
Some properties of a 5-parameter bivariate probability distribution
Tubbs, J. D.; Brewer, D. W.; Smith, O. E.
1983-01-01
A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.
Joint probability distributions for a class of non-Markovian processes.
Baule, A; Friedrich, R
2005-02-01
We consider joint probability distributions for the class of coupled Langevin equations introduced by Fogedby [H. C. Fogedby, Phys. Rev. E 50, 1657 (1994)]. We generalize well-known results for the single-time probability distributions to the case of N -time joint probability distributions. It is shown that these probability distribution functions can be obtained by an integral transform from distributions of a Markovian process. The integral kernel obeys a partial differential equation with fractional time derivatives reflecting the non-Markovian character of the process.
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Most probable degree distribution at fixed structural entropy
Indian Academy of Sciences (India)
This result indicates that scale-free degree distributions emerge naturally when considering network ensembles with small structural entropy. The appearance of the power-law degree distribution reflects the tendency of social, technological and especially biological networks toward 'ordering'. This tendency is at work ...
Optimal design of unit hydrographs using probability distribution and ...
Indian Academy of Sciences (India)
[Table residue comparing Model I (existing NLP), Model II (gamma distribution), and Model III (log-normal distribution)] ... better in predicting peak flow for both the datasets, but it fails to predict the rising and recession limbs properly. Table 4 shows the objective function values for the different models. The lesser the objective ...
Probability distribution of the number of deceits in collective robotics
Murciano, Antonio; Zamora, Javier; Lopez-Sanchez, Jesus; Rodriguez-Santamaria, Emilia
2002-01-01
The benefit obtained by a selfish robot by cheating in a real multirobotic system can be represented by the random variable Xn,q: the number of cheating interactions needed before all the members in a cooperative team of robots, playing a TIT FOR TAT strategy, recognize the selfish robot. Stability of cooperation depends on the ratio between the benefit obtained by selfish and cooperative robots. In this paper, we establish the probability model for Xn,q. If the values...
Probability Distribution Function of the Upper Equatorial Pacific Current Speeds
National Research Council Canada - National Science Library
Chu, Peter C
2005-01-01
...), constructed from hourly ADCP data (1990-2007) at six stations for the Tropical Atmosphere Ocean project satisfies the two-parameter Weibull distribution reasonably well with different characteristics between El Nino and La Nina events...
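Fitting a two-parameter Weibull to speed data, as reported above, can be sketched as follows (Python with SciPy; the sample is synthetic, not the TAO ADCP records, and the true shape 2.0 and scale 0.6 m/s are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic stand-in for hourly current speeds (m/s).
speeds = stats.weibull_min.rvs(c=2.0, scale=0.6, size=2000, random_state=rng)

# Two-parameter Weibull: fix the location at zero, fit shape and scale.
c, loc, scale = stats.weibull_min.fit(speeds, floc=0)
print(f"shape = {c:.2f}, scale = {scale:.2f} m/s")
```

Fitting the same model separately to El Niño and La Niña periods would expose the differing characteristics the abstract mentions.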
Higher risk of probable mental emotional disorder in low or severe vision subjects
Directory of Open Access Journals (Sweden)
Lutfah Rif’ati
2012-07-01
health problem priority in Indonesia. This paper presents an assessment of severe visual impairments related to the risk of MED. Methods: This paper assessed a part of Basic Health Research (Riskesdas) 2007 data. For this assessment, subjects 15 years old or more had their visual acuity measured using the Snellen chart and their mental health status determined using the Self Reporting Questionnaire (SRQ-20). A subject was considered to have probable MED if the subject had a total score of 6 or more on the SRQ. Based on the measure of visual acuity, visual acuity was divided into 3 categories: normal/mild (20/20 to 20/60); low vision (less than 20/60 to 3/60); and blind (less than 3/60 to 0/0). Results: Among 972,989 subjects, 554,886 were aged 15 years or older. 11.4% of the subjects had probable MED. The prevalence of low vision and blindness was 5.1% and 0.9%, respectively. Compared to subjects with normal or mild visual impairments, subjects with low vision had a 74% increased risk of probable MED [adjusted relative risk (RRa) = 1.75; 95% confidence interval (CI) = 1.71-1.79]. Blind subjects had a 2.7-fold risk of probable MED (RRa = 2.69; 95% CI = 2.60-2.78) compared to subjects with normal or mild visual impairments. Conclusion: Visual impairment severity increased probable MED risk. Therefore, visually impaired subjects need more attention regarding probable MED. (Health Science Indones 2011;2:9-13)
Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum
Sepehrband, Farshid; Alexander, Daniel C; Clark, Kristi A.; Kurniawan, Nyoman D; Yang, Zhengyi; Reutens, David C.
2016-01-01
Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribu...
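Fitting a gamma distribution to a positive, right-skewed sample, as proposed above for axon diameters, can be sketched as follows (Python with SciPy; the "diameters" are simulated placeholders with arbitrary parameters, not measured axons):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical axon-diameter sample (micrometres): positive and right-skewed,
# the kind of data a gamma distribution describes well.
diameters = stats.gamma.rvs(a=3.0, scale=0.25, size=1500, random_state=rng)

a, loc, scale = stats.gamma.fit(diameters, floc=0)   # two-parameter gamma fit
mean_diam = a * scale                                 # gamma mean = a * scale
print(f"shape = {a:.2f}, scale = {scale:.3f}, mean diameter = {mean_diam:.2f} um")
```

The fitted shape and scale then summarize the whole diameter distribution with two parameters, which is the practical appeal for indirect (e.g. diffusion MRI) inference.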
Juanchich, Marie; Sirota, Miroslav
2017-08-17
We tested whether people focus on extreme outcomes to predict climate change and assessed the gap between the frequency of the predicted outcome and its perceived probability while controlling for climate change beliefs. We also tested 2 cost-effective interventions to reduce the preference for extreme outcomes and the frequency-probability gap by manipulating the probabilistic format: numerical or dual-verbal-numerical. In 4 experiments, participants read a scenario featuring a distribution of sea level rises, selected a sea rise to complete a prediction (e.g., "It is 'unlikely' that the sea level will rise . . . inches") and judged the likelihood of this sea rise occurring. Results showed that people have a preference for predicting extreme climate change outcomes in verbal predictions (59% in Experiments 1-4) and that this preference was not predicted by climate change beliefs. Results also showed an important gap between the predicted outcome frequency and participants' perception of the probability that it would occur. The dual-format reduced the preference for extreme outcomes for low and medium probability predictions but not for high ones, and none of the formats consistently reduced the frequency-probability gap.
A web-based tool for eliciting probability distributions from experts
Morris, David E.; Oakley, Jeremy E.; Crowe, John A.
2014-01-01
We present a web-based probability distribution elicitation tool: The MATCH Uncertainty Elicitation Tool. The Tool is designed to help elicit probability distributions about uncertain model parameters from experts, in situations where suitable data is either unavailable or sparse. The Tool is free to use, and offers five different techniques for eliciting univariate probability distributions. A key feature of the Tool is that users can log in from different sites and view and interact with th...
Supervised learning of probability distributions by neural networks
Baum, Eric B.; Wilczek, Frank
1988-01-01
Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
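The core idea above, following gradients of the log-likelihood rather than of the squared error when outputs are interpreted as probabilities, can be sketched with a single logistic unit (Python with NumPy; the data, seed, and learning rate are arbitrary, and this is a one-unit cartoon of the feedforward setting, not the paper's networks):

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
# Binary targets drawn from the logistic probability of a "teacher" unit.
y = (1.0 / (1.0 + np.exp(-X @ true_w)) > rng.random(200)).astype(float)

def nll(w):
    """Negative log-likelihood of the observed outputs under weights w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.01 * X.T @ (y - p)      # ascend the log-likelihood gradient

print(nll(np.zeros(3)), nll(w))    # likelihood improves during training
```

The gradient X.T @ (y - p) is exactly the derivative of the log-likelihood for a logistic output, which is what replaces the squared-error gradient in the probabilistic reformulation.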
Interacting discrete Markov processes with power-law probability distributions
Ridley, Kevin D.; Jakeman, Eric
2017-09-01
During recent years there has been growing interest in the occurrence of long-tailed distributions, also known as heavy-tailed or fat-tailed distributions, which can exhibit power-law behaviour and often characterise physical systems that undergo very large fluctuations. In this paper we show that the interaction between two discrete Markov processes naturally generates a time-series characterised by such a distribution. This possibility is first demonstrated by numerical simulation and then confirmed by a mathematical analysis that enables the parameter range over which the power-law occurs to be quantified. The results are supported by comparison of numerical results with theoretical predictions and general conclusions are drawn regarding mechanisms that can cause this behaviour.
Probability distribution of wind retrieval error for the NASA scatterometer
Leotta, Daniel F.; Long, David G.
1989-01-01
The NASA scatterometer (NSCAT) is a spaceborne scatterometer scheduled to be deployed in the mid-1990s. An analysis of the wind retrieval error distribution for wind estimates based on backscatter measurements made by the NSCAT instrument is presented. The results are based on an end-to-end simulation of the scatterometer instrument and data processing. In general, the distribution of the wind speed error, when normalized, is independent of the true wind speed and direction. The wind speed error can be characterized by a normal distribution. The wind direction error is independent of the true wind speed, but depends on the true wind direction. Details for wind vectors with true wind speeds from 3 m/s to 33 m/s and true wind directions from 0 to 360 deg are presented.
Parton distributions: determining probabilities in a space of functions
Ball, Richard D.
2011-01-01
We discuss the statistical properties of parton distributions within the framework of the NNPDF methodology. We present various tests of statistical consistency, in particular that the distribution of results does not depend on the underlying parametrization and that it behaves according to Bayes' theorem upon the addition of new data. We then study the dependence of results on consistent or inconsistent datasets and present tools to assess the consistency of new data. Finally we estimate the relative size of the PDF uncertainty due to data uncertainties, and that due to the need to infer a functional form from a finite set of data.
Size effect on strength and lifetime probability distributions of ...
Indian Academy of Sciences (India)
ments and ships, as well as computer circuits, chips and MEMS, should be designed for failure ... The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution. ... A probabilistic theory was recently developed to model.
Size effect on strength and lifetime probability distributions of ...
Indian Academy of Sciences (India)
The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle ...
Cosmological constraints from the convergence 1-point probability distribution
Patton, Kenneth; Blazek, Jonathan; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J.; Suchyta, Eric
2017-11-01
We examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast l-picola simulations and a Fisher analysis. We find competitive constraints in the Ωm-σ8 plane from the convergence PDF with 188 arcmin2 pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ …). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2-3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.
Asymmetric Phosphatidylethanolamine Distribution Controls Fusion Pore Lifetime and Probability.
Kreutzberger, Alex J B; Kiessling, Volker; Liang, Binyong; Yang, Sung-Tae; Castle, J David; Tamm, Lukas K
2017-11-07
Little attention has been given to how the asymmetric lipid distribution of the plasma membrane might facilitate fusion pore formation during exocytosis. Phosphatidylethanolamine (PE), a cone-shaped phospholipid, is predominantly located in the inner leaflet of the plasma membrane and has been proposed to promote membrane deformation and stabilize fusion pores during exocytotic events. To explore this possibility, we modeled exocytosis using plasma membrane SNARE-containing planar-supported bilayers and purified neuroendocrine dense core vesicles (DCVs) as fusion partners, and we examined how different PE distributions between the two leaflets of the supported bilayers affected SNARE-mediated fusion. Using total internal reflection fluorescence microscopy, the fusion of single DCVs with the planar-supported bilayer was monitored by observing DCV-associated neuropeptide Y tagged with a fluorescent protein. The time-dependent line shape of the fluorescent signal enables detection of DCV docking, fusion-pore opening, and vesicle collapse into the planar membrane. Four different distributions of PE in the planar bilayer mimicking the plasma membrane were examined: exclusively in the leaflet facing the DCVs; exclusively in the opposite leaflet; equally distributed in both leaflets; and absent from both leaflets. With PE in the leaflet facing the DCVs, overall fusion was most efficient and the extended fusion pore lifetime (0.7 s) enabled notable detection of content release preceding vesicle collapse. All other PE distributions decreased fusion efficiency, altered pore lifetime, and reduced content release. With PE exclusively in the opposite leaflet, resolution of pore opening and content release was lost.
Extreme points of the convex set of joint probability distributions with ...
Indian Academy of Sciences (India)
By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.
Principal modes of variation of rain-rate probability distributions
Bell, Thomas L.; Suhasini, R.
1994-01-01
Radar or satellite observations of an area generate sequences of rain-rate maps. From a gridded map a histogram of rain rates can be obtained representing the relative areas occupied by rain rates of various strengths. The histograms vary with time as precipitating systems in the area evolve and decay and amounts of convective and stratiform rain in the area change. A method of decomposing the histograms into linear combinations of a few empirical distributions with time-dependent coefficients is developed, using principal component analysis as a starting point. When applied to a tropical Atlantic dataset (GATE), two distributions emerge naturally from the analysis, resembling stratiform and convective rain-rate distributions in that they peak at low and high rain rates, respectively. The two 'modes' have different timescales and only the high-rain-rate mode has a statistically significant diurnal cycle. The ability of just two modes to describe rain variability over an area can explain why methods of estimating area-averaged rain rate from the area covered by rain rates above a certain threshold are so successful.
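The decomposition of time-varying histograms into a few empirical modes can be sketched with an SVD (Python with NumPy; the two synthetic base histograms merely mimic stratiform-like and convective-like shapes, they are not GATE data):

```python
import numpy as np

rng = np.random.default_rng(7)

# Two base rain-rate histograms over 30 rain-rate bins: one peaking at low
# rates ("stratiform-like"), one at high rates ("convective-like").
bins = np.arange(30)
strat = np.exp(-0.5 * ((bins - 5) / 3.0) ** 2)
conv = np.exp(-0.5 * ((bins - 20) / 4.0) ** 2)
strat /= strat.sum()
conv /= conv.sum()

# Time series of area histograms: each map mixes varying amounts of the two.
a = rng.random(100)
b = rng.random(100)
hists = np.outer(a, strat) + np.outer(b, conv)

# Principal component analysis of the mean-removed histograms via SVD.
anom = hists - hists.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = (s**2) / np.sum(s**2)

print(explained[:3])   # two modes carry essentially all the variance here
```

Because the synthetic histograms are exact two-component mixtures, two modes reconstruct them almost perfectly; in the real data the interesting finding is that two modes already suffice.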
Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf
2016-04-01
In this study we examined and compared three different probability distribution methods for determining the most suitable model in probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distribution methods in this region.
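The fit-and-compare procedure can be sketched as follows (Python with SciPy; the synthetic Frechet sample replaces the earthquake catalogue, and model selection here uses the K-S statistic directly rather than the paper's full workflow):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Stand-in for inter-event data; the actual catalogue is not reproduced here.
data = stats.invweibull.rvs(c=3.0, scale=10.0, size=2000, random_state=rng)

candidates = {
    "Weibull": (stats.weibull_min, {"floc": 0}),   # two-parameter Weibull
    "Frechet": (stats.invweibull, {"floc": 0}),    # Frechet = inverse Weibull
    "3-par Weibull": (stats.weibull_min, {}),      # location left free
}

results = {}
for name, (dist, kw) in candidates.items():
    params = dist.fit(data, **kw)
    stat, _ = stats.kstest(data, dist.name, args=params)
    results[name] = stat

best = min(results, key=results.get)   # smallest K-S statistic wins
print(results, "->", best)
```

With a true Frechet sample the Frechet fit attains the smallest K-S statistic; on real inter-event times the ranking is an empirical question, which is the point of the comparison.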
Energy Technology Data Exchange (ETDEWEB)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)
2016-04-18
In this study we examined and compared three different probability distribution methods to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times under the three distributions. We used EasyFit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
Probability Measure of Navigation pattern prediction using Poisson Distribution Analysis
Dr. V. Valli Mayil; Ms. R. Rooba; Ms. C. Parimala
2012-01-01
The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web Usage Mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data which are collected by web servers c...
Turbulent plane Couette flow using probability distribution functions
Srinivasan, R.; Giddens, D. P.; Bangert, L. H.; Wu, J. C.
1977-01-01
A numerical scheme employing a combination of the discrete ordinate method and finite differences is developed for solving the one-dimensional form of Lundgren's (1967) model equation for turbulent plane Couette flow. The approach used requires no a priori assumption about the form of the turbulent distribution function, and the numerical solution is obtained directly from the governing differential equations. Two different types of boundary conditions (zero-gradient and Chapman-Enskog) for the distribution function are evaluated by comparing the numerical results with experimental data. It is found that: (1) the present approach gives convergent and stable results over a wide range of Reynolds numbers; (2) Lundgren's equation yields results that compare well with experimental data for mean velocity and skin friction in the case of simple Couette flow; (3) the zero-gradient boundary condition leads to a logarithmic flow profile; and (4) the Chapman-Enskog boundary condition provides very good agreement with experimental data when applied within the near-wall region.
Labudda, Kirsten; Woermann, Friedrich G; Mertens, Markus; Pohlmann-Eden, Bernd; Markowitsch, Hans J; Brand, Matthias
2008-06-01
Recent functional neuroimaging and lesion studies demonstrate the involvement of the orbitofrontal/ventromedial prefrontal cortex as a key structure in decision making processes. This region seems to be particularly crucial when contingencies between options and consequences are unknown but have to be learned by the use of feedback following previous decisions (decision making under ambiguity). However, little is known about the neural correlates of decision making under risk conditions in which information about probabilities and potential outcomes is given. In the present study, we used functional magnetic resonance imaging to measure blood-oxygenation-level-dependent (BOLD) responses in 12 subjects during a decision making task. This task provided explicit information about probabilities and associated potential incentives. The responses were compared to BOLD signals in a control condition without information about incentives. In contrast to previous decision making studies, we completely removed the outcome phase following a decision to exclude the potential influence of feedback previously received on current decisions. The results indicate that the integration of information about probabilities and incentives leads to activations within the dorsolateral prefrontal cortex, the posterior parietal lobe, the anterior cingulate and the right lingual gyrus. We assume that this pattern of activation is due to the involvement of executive functions, conflict detection mechanisms and arithmetic operations during the deliberation phase of decisional processes that are based on explicit information.
Probability distribution analysis of observational extreme events and model evaluation
Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.
2016-12-01
Earth's surface temperatures in 2015 were the warmest since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016 and brought Guangzhou, the capital city of Guangdong province, its first snowfall in 67 years. To understand changes in extreme weather events and to project their future scenarios, this study uses statistical models to analyze multiple climate datasets. We first use the Granger-causality test to identify the attribution of the global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated on global climate observational and reanalysis data (1961-2010) and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration provides information about the time series of mean and extreme temperatures, but not vice versa. Based on our new tail index, we find that, beyond the mean and variance, skewness is an important indicator that should be considered in estimating extreme temperature changes and in model evaluation. Among the 12 climate model datasets we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index, which indicates that the model has a substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.
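A minimal sketch of computing the four moments on a synthetic daily-maximum-temperature sample (the study's tail index built from these moments is not specified in the abstract):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
tmax = rng.normal(loc=25.0, scale=5.0, size=10000)   # synthetic daily Tmax (deg C)

mean = tmax.mean()
var = tmax.var(ddof=1)
skew = stats.skew(tmax)
kurt = stats.kurtosis(tmax)       # excess kurtosis; ~0 for Gaussian data

print(f"mean={mean:.1f} var={var:.1f} skew={skew:.2f} kurt={kurt:.2f}")
```

For Gaussian data the skewness and excess kurtosis are near zero; systematic departures in these higher moments are precisely what tail-oriented indices target.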
Calculation of ruin probabilities for a dense class of heavy tailed distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady
2015-01-01
In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest... of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented, and the exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating with an infinite...
Calisto, H.; Bologna, M.
2007-05-01
We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. We also show that the mean value of x(t) in the latter model always approaches the value 1 asymptotically.
The probability distribution of the predicted CFM-induced ozone depletion. [Chlorofluoromethane
Ehhalt, D. H.; Chang, J. S.; Bulter, D. M.
1979-01-01
It is argued from the central limit theorem that the uncertainty in model-predicted changes of the ozone column density is best represented by a normal probability density distribution. This conclusion is validated by comparison with a probability distribution generated by a Monte Carlo technique. In the case of the CFM-induced ozone depletion, and based on the estimated uncertainties in the reaction rate coefficients alone, the relative mean standard deviation of this normal distribution is estimated to be 0.29.
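The Monte Carlo argument can be sketched as follows, with made-up rate-coefficient uncertainties rather than the paper's values: when an output depends multiplicatively on many independent lognormal factors, the relative change is approximately normal by the central limit theorem applied on the log scale.

```python
import numpy as np

rng = np.random.default_rng(2)
n_coeffs, n_samples = 20, 50_000

# Each rate coefficient carries an independent lognormal uncertainty factor
# (log-standard-deviation chosen arbitrarily for illustration).
log_sd = 0.07
log_factors = rng.normal(0.0, log_sd, size=(n_samples, n_coeffs))

# Predicted ozone-column change modeled as a product of the factors.
depletion = np.exp(log_factors.sum(axis=1))

rel_sd = float(depletion.std() / depletion.mean())
print(round(rel_sd, 2))
```

The sum of 20 independent log-terms is nearly Gaussian, so the sampled depletion factor is close to lognormal with a small spread, and its relative standard deviation summarizes the propagated uncertainty.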
Parameter-free testing of the shape of a probability distribution.
Broom, M; Nouvellet, P; Bacon, J P; Waxman, D
2007-01-01
The Kolmogorov-Smirnov test determines the consistency of empirical data with a particular probability distribution. Often, parameters in the distribution are unknown, and have to be estimated from the data. In this case, the Kolmogorov-Smirnov test depends on the form of the particular probability distribution under consideration, even when the estimated parameter-values are used within the distribution. In the present work, we address a less specific problem: to determine the consistency of data with a given functional form of a probability distribution (for example the normal distribution), without enquiring into values of unknown parameters in the distribution. For a wide class of distributions, we present a direct method for determining whether empirical data are consistent with a given functional form of the probability distribution. This utilizes a transformation of the data. If the data are from the class of distributions considered here, the transformation leads to an empirical distribution with no unknown parameters, and hence is susceptible to a standard Kolmogorov-Smirnov test. We give some general analytical results for some of the distributions from the class of distributions considered here. The significance level and power of the tests introduced in this work are estimated from simulations. Some biological applications of the method are given.
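The paper's specific data transformation is not reproduced here; the following sketch shows the generic problem it addresses: a KS test against a normal whose parameters are estimated from the same data needs a simulation-calibrated null (a Lilliefors-type test), because the textbook KS critical values no longer apply.

```python
import numpy as np
from scipy import stats

def ks_normal_stat(x):
    """KS distance between data and a normal with parameters fitted from the data."""
    return stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))).statistic

rng = np.random.default_rng(3)
x = rng.normal(10.0, 2.0, size=100)          # data actually drawn from a normal
d_obs = ks_normal_stat(x)

# Calibrate the null distribution of the statistic by simulation over
# datasets that satisfy the hypothesized functional form.
null = np.array([ks_normal_stat(rng.normal(size=100)) for _ in range(2000)])
p_value = float((null >= d_obs).mean())
print(0.0 <= p_value <= 1.0)
```

The transformation approach in the paper removes the unknown parameters exactly, so the standard KS null applies; the simulation above is the brute-force alternative it improves upon.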
Collective motions of globally coupled oscillators and some probability distributions on circle
Energy Technology Data Exchange (ETDEWEB)
Jaćimović, Vladimir [Faculty of Natural Sciences and Mathematics, University of Montenegro, Cetinjski put, bb., 81000 Podgorica (Montenegro); Crnkić, Aladin, E-mail: aladin.crnkic@hotmail.com [Faculty of Technical Engineering, University of Bihać, Ljubijankićeva, bb., 77000 Bihać, Bosnia and Herzegovina (Bosnia and Herzegovina)
2017-06-28
In 2010 Kato and Jones described a new family of probability distributions on the circle, obtained as a Möbius transformation of the von Mises distribution. We present a model demonstrating that these distributions appear naturally in the study of populations of coupled oscillators. We use this opportunity to point out certain relations between Directional Statistics and the collective motion of coupled oscillators. - Highlights: • We specify probability distributions on the circle that arise in the Kuramoto model. • We study how the mean-field coupling affects the shape of the distribution of phases. • We discuss potential applications in some experiments on the cell cycle. • We apply Directional Statistics to study the collective dynamics of coupled oscillators.
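The Kato-Jones construction can be sketched directly: sample von Mises angles and push them through a Möbius transformation of the unit circle (parameter values here are illustrative only).

```python
import numpy as np

rng = np.random.default_rng(8)
theta = rng.vonmises(mu=0.0, kappa=2.0, size=50_000)   # von Mises angles

# Mobius transformation of the unit circle: z -> (z + a) / (1 + conj(a) z),
# with |a| < 1 so the circle maps onto itself.
a = 0.4 + 0.2j
z = np.exp(1j * theta)
w = (z + a) / (1.0 + np.conj(a) * z)
phi = np.angle(w)                                      # transformed angles

print(bool(np.all(np.abs(np.abs(w) - 1.0) < 1e-12)))
```

The transformed sample `phi` follows a Kato-Jones-type distribution: still circular, but with the skewness and peakedness that the Möbius parameter `a` introduces.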
The exact probability distribution of the rank product statistics for replicated experiments.
Eisinga, Rob; Breitling, Rainer; Heskes, Tom
2013-03-18
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
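The rank product statistic itself is straightforward to compute; this sketch (synthetic data, not the paper's exact-distribution derivation) ranks genes within each replicate and combines the ranks by their geometric mean.

```python
import numpy as np

rng = np.random.default_rng(4)
n_genes, k = 100, 5
expr = rng.normal(size=(n_genes, k))         # null expression changes
expr[0] += 3.0                               # one strongly up-regulated gene

# Rank within each replicate (1 = most up-regulated), then combine the
# replicates through the geometric mean of the ranks.
ranks = np.argsort(np.argsort(-expr, axis=0), axis=0) + 1
rank_product = ranks.prod(axis=1) ** (1.0 / k)

print(int(np.argmin(rank_product)))
```

Genes with consistently small ranks across replicates get a small rank product; the paper's contribution is the exact null distribution of this statistic, replacing permutation or gamma approximations in the tails.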
Some possible q-exponential type probability distribution in the non-extensive statistical physics
Chung, Won Sang
2016-08-01
In this paper, we present two exponential type probability distributions which are different from Tsallis's case, which we call Type I: one given by p_i = (1/Z_q)[e_q(E_i)]^(-β) (Type IIA) and another given by p_i = (1/Z_q)[e_q(-β)]^(E_i) (Type IIIA). Starting with the Boltzmann-Gibbs entropy, we obtain the different probability distributions by using the Kolmogorov-Nagumo average for the microstate energies. We present the first-order differential equations related to Types I, II and III. For the three types of probability distributions, we discuss the quantum harmonic oscillator, the two-level problem and the spin-1/2 paramagnet.
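A sketch of the q-deformed exponential underlying these distributions, e_q(x) = [1 + (1-q)x]^(1/(1-q)), which recovers exp(x) as q → 1; the Type III-style weights below use illustrative β and q values, not ones from the paper.

```python
import numpy as np

def q_exp(x, q):
    """q-deformed exponential e_q(x) = [1 + (1-q)x]^(1/(1-q)); exp(x) at q = 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # Standard cutoff: e_q(x) = 0 where the bracket is non-positive.
    return np.where(base > 0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

xs = np.linspace(-1.0, 1.0, 5)
limit_ok = bool(np.allclose(q_exp(xs, 1.0001), np.exp(xs), rtol=1e-3))

# Type III-style weights p_i proportional to [e_q(-beta)]^(E_i),
# for a two-level system with energies E = {0, 1}.
beta, q = 1.0, 0.9
wts = q_exp(-beta, q) ** np.array([0.0, 1.0])
p = wts / wts.sum()
print(limit_ok, np.round(p, 3))
```

The cutoff in `q_exp` is the usual convention for keeping the deformed exponential real and non-negative; the normalization plays the role of 1/Z_q in the abstract's expressions.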
Characterization of chaotic maps using the permutation Bandt-Pompe probability distribution
Rosso, Osvaldo A.; Olivares, Felipe; Zunino, Luciano; De Micco, Luciana; Aquino, André L. L.; Plastino, Angelo; Larrondo, Hilda A.
2013-04-01
By appealing to a long list of different nonlinear maps we review the characterization of time series arising from chaotic maps. The main tool for this characterization is the permutation Bandt-Pompe probability distribution function. We focus attention on both local and global characteristics of the components of this probability distribution function. We show that forbidden ordinal patterns (local quantifiers) exhibit exponential growth over the pattern-length range 3 ≤ D ≤ 8 in the case of finite time series data. Indeed, there is a minimum value Dmin such that forbidden patterns cannot appear for D < Dmin; global quantifiers are in turn evaluated from the full permutation Bandt-Pompe probability distribution function.
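The Bandt-Pompe distribution and forbidden-pattern count can be sketched as follows, using the fully chaotic logistic map as a standard test case (it is known to admit only five of the six length-3 ordinal patterns, the strictly decreasing pattern being forbidden).

```python
from itertools import permutations
import numpy as np

def bandt_pompe(x, D):
    """Ordinal-pattern (Bandt-Pompe) probability distribution of a time series."""
    counts = {p: 0 for p in permutations(range(D))}
    for i in range(len(x) - D + 1):
        counts[tuple(np.argsort(x[i:i + D]))] += 1
    total = len(x) - D + 1
    probs = {p: c / total for p, c in counts.items()}
    forbidden = sum(c == 0 for c in counts.values())
    return probs, forbidden

# Fully chaotic logistic map x -> 4x(1-x).
x = np.empty(10000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

probs, forbidden = bandt_pompe(x, D=3)
print(forbidden)
```

The pattern frequencies in `probs` are the components the abstract's global quantifiers are built from, while the `forbidden` count is the local quantifier whose growth with D the paper studies.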
Liu, En-Bin; Tang, Meng-Ping; Shi, Yong-Jun; Zhou, Guo-Mo; Li, Yong-Fu
2009-11-01
To address the deficiencies in research on probability distribution models for tree measurement factors in mixed forests, a joint maximum entropy probability density function was put forward, based on the maximum entropy principle. This function has three characteristics: 1) each element of the function is linked to the maximum entropy function, so the function can integrate information about the probability distributions of the measurement factors of the main tree species in mixed forests; 2) the function has a double-weight probability expression, making it possible to reflect the complex structure of mixed forests and to accurately and completely describe the probability distribution of tree measurement factors by fully using the information about the measurement factors of the main tree species; and 3) the joint maximum entropy probability density function is succinct in structure and excellent in performance. The model was applied and tested in two sampling plots in the Tianmu Mountain Nature Reserve. The fitting precision (R2 = 0.9655) and testing accuracy (R2 = 0.9772) were both high, suggesting that this model can be used as a probability distribution model for tree measurement factors in mixed forests, and it provides a feasible method for fully understanding the comprehensive structure of mixed forests.
Ching, Wai-Ki; Zhang, Shuqin; Ng, Michael K; Akutsu, Tatsuya
2007-06-15
Probabilistic Boolean networks (PBNs) have been proposed to model genetic regulatory interactions. The steady-state probability distribution of a PBN gives important information about the captured genetic network. The computation of the steady-state probability distribution usually includes construction of the transition probability matrix and computation of the steady-state probability distribution. The size of the transition probability matrix is 2^n-by-2^n where n is the number of genes in the genetic network. Therefore, the computational costs of these two steps are very expensive and it is essential to develop a fast approximation method. In this article, we propose an approximation method for computing the steady-state probability distribution of a PBN based on neglecting some Boolean networks (BNs) with very small probabilities during the construction of the transition probability matrix. An error analysis of this approximation method is given and theoretical result on the distribution of BNs in a PBN with at most two Boolean functions for one gene is also presented. These give a foundation and support for the approximation method. Numerical experiments based on a genetic network are given to demonstrate the efficiency of the proposed method.
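A toy illustration of the steady-state computation (not the paper's approximation scheme): build the 2^n-by-2^n transition matrix of a two-gene PBN with two hypothetical constituent Boolean networks, then power-iterate.

```python
import numpy as np

n = 2
states = [(a, b) for a in (0, 1) for b in (0, 1)]

# Two constituent Boolean networks and their selection probabilities
# (both networks are invented for this example).
bn1 = lambda a, b: (b, a or b)
bn2 = lambda a, b: (a and b, 1 - a)
networks = [(bn1, 0.7), (bn2, 0.3)]

# Build the 2^n-by-2^n transition probability matrix.
P = np.zeros((2 ** n, 2 ** n))
for i, s in enumerate(states):
    for f, prob in networks:
        P[i, states.index(f(*s))] += prob

# Steady-state distribution by power iteration from the uniform distribution.
pi = np.full(2 ** n, 1.0 / 2 ** n)
for _ in range(500):
    pi = pi @ P

print(np.round(pi, 3))
```

The cost bottleneck the paper targets is visible even here: the state space, and hence P, doubles in each dimension with every added gene, which is what motivates dropping constituent BNs of very small probability.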
Ribereau, Pierre; Masiello, Esterina; Naveau, Philippe
2014-01-01
Following the work of Azzalini ([2] and [3]) on the skew normal distribution, we propose an extension of the Generalized Extreme Value (GEV) distribution, the SGEV. This new distribution allows for a better fit of maxima and can be interpreted as both the distribution of maxima when maxima are taken on dependent data and when maxima are taken over a random block size. We propose to estimate the parameters of the SGEV distribution via the Probability Weighted Moments meth...
Traceable accounts of subjective probability judgments in the IPCC and beyond
Baer, P. G.
2012-12-01
One of the major sources of controversy surrounding the reports of the IPCC has been the characterization of uncertainty. Although arguably the IPCC has paid more attention to the process of uncertainty analysis and communication than any comparable assessment body, its efforts to achieve consistency have produced mixed results. In particular, the extensive use of subjective probability assessment has attracted widespread criticism. Statements such as "Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years" are ubiquitous (one online database lists nearly 3000 such claims), and indeed are the primary way in which its key "findings" are reported. Much attention is drawn to the precise quantitative definition of such statements (e.g., "very likely" means >90% probability, vs. "extremely likely", which means >95% certainty). But there is no process by which the decision regarding the choice of such an uncertainty level for a given finding is formally made or reported, and thus the statements are easily disputed by anyone, expert or otherwise, who disagrees with the assessment. In the "Uncertainty Guidance Paper" for the Third Assessment Report, Richard Moss and Steve Schneider defined the concept of a "traceable account," which gave exhaustive detail regarding how one ought to provide documentation of such an uncertainty assessment. But the guidance, while appearing straightforward and reasonable, was in fact an unworkable recipe, which would have taken near-infinite time if used for more than a few key results, and would have required a different structuring of the text than the conventional scientific assessment. And even then it would have left a gap when it came to the actual provenance of any such specific judgments, because there simply is no formal step at which individuals turn their knowledge of the evidence on some finding into a probability judgment. The
Probability distribution of the time-averaged mean-square displacement of a Gaussian process.
Grebenkov, Denis S
2011-09-01
We study the probability distribution of the time-averaged mean-square displacement of a discrete Gaussian process. An empirical approximation for the probability density is suggested and numerically validated for fractional Brownian motion. The optimality of quadratic forms for inferring dynamical and microrheological quantities from individual random trajectories is discussed, with emphasis on a reliable interpretation of single-particle tracking experiments.
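A sketch of the time-averaged mean-square displacement for a single trajectory, the quantity whose distribution the paper studies; ordinary Brownian motion is used here instead of general fractional Brownian motion.

```python
import numpy as np

def tamsd(x, lag):
    """Time-averaged squared displacement of one trajectory at a given lag."""
    d = x[lag:] - x[:-lag]
    return float((d * d).mean())

rng = np.random.default_rng(5)
dt, n = 0.01, 100_000
x = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=n))   # discrete Brownian path

# For this Brownian path the expected TAMSD at a given lag is lag*dt,
# so the ratios below should sit near 1 for a long trajectory.
ratios = [tamsd(x, L) / (L * dt) for L in (1, 2, 4, 8)]
print(np.round(ratios, 2))
```

For a single long Brownian trajectory the TAMSD concentrates around its mean; the paper's object of study is precisely the distribution of this quantity across trajectories, where fractional correlations make the spread non-trivial.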
A measure of mutual divergence among a number of probability distributions
Directory of Open Access Journals (Sweden)
J. N. Kapur
1987-01-01
major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained of the mutual divergence among two or more probability distributions.
Energy Technology Data Exchange (ETDEWEB)
Huang Zhifu [Department of Physics, Xiamen University, Xiamen 361005 (China); Lin Bihong [Department of Physics, Xiamen University, Xiamen 361005 (China); Department of Physics, Quanzhou Normal University, Quanzhou 362000 (China); Chen Jincan [Department of Physics, Xiamen University, Xiamen 361005 (China)], E-mail: jcchen@xmu.edu.cn
2009-05-15
In order to overcome the limitations of the original expression of the probability distribution appearing in literature of Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant through uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.
Further Evidence That the Effects of Repetition on Subjective Time Depend on Repetition Probability.
Skylark, William J; Gheorghiu, Ana I
2017-01-01
Repeated stimuli typically have shorter apparent duration than novel stimuli. Most explanations for this effect have attributed it to the repeated stimuli being more expected or predictable than the novel items, but an emerging body of work suggests that repetition and expectation exert distinct effects on time perception. The present experiment replicated a recent study in which the probability of repetition was varied between blocks of trials. As in the previous work, the repetition effect was smaller when repeats were common (and therefore more expected) than when they were rare. These results add to growing evidence that, contrary to traditional accounts, expectation increases apparent duration whereas repetition compresses subjective time, perhaps via a low-level process like adaptation. These opposing processes can be seen as instances of a more general "processing principle," according to which subjective time is a function of the perceptual strength of the stimulus representation, and therefore depends on a confluence of "bottom-up" and "top-down" variables.
Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas
2017-01-01
The notion of Sinkhorn divergence has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high-dimension. For data sampled from one or two unknown probability distributions, we derive central limit theo...
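The computational core of the Sinkhorn divergence, the entropically regularized transport plan between two discrete distributions, can be sketched as follows (illustrative cost matrix and marginals; the paper's central-limit analysis is not reproduced).

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.5, n_iter=1000):
    """Entropically regularized transport plan between distributions a and b."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Two distributions on a 5-point metric space with squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
b = np.array([0.3, 0.3, 0.2, 0.1, 0.1])

P = sinkhorn_plan(a, b, C)
reg_cost = float((P * C).sum())
print(np.allclose(P.sum(axis=1), a), np.allclose(P.sum(axis=0), b))
```

Each iteration only rescales rows and columns of the fixed kernel K, which is what makes the regularized distance fast compared to exact optimal transport; smaller `eps` approaches the unregularized Wasserstein cost but slows convergence.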
Majumdar, A. K.
1979-01-01
Expressions are derived for higher-order skewness and excess coefficients using central moments and cumulants up to 8th order. These coefficients are then calculated for three probability distributions: (1) log-normal, (2) Rice-Nakagami, and (3) Gamma distributions. Curves are given to show the variation of skewness with excess coefficients for these distributions. These curves are independent of the particular distribution parameters. This method is useful for studying fluctuating phenomena that obey non-Gaussian statistics.
Directory of Open Access Journals (Sweden)
Changhao Fan
2017-01-01
In modeling, only the information from the deviation between the output of the support vector regression (SVR) model and the training sample is usually considered, whereas other prior information about the training sample, such as probability distribution information, is ignored. Probability distribution information describes the overall distribution of the sample data in a training sample that contains different degrees of noise and potential outliers, and it helps develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, weight SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is considered as its weight and is then introduced into the error coefficient and slack variables of SVR. Thus, both the deviation and the probability distribution information of the training sample are used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with that of three other SVR-based methods. The results showed that PDISVR performs better than the three other methods.
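A sketch of the weighting idea using off-the-shelf tools rather than the paper's PDISVR formulation: scikit-learn's SVR accepts per-sample weights in fit(), and a kernel density estimate stands in here for the probability-distribution weights.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.svm import SVR

rng = np.random.default_rng(6)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))[:, None]
y = np.sin(X.ravel()) + rng.normal(0.0, 0.1, 200)
y[::25] += 3.0                                   # inject a few outliers

# Weight each sample by the estimated density of its (x, y) pair; outliers
# sit in low-density regions and therefore receive small weights.
xy = np.column_stack([X.ravel(), y])
w = np.exp(KernelDensity(bandwidth=0.5).fit(xy).score_samples(xy))
w /= w.max()

model = SVR(kernel="rbf", C=10.0).fit(X, y, sample_weight=w)
median_abs_resid = float(np.median(np.abs(y - model.predict(X))))
print(median_abs_resid < 0.5)
```

Down-weighting low-density points reduces the pull of the injected outliers on the fit, which is the same mechanism PDISVR builds into the error coefficient and slack variables.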
Crossing probability for directed polymers in random media. II. Exact tail of the distribution.
De Luca, Andrea; Le Doussal, Pierre
2016-03-01
We study the probability p ≡ p(η)(t) that two directed polymers in a given random potential η and with fixed and nearby endpoints do not cross until time t. This probability is itself a random variable (over samples η), which, as we show, acquires a very broad probability distribution at large time. In particular, the moments of p are found to be dominated by atypical samples where p is of order unity. Building on a formula established by us in a previous work using nested Bethe ansatz and Macdonald process methods, we obtain analytically the leading large-time behavior of all moments, p^m ≃ γ_m/t. From this, we extract the exact tail ∼ ρ(p)/t of the probability distribution of the noncrossing probability at large time. The exact formula is compared to numerical simulations, with excellent agreement.
Nam, Sungsik
2010-11-01
Previous work on performance analyses of generalized selection combining (GSC) RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers remain to be solved, such as an assessment of the impact of self-interference on their performance. To obtain a full and exact understanding of the performance of GSC RAKE receivers, their outage probability needs to be expressed in closed form. The major difficulty in this problem is deriving some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
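The single-integral form can be evaluated directly with the trapezoidal method the study proposes. P(slip) = P(available < required) = ∫ f_req(x) F_avail(x) dx; the normal parameters below are illustrative, not values from the paper.

```python
import numpy as np
from scipy import stats

# Illustrative (not the paper's) friction-coefficient distributions.
req = stats.norm(0.20, 0.03)       # required friction coefficient
avail = stats.norm(0.45, 0.10)     # available friction coefficient

# Single-integral form evaluated with the composite trapezoidal rule.
x = np.linspace(0.0, 1.0, 2001)
g = req.pdf(x) * avail.cdf(x)
dx = x[1] - x[0]
p_slip = float((g[:-1] + g[1:]).sum() * dx / 2.0)

# Closed-form cross-check, valid only for two independent normals:
# P(A < R) = Phi((mu_R - mu_A) / sqrt(s_R^2 + s_A^2)).
exact = float(stats.norm.cdf((0.20 - 0.45) / np.hypot(0.03, 0.10)))
print(round(p_slip, 4), round(exact, 4))
```

The integral form accepts any pair of distributions for the two frictions, which is the study's point: the normal-normal closed form above is only a special case used here to verify the quadrature.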
WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS
Directory of Open Access Journals (Sweden)
Mr. Vladimir A. Smagin
2016-12-01
The Wiener-Hopf solver with smooth probability distributions of its components is presented. The method is based on hyper-delta approximations of the initial distributions. The use of the Fourier series transformation and the characteristic function allows working with the random variable method concentrated in transversal axis of absc.
Hameren, Andreas Ferdinand Willem van
2001-01-01
Discrepancies play an important role in the study of uniformity properties of point sets. Their probability distributions are a help in the analysis of the efficiency of the Quasi Monte Carlo method of numerical integration, which uses point sets that are distributed more uniformly than sets of
Probability distribution of long-run indiscriminate felling of trees in ...
African Journals Online (AJOL)
The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...
Marshman, Emily; Singh, Chandralekha
2017-03-01
A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations.
Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad
2017-10-01
The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method performed similarly for regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters which affect ET0 can affect the distribution of reference evapotranspiration.
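The PPCC selection step above can be sketched numerically: the probability plot correlation coefficient is simply the correlation between the ordered sample and the theoretical quantiles of a candidate distribution at plotting positions. The snippet below is an illustrative sketch on synthetic data, not the study's station records; the sample size, skew value and Hazen plotting positions are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic "annual ET0" sample (mm/yr); illustrative only, not station data.
data = stats.pearson3.rvs(skew=1.0, loc=1200, scale=150, size=55, random_state=rng)

def ppcc(sample, dist, **shape):
    """PPCC: correlation between the ordered sample and the theoretical
    quantiles of `dist` at Hazen plotting positions (i - 0.5) / n."""
    n = len(sample)
    pp = (np.arange(1, n + 1) - 0.5) / n
    return np.corrcoef(np.sort(sample), dist.ppf(pp, **shape))[0, 1]

# The best-fitting candidate has PPCC closest to 1.
for name, dist, shape in [("Pearson III", stats.pearson3, {"skew": 1.0}),
                          ("Normal", stats.norm, {}),
                          ("Gumbel", stats.gumbel_r, {})]:
    print(f"{name:12s} PPCC = {ppcc(data, dist, **shape):.4f}")
```

Because correlation is invariant to location and scale, only the shape parameter of each candidate needs to be specified.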
Score distributions of gapped multiple sequence alignments down to the low-probability tail
Fieth, Pascal; Hartmann, Alexander K.
2016-08-01
Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical-mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments, which showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pairs scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.
Evaluation of probability distributions for concentration fluctuations in a building array
Efthimiou, G. C.; Andronopoulos, S.; Bartzis, J. G.
2017-10-01
The wide range of values observed in a measured concentration time series after the release of a dispersing airborne pollutant from a point source in the atmospheric boundary layer, and the hazard level associated with the peak values, demonstrate the necessity of predicting the concentration probability distribution. For this, statistical models describing the probability of occurrence are preferably employed. In this paper a concentration database pertaining to a field experiment of dispersion in an urban-like area (MUST experiment) from a continuously emitting source is used to select the better-performing of two statistical models, the Gamma and the Beta distributions. The skewness and the kurtosis, as well as the inverses of the cumulative distribution function, were compared between the two statistical models and the experiment. The evaluation is performed with validation metrics such as the Fractional Bias (FB), the Normalized Mean Square Error (NMSE) and the factor-of-2 percentage. The Beta probability distribution agreed with the experimental results better than the Gamma probability distribution, except for the 25th percentile. Also, according to significance tests using the BOOT software, the Beta model presented FB and NMSE values that are statistically different from those of the Gamma model, except for the 75th percentiles and the FB of the 99th percentiles. The effect of stability conditions and source heights on the performance of the statistical models is also examined. In both cases the performance of the Beta distribution was slightly better than that of the Gamma.
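The validation metrics named above have standard closed forms; a minimal sketch (with made-up observation/prediction vectors, not the MUST data) might look like:

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2 (mean_obs - mean_pred) / (mean_obs + mean_pred); 0 is perfect."""
    return 2.0 * (np.mean(obs) - np.mean(pred)) / (np.mean(obs) + np.mean(pred))

def nmse(obs, pred):
    """Normalized Mean Square Error; 0 is perfect."""
    return np.mean((np.asarray(obs) - np.asarray(pred)) ** 2) / (np.mean(obs) * np.mean(pred))

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of the observations."""
    ratio = np.asarray(pred) / np.asarray(obs)
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

# Made-up concentration percentiles (observed vs. modelled), for illustration.
obs = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([1.2, 1.8, 5.0, 6.0])
print(f"FB = {fractional_bias(obs, pred):.3f}, "
      f"NMSE = {nmse(obs, pred):.3f}, FAC2 = {fac2(obs, pred):.2f}")
```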
Computer simulation of random variables and vectors with arbitrary probability distribution laws
Bogdan, V. M.
1981-01-01
Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
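The construction described here is, in essence, a chain of conditional inverse-CDF transforms. A minimal two-dimensional sketch for a hypothetical target law (X exponential with rate 1, and Y uniform on (0, X) given X) could be:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pair(u1, u2):
    """Recursive construction for a hypothetical bivariate law:
    X ~ Exp(1), and conditionally Y | X = x ~ Uniform(0, x).
    x = f1(u1) inverts the marginal CDF; y = f2(u1, u2) inverts the
    conditional CDF of Y given X = f1(u1)."""
    x = -np.log1p(-u1)   # inverse of F_X(x) = 1 - exp(-x)
    y = u2 * x           # inverse of F_{Y|X}(y) = y / x on (0, x)
    return x, y

u1, u2 = rng.random(100_000), rng.random(100_000)
x, y = simulate_pair(u1, u2)
print(f"E[X] = {x.mean():.3f} (theory: 1.0)")
print(f"E[Y] = {y.mean():.3f} (theory: E[X]/2 = 0.5)")
```

The same idea extends recursively to n dimensions: each coordinate inverts its conditional CDF given the previously generated ones.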
Pendrin gene ablation alters ENaC subcellular distribution and open probability.
Pech, Vladimir; Wall, Susan M; Nanami, Masayoshi; Bao, Hui-Fang; Kim, Young Hee; Lazo-Fernandez, Yoskaly; Yue, Qiang; Pham, Truyen D; Eaton, Douglas C; Verlander, Jill W
2015-07-15
The present study explored whether the intercalated cell Cl(-)/HCO3(-) exchanger pendrin modulates epithelial Na(+) channel (ENaC) function by changing channel open probability and/or channel density. To do so, we measured ENaC subunit subcellular distribution by immunohistochemistry, single channel recordings in split open cortical collecting ducts (CCDs), as well as transepithelial voltage and Na(+) absorption in CCDs from aldosterone-treated wild-type and pendrin-null mice. Because pendrin gene ablation reduced 70-kDa more than 85-kDa γ-ENaC band density, we asked if pendrin gene ablation interferes with ENaC cleavage. We observed that ENaC-cleaving protease application (trypsin) increased the lumen-negative transepithelial voltage in pendrin-null mice but not in wild-type mice, which raised the possibility that pendrin gene ablation blunts ENaC cleavage, thereby reducing open probability. In mice harboring wild-type ENaC, pendrin gene ablation reduced ENaC-mediated Na(+) absorption by reducing channel open probability as well as by reducing channel density through changes in subunit total protein abundance and subcellular distribution. Further experiments used mice with blunted ENaC endocytosis and degradation (Liddle's syndrome) to explore the significance of pendrin-dependent changes in ENaC open probability. In mouse models of Liddle's syndrome, pendrin gene ablation did not change ENaC subunit total protein abundance, subcellular distribution, or channel density, but markedly reduced channel open probability. We conclude that in mice harboring wild-type ENaC, pendrin modulates ENaC function through changes in subunit abundance, subcellular distribution, and channel open probability. In a mouse model of Liddle's syndrome, however, pendrin gene ablation reduces channel activity mainly through changes in open probability. Copyright © 2015 the American Physiological Society.
Subjective Belief Distributions and the Characterization of Economic Literacy
DEFF Research Database (Denmark)
Di Girolamo, Amalia; Harrison, Glenn W.; Lau, Morten
2015-01-01
We characterize the literacy of an individual in a domain by their elicited subjective belief distribution over the possible responses to a question posed in that domain. By eliciting the distribution, rather than just the answers to true/false or multiple choice questions, we can directly measur...
A note on the probability distribution function of the surface electromyogram signal.
Nazarpour, Kianoush; Al-Timemy, Ali H; Bugmann, Guido; Jackson, Andrew
2013-01-01
The probability density function (PDF) of the surface electromyogram (EMG) signals has been modelled with Gaussian and Laplacian distribution functions. However, a general consensus upon the PDF of the EMG signals is yet to be reached, because not only are there several biological factors that can influence this distribution function, but also different analysis techniques can lead to contradicting results. Here, we recorded the EMG signal at different isometric muscle contraction levels and characterised the probability distribution of the surface EMG signal with two statistical measures: bicoherence and kurtosis. Bicoherence analysis did not help to infer the PDF of measured EMG signals. In contrast, with kurtosis analysis we demonstrated that the EMG PDF at isometric, non-fatiguing, low contraction levels is super-Gaussian. Moreover, kurtosis analysis showed that as the contraction force increases the surface EMG PDF tends to a Gaussian distribution. Copyright © 2012 Elsevier Inc. All rights reserved.
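The kurtosis-based characterization used in this study can be illustrated on synthetic signals: scipy reports excess kurtosis, which is zero for a Gaussian and positive for a super-Gaussian (e.g., Laplacian) density. The signals below are simulated, not recorded EMG.

```python
import numpy as np
from scipy.stats import kurtosis, laplace, norm

rng = np.random.default_rng(1)
n = 200_000
gauss = norm.rvs(size=n, random_state=rng)    # stand-in for a Gaussian signal
lap = laplace.rvs(size=n, random_state=rng)   # stand-in for a Laplacian signal

# scipy's kurtosis is the excess kurtosis (0 for a Gaussian by definition);
# positive values indicate a super-Gaussian (peaked, heavy-tailed) PDF.
print(f"Gaussian excess kurtosis:  {kurtosis(gauss):+.3f} (theory: 0)")
print(f"Laplacian excess kurtosis: {kurtosis(lap):+.3f} (theory: +3)")
```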
Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin
2017-10-01
In word recognition semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection (DPSD). With an additional parameter, for the probability of false (lure) recollection the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution, using as data points either the lower, mid or upper bounds of the sampling intervals, as well as the cumulative distribution of observed values (with either maximum likelihood or non-linear least squares for parameter estimation); we then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) relative to the original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations, ranging from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates...
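Fitting a parametric distribution to the cumulative distribution of interval-censored data, as the best-performing method above does, can be sketched as follows; the lognormal parameters, interval width and sample size are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
# Hypothetical retention times (h) drawn from a known lognormal law.
true_s, true_scale = 0.6, 4.0
times = stats.lognorm.rvs(true_s, scale=true_scale, size=500, random_state=rng)

# Discretize into pre-established sampling intervals (interval censoring).
edges = np.arange(0.0, 26.0, 2.0)
counts, _ = np.histogram(times, bins=edges)
cum_prob = np.cumsum(counts) / len(times)  # empirical CDF at interval upper bounds

# Fit by non-linear least squares on the cumulative distribution.
def resid(params):
    s, scale = params
    return stats.lognorm.cdf(edges[1:], s, scale=scale) - cum_prob

fit = optimize.least_squares(resid, x0=[1.0, 1.0], bounds=([1e-6, 1e-6], np.inf))
s_hat, scale_hat = fit.x
print(f"estimated s = {s_hat:.2f}, scale = {scale_hat:.2f} "
      f"(true: {true_s}, {true_scale})")
```

Because the fit uses the whole empirical CDF rather than a single bound of each interval, it is less sensitive to the choice of sampling time-interval.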
Directory of Open Access Journals (Sweden)
Daniel Ting
2010-04-01
Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation
Directory of Open Access Journals (Sweden)
Campos-Aranda Daniel Francisco
2015-09-01
Design floods are key to sizing new water works and to reviewing the hydrological security of existing ones. The most reliable method for estimating their magnitudes associated with certain return periods is to fit a probabilistic model to the available records of maximum annual flows. Since the appropriate model is initially unknown, several models need to be tested in order to select the most suitable one according to a statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records, and therefore their application has been established as a norm or precept. The Johnson system has three families of distributions, one of which is the Log-Normal model with three fit parameters, which is also the border between the bounded distributions and those with no upper limit. These families of distributions have four adjustment parameters and converge to the standard normal distribution, so that their predictions are obtained with such a model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to these data. The results of the unbounded distribution of the Johnson system (SJU) are compared to the optimal results from the three distributions. It was found that the predictions of the SJU distribution are similar to those obtained with the other models for low return periods (up to 1000 years). Because of its theoretical support, the SJU model is recommended in flood estimation.
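A hedged sketch of the approach: fit scipy's unbounded Johnson (SU) distribution to a record of annual maxima by maximum likelihood, then read off design floods as quantiles of the fitted model at non-exceedance probability 1 - 1/T for return period T. The synthetic record below stands in for a real gauging series.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic annual-maximum flows (m^3/s), standing in for a gauged record.
flows = stats.gumbel_r.rvs(loc=500, scale=150, size=60, random_state=rng)

# Fit the unbounded Johnson (SU) distribution by maximum likelihood.
a, b, loc, scale = stats.johnsonsu.fit(flows)

# Design floods: quantiles at non-exceedance probability 1 - 1/T.
qs = []
for T in (10, 100, 1000):
    q = stats.johnsonsu.ppf(1 - 1 / T, a, b, loc=loc, scale=scale)
    qs.append(q)
    print(f"T = {T:4d} yr -> design flood ~ {q:7.1f} m^3/s")
```

In practice one would compare the standard error of fit of the SU model against the other candidate distributions before adopting its predictions.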
Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan
2016-01-01
We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
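The maximum-entropy reasoning can be checked numerically: among nonnegative distributions constrained to a given mean, the exponential attains the largest differential entropy. The candidate set below is an illustrative selection, each member reparameterized to mean 1; it is a sketch of the principle, not the paper's derivation.

```python
import numpy as np
from scipy import stats

mean = 1.0  # fixed first moment, e.g. a mean particle velocity

# Nonnegative candidate laws, each reparameterized to have mean exactly 1.
candidates = {
    "exponential": stats.expon(scale=mean),
    "gamma(k=2)": stats.gamma(2.0, scale=mean / 2.0),
    "half-normal": stats.halfnorm(scale=mean * np.sqrt(np.pi / 2.0)),
    "uniform(0,2)": stats.uniform(loc=0.0, scale=2.0 * mean),
}
# The exponential should report the largest differential entropy.
for name, dist in candidates.items():
    print(f"{name:13s} mean = {float(dist.mean()):.3f} "
          f"entropy = {float(dist.entropy()):.4f}")
```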
Animating Statistics: A New Kind of Applet for Exploring Probability Distributions
Kahle, David
2014-01-01
In this article, I introduce a novel applet ("module") for exploring probability distributions, their samples, and various related statistical concepts. The module is primarily designed to be used by the instructor in the introductory course, but it can be used far beyond it as well. It is a free, cross-platform, stand-alone interactive…
Generalized Binomial Probability Distributions Attached to Landau Levels on the Riemann Sphere
Ghanmi, A.; Hafoud, A.; Mouayn, Z.
2011-01-01
A family of generalized binomial probability distributions attached to Landau levels on the Riemann sphere is introduced by constructing a kind of generalized coherent states. Their main statistical parameters are obtained explicitly. As an application, photon number statistics related to coherent states under consideration are discussed.
Maximizing a Probability: A Student Workshop on an Application of Continuous Distributions
Griffiths, Martin
2010-01-01
For many students meeting, say, the gamma distribution for the first time, it may well turn out to be a rather fruitless encounter unless they are immediately able to see an application of this probability model to some real-life situation. With this in mind, we pose here an appealing problem that can be used as the basis for a workshop activity…
Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes
DEFF Research Database (Denmark)
Albrecher, H.; Asmussen, Søren
claim size is investigated under these assumptions. For both light-tailed and heavy-tailed claim size distributions, asymptotic estimates for infinite-time and finite-time ruin probabilities are derived. Moreover, we discuss an extension of the model to an adaptive premium rule that is dynamically...
The distribution of FRAX(®)-based probabilities in women from Japan.
Kanis, John A; Johansson, Helena; Odén, Anders; McCloskey, Eugene V
2012-11-01
New assessment guidelines for osteoporosis in Japan include the use of the WHO risk assessment tool (FRAX) that computes the 10-year probability of fracture. The aim of this study was to determine the distribution of fracture probabilities and to assess the impact of probability-based intervention thresholds in women from Japan aged 50 years and older. Age-specific simulation cohorts were constructed from the prevalences of clinical risk factors and femoral neck bone mineral density to determine the distribution of fracture probabilities as assessed by FRAX. These data were used to estimate the number and proportion of women at or above a 10-year fracture probability of 5, 10, 15, 20, 25, and 30 %. In addition, case scenarios that applied a FRAX probability threshold of 15 % were compared with current guidance. In the absence of additional criteria for treatment, a 15 % fracture probability threshold would identify approximately 32 % of women over the age of 50 years (9.3 million women) as eligible for treatment. Because of expected changes in population demography, the 15 % fracture probability threshold would capture approximately 38 % of women over the age of 50 years (12.7 million women), mainly those aged 80 years or older. The introduction of a FRAX threshold of 15 % would permit treatment in women with clinical risk factors that would otherwise fall below previously established intervention thresholds. The incorporation of FRAX into assessment guidelines is likely to redirect treatments for osteoporosis from younger women at low risk to elderly women at high fracture risk.
Directory of Open Access Journals (Sweden)
A. B. Levina
2016-03-01
Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices; they allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors, but they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The error masking probability is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code when errors are injected into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes. Encoding functions with low computational complexity and a low masking probability offer the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, increasing the computational complexity decreases the difference between the maximum and average values of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking...
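For a linear code, the error masking probability can be computed exactly on small examples: an additive error pattern is masked precisely when it is itself a nonzero codeword, since the corrupted word then still passes the check. The [6,3] code below is a hypothetical toy example, not one of the paper's codes.

```python
from itertools import product

# A hypothetical [6,3] binary linear code, given by its generator matrix rows.
G = [(1, 0, 0, 1, 1, 0),
     (0, 1, 0, 1, 0, 1),
     (0, 0, 1, 0, 1, 1)]

def xor(a, b):
    return tuple(x ^ y for x, y in zip(a, b))

# Enumerate all codewords as XOR combinations of the generator rows.
codewords = set()
for bits in product((0, 1), repeat=len(G)):
    w = (0,) * len(G[0])
    for bit, row in zip(bits, G):
        if bit:
            w = xor(w, row)
    codewords.add(w)

# An additive error e is masked iff e is itself a nonzero codeword.
n = len(G[0])
masked = sum(1 for e in product((0, 1), repeat=n)
             if any(e) and e in codewords)
total_nonzero = 2 ** n - 1
print(f"masking probability over uniform nonzero errors: {masked}/{total_nonzero}")
```

For nonlinear (security-oriented) encoding functions the masked set depends on the transmitted codeword, which is what makes the masking probability a distribution rather than a single number.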
Bistatic-radar estimation of surface-slope probability distributions with applications to the moon.
Parker, M. N.; Tyler, G. L.
1973-01-01
A method for extracting surface-slope frequency distributions from bistatic-radar data has been developed and applied to the lunar surface. Telemetry transmissions from orbiting Apollo spacecraft were received on the earth after reflection from the lunar surface. The echo-frequency spectrum was related analytically to the probability distribution of lunar slopes. Standard regression techniques were used to solve the inverse problem of finding slope distributions from observed echo-frequency spectra. Data taken simultaneously at two wavelengths, 13 and 116 cm, have yielded diverse slope statistics.
Audio analysis of statistically instantaneous signals with mixed Gaussian probability distributions
Naik, Ganesh R.; Wang, Wenwu
2012-10-01
In this article, a novel method is proposed to measure the separation qualities of statistically instantaneous audio signals with mixed Gaussian probability distributions. This study evaluates the impact of the Probability Distribution Function (PDF) of the mixed signals on the outcomes of both sub- and super-Gaussian distributions. Different Gaussian measures are evaluated by using various spectral-distortion measures. It aims to compare the different audio mixtures from both super-Gaussian and sub-Gaussian perspectives. Extensive computer simulation confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures. The result based on the objective measures demonstrates the effectiveness of source separation in improving the quality of the separated audio sources.
Jones, Evan; Singal, Jack
2018-01-01
We present results of using individual galaxies' redshift probability information derived from a photometric redshift (photo-z) algorithm, SPIDERz, to identify potential catastrophic outliers in photometric redshift determinations. Using test data comprised of COSMOS multi-band photometry and known spectroscopic redshifts from the 3D-HST survey spanning a wide redshift range, we evaluate a method to flag potential catastrophic outliers in an analysis which relies on accurate photometric redshifts. SPIDERz is a custom support vector machine classification algorithm for photo-z analysis that naturally outputs a distribution of redshift probability information for each galaxy in addition to a discrete most probable photo-z value. By applying an analytic technique with flagging criteria to identify the presence of probability distribution features characteristic of catastrophic outlier photo-z estimates, such as multiple redshift probability peaks separated by substantial redshift distances, we can flag potential catastrophic outliers in photo-z determinations. We find that our proposed method can correctly flag large fractions of the outlier and catastrophic outlier galaxies, while only flagging a small fraction of the total non-outlier galaxies. We examine the performance of this strategy in photo-z determinations using a range of flagging parameter values. These results could potentially be useful for the utilization of photometric redshifts in future large-scale surveys, where catastrophic outliers are particularly detrimental to the science goals.
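A flagging criterion of the kind described, multiple significant probability peaks separated by a substantial redshift distance, can be sketched with a peak finder. The grid, peak-height threshold and separation cutoff below are illustrative assumptions, not SPIDERz's actual criteria.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical redshift probability distribution on a grid (illustrative,
# not actual SPIDERz output): bimodal, mimicking a catastrophic outlier.
z = np.linspace(0.0, 4.0, 401)
p = (0.6 * np.exp(-0.5 * ((z - 0.4) / 0.08) ** 2)
     + 0.4 * np.exp(-0.5 * ((z - 2.9) / 0.10) ** 2))
p /= p.sum()  # normalize on the grid

# Assumed flagging criterion: >= 2 peaks above 10% of the maximum,
# separated by more than Delta z = 1.
peaks, _ = find_peaks(p, height=0.1 * p.max())
flagged = len(peaks) >= 2 and (z[peaks].max() - z[peaks].min()) > 1.0
print(f"peaks at z = {np.round(z[peaks], 2)}, flagged = {flagged}")
```

Varying the height threshold and separation cutoff trades outlier completeness against the fraction of non-outliers flagged, mirroring the parameter scan described in the abstract.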
Gurbuz, Ramazan; Birgin, Osman
2012-01-01
The aim of this study is to determine the effects of computer-assisted teaching (CAT) on remedying misconceptions students often have regarding some probability concepts in mathematics. Toward this aim, computer-assisted teaching materials were developed and used in the process of teaching. Within the true-experimental research method, a pre- and…
Tolsma, J.; Need, A.; Jong, U. de
2010-01-01
In this article we examine whether subjective estimates of success probabilities explain the effect of social origin, sex, and ethnicity on students’ choices between different school tracks in Dutch higher education. The educational options analysed differ in level (i.e. university versus ...
Karim, Shahriar; Buzzard, Gregery T; Umulis, David M
2012-01-01
The Steady State (SS) probability distribution is an important quantity needed to characterize the steady state behavior of many stochastic biochemical networks. In this paper, we propose an efficient and accurate approach to calculating an approximate SS probability distribution from the solution of the Chemical Master Equation (CME), under the assumption that the system has a unique deterministic SS. To find the approximate solution to the CME, a truncated state-space representation is used to reduce the state space of the system to a finite dimension. The subsequent ill-posed eigenvalue problem of a linear system for the finite state space can be converted to a well-posed system of linear equations and solved. The proposed strategy yields efficient and accurate estimation of noise in stochastic biochemical systems. To demonstrate the approach, we applied the method to characterize the noise behavior of a set of biochemical networks of ligand-receptor interactions for Bone Morphogenetic Protein (BMP) signaling. We found that recruitment of type II receptors during receptor oligomerization does not by itself tend to lower noise in receptor signaling, but regulation by a secreted co-factor may provide a substantial improvement in signaling relative to noise. The steady state probability approximation method shortened the time necessary to calculate the probability distributions compared to earlier approaches, such as Gillespie's Stochastic Simulation Algorithm (SSA), while maintaining high accuracy.
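The core numerical idea, converting the singular steady-state condition of a truncated master equation into a well-posed linear system by appending the normalization constraint, can be sketched on a one-species birth-death process whose exact steady state is Poisson, giving a built-in check. This is a minimal sketch, not the paper's BMP receptor network.

```python
import numpy as np

# Truncated birth-death chain: production rate k, degradation rate g*n,
# states n = 0..N. A minimal stand-in for a CME steady-state problem.
k, g, N = 10.0, 1.0, 40
Q = np.zeros((N + 1, N + 1))  # CTMC generator on the truncated state space
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = k          # birth n -> n+1
    if n > 0:
        Q[n, n - 1] = g * n      # death n -> n-1
    Q[n, n] = -Q[n].sum()

# Replace the singular system Q^T pi = 0 with a well-posed one by appending
# the normalization constraint sum(pi) = 1, then solve in least squares.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# The exact steady state is Poisson(k/g); compare the means as a check.
print(f"mean = {pi @ np.arange(N + 1):.3f} (Poisson mean k/g = {k / g})")
```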
Wang, S Q; Zhang, H Y; Li, Z L
2016-10-01
Understanding the spatio-temporal distribution of a pest in orchards can provide important information that can be used to design monitoring schemes and establish better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated by using probability kriging. Adults of B. minax were captured over two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of the adult occurrence, and the adult population aggregated with high probability within a less-than-100-m-wide band on both sides of the boundary between the orchard and the woods. The sequential probability kriged maps showed that the adults were estimated in the marginal zone with higher probability, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement and the distance to the forest edge of traps or spraying spots should be considered to enhance control of B. minax in small-scale orchards.
Experimental probability density distributions for optical and infrared cross-beam units
Huff, L. L.; Sandborn, V. A.
1974-01-01
A series of cross-beam experiments for measuring meteorological parameters was conducted. Both optical and infrared sensors were tested for comparison. The test configuration and experiment identification are provided. A data report is presented for an approximate analysis of the probability density distributions of the signals from both the optical and the infrared units. Two runs representing each of the two types of cross-beam units were selected for analysis. Probability density functions are also presented for the instantaneous product of the signals for each of the runs.
Saitoh, K.; Magnanimo, Vanessa; Luding, Stefan
2016-01-01
Employing two-dimensional molecular dynamics (MD) simulations of soft particles, we study their non-affine responses to quasi-static isotropic compression where the effects of microscopic friction between the particles in contact and particle size distributions are examined. To quantify complicated ...
Probability distribution of surface wind speed induced by convective adjustment on Venus
Yamamoto, Masaru
2017-03-01
The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized Weather Research and Forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s-1 and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution with a peak at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(> V) (= exp[-(V/c)k]) with best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 107 J m-2, which is calculated from the difference in total potential energy between initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in the experiments with the initial unstable-layer thickness altered. The present work suggests that convective adjustment is a promising process for producing the observed wind structure, occasionally generating surface winds of ∼1 m s-1 and retrograde wind patches.
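The Weibull exceedance law quoted in the abstract, P(>V) = exp[-(V/c)^k], is straightforward to evaluate. The sketch below uses hypothetical scale and shape values chosen for illustration, not the best-estimate coefficients of Lorenz (2016).

```python
import math

def weibull_exceedance(v, c, k):
    """P(V > v) for a Weibull wind-speed distribution with scale c and shape k."""
    return math.exp(-(v / c) ** k)

# Hypothetical parameters, chosen only to illustrate the formula
c, k = 0.5, 2.0
p_at_scale = weibull_exceedance(c, c, k)  # at v = c the exceedance is exp(-1)
```

For any Weibull distribution, the exceedance probability at the scale parameter itself is exp(-1), which gives a quick sanity check on fitted coefficients.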
Crovelli, R.A.; Balay, R.H.
1991-01-01
A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in Turbo Pascal 4.0 to perform the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional.
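The closed-form moments of the triangular distribution are what make the analytic aggregation fast. The sketch below computes component moments and combines them under complete independence or perfect positive correlation; the linear interpolation in variance for intermediate dependence is an assumption for illustration, not necessarily the TRIAGG convention.

```python
import math

def triangular_moments(lo, mode, hi):
    """Mean and standard deviation of a triangular distribution (min, mode, max)."""
    mean = (lo + mode + hi) / 3.0
    var = (lo * lo + mode * mode + hi * hi - lo * mode - lo * hi - mode * hi) / 18.0
    return mean, math.sqrt(var)

def aggregate(components, rho=0.0):
    """Aggregate component (min, mode, max) assessments. rho = 0 is complete
    independence (variances add); rho = 1 is perfect positive correlation
    (standard deviations add). Intermediate rho interpolates the variance,
    which is an assumption here, not necessarily the TRIAGG rule."""
    means, sds = zip(*(triangular_moments(*c) for c in components))
    var = (1.0 - rho) * sum(s * s for s in sds) + rho * sum(sds) ** 2
    return sum(means), math.sqrt(var)
```

Note that perfect correlation always yields the larger aggregate standard deviation, since the sum of standard deviations dominates the root of the sum of variances.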
Probability distribution of residence times of grains in models of rice piles.
Pradhan, Punyabrata; Dhar, Deepak
2006-02-01
We study the probability distribution of the residence time of a grain at a site, and of its total residence time inside a pile, in different rice pile models. The tails of these distributions are dominated by the grains that get deeply buried in the pile. We show that, for a pile of size L, the probabilities that the residence time at a site or the total residence time is greater than t both decay as 1/[t(ln t)^x] for L^ω ≪ t ≪ exp(L^γ), where γ is a model-dependent constant, and the values of x and ω differ in the two cases. In the Oslo rice pile model we find that the probability of the residence time T_i at a site i being greater than or equal to t is a nonmonotonic function of L for fixed t and does not obey simple scaling. For a model in d dimensions, we show that the probability of the minimum slope configuration in the steady state, for large L, varies as exp(-κL^(d+2)), where κ is a constant, and hence γ = d+2.
Hanayama, Nobutane; Sibuya, Masaaki
2016-08-01
In modern biology, theories of aging fall mainly into two groups: damage theories and programmed theories. If programmed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, no such age exists, and any longevity record will eventually be broken. In this article, to examine which is the case, a special type of binomial model based on the generalized Pareto distribution is applied to data on Japanese centenarians. From the results, it is concluded that the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years.
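A finite upper limit of the lifetime distribution corresponds to a generalized Pareto tail with a negative shape parameter, whose endpoint is threshold - scale/shape. The numbers below are hypothetical, chosen only so the endpoint lands near the article's 123-year estimate.

```python
def gpd_upper_endpoint(threshold, scale, shape):
    """Finite upper endpoint of a generalized Pareto distribution; it exists
    only when the shape parameter is negative (bounded tail)."""
    if shape >= 0:
        raise ValueError("upper endpoint is finite only for shape < 0")
    return threshold - scale / shape

# Hypothetical values: threshold at age 100, scale 5.8, shape -0.25
limit = gpd_upper_endpoint(100.0, 5.8, -0.25)  # 123.2 years
```

A nonnegative fitted shape would instead support the damage theories, since the tail would then be unbounded.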
Kupchishin, A. I.; Kupchishin, A. A.; Shmygalev, E. V.; Shmygaleva, T. A.; Tlebaev, K. B.
2014-11-01
In this article we calculated the depth distributions of implanted arsenic and indium ions, energy losses, and cascade-probability functions in silicon. Comparison of the calculations with the experimental data shows satisfactory agreement. Computer simulation and analysis of the ion characteristics as functions of penetration depth and number of interactions were carried out.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Directory of Open Access Journals (Sweden)
Julio Michael Stern
2011-04-01
This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
Level crossing statistics of atmospheric wind speeds: Probability distribution of episode lengths
Edwards, Paul J.
2000-03-01
The probability distribution of the duration of episodes (``wind runs'') during which the horizontal wind speed in the planetary surface boundary layer remains above or below a given threshold value is of interest in the fields of renewable energy generation and pollutant dispersal. There still appear to be no analytic or conceptual models to explain the remarkable constancy of the power law form of the wind run distribution measured at a variety of sites on the earth's surface for run lengths ranging from a few minutes to a day or more.
Directory of Open Access Journals (Sweden)
Shulin Lyu
2018-01-01
The σ function, namely the derivative of the log of the smallest eigenvalue distributions of the finite-n LUE or JUE, satisfies the Jimbo–Miwa–Okamoto σ form of PV and PVI, although in the shifted Jacobi case, with the weight x^α(1−x)^β, the β parameter does not show up in the equation. We also obtain the asymptotic expansions for the smallest eigenvalue distributions of the Laguerre unitary and Jacobi unitary ensembles after appropriate double scalings, and obtain the constants in the asymptotic expansions of the gap probabilities, expressed in terms of the Barnes G-function evaluated at special points.
Sorriso-Valvo, Luca; Carbone, Vincenzo; Veltri, Pierluigi; Consolini, Giuseppe; Bruno, Roberto
Intermittency in fluid turbulence can be emphasized through the analysis of probability distribution functions (PDFs) for velocity fluctuations, which display a strong non-Gaussian behavior at small scales. Castaing et al. (1990) introduced the idea that this behavior can be represented, in the framework of a multiplicative cascade model, by a convolution of Gaussians whose variances are distributed according to a log-normal distribution. In this letter we test this conjecture on MHD solar wind turbulence by fitting the model to the PDFs of the bulk speed and magnetic field intensity fluctuations calculated in the solar wind. This fit allows us to calculate a scale-dependent parameter λ², which represents the width of the log-normal distribution of the variances of the Gaussians. The physical implications of the obtained values of the parameter, as well as of its scaling law, are finally discussed.
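The Castaing et al. (1990) construction is a Gaussian smeared over a log-normal distribution of standard deviations. A minimal numeric sketch of that convolution, using trapezoidal integration and illustrative parameters:

```python
import math

def castaing_pdf(dv, lam, sigma0=1.0, n=400):
    """PDF of fluctuations modeled as Gaussians of standard deviation sigma,
    convolved over a log-normal distribution of sigma with width lam
    (Castaing et al. 1990). Trapezoidal integration over ln(sigma)."""
    mu = math.log(sigma0)
    lo, hi = mu - 5.0 * lam, mu + 5.0 * lam
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        s = lo + i * h
        sigma = math.exp(s)
        # log-normal weight for this sigma, and the Gaussian it contributes
        lognormal = math.exp(-(s - mu) ** 2 / (2.0 * lam * lam)) / (lam * math.sqrt(2.0 * math.pi))
        gauss = math.exp(-dv * dv / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * lognormal * gauss * h
    return total
```

As lam tends to zero the convolution collapses to a single Gaussian, while growing lam fattens the tails, which is how the model captures increasing intermittency at small scales.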
Probability distribution functions for unit hydrographs with optimization using genetic algorithm
Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh
2017-05-01
A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely the two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson, and two-parameter Weibull distributions, is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying a genetic algorithm. The results are compared with those obtained by the traditional linear least squares method and show comparable capability and performance of the two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability to predict both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
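Transmuting a UH into a pdf amounts to least-squares fitting of a parametric density to the hydrograph ordinates. The sketch below fits a two-parameter gamma pdf by a crude grid search, a stand-in for the paper's nonlinear optimization or genetic algorithm; the synthetic data are illustrative.

```python
import math

def gamma_pdf(t, k, theta):
    """Two-parameter gamma pdf, used here as a unit hydrograph shape."""
    if t <= 0:
        return 0.0
    return t ** (k - 1) * math.exp(-t / theta) / (math.gamma(k) * theta ** k)

def fit_gamma_uh(times, ordinates, k_grid, theta_grid):
    """Least-squares fit of a gamma pdf to UH ordinates by grid search,
    a crude stand-in for nonlinear optimization or a genetic algorithm."""
    best = None
    for k in k_grid:
        for theta in theta_grid:
            sse = sum((gamma_pdf(t, k, theta) - q) ** 2
                      for t, q in zip(times, ordinates))
            if best is None or sse < best[0]:
                best = (sse, k, theta)
    return best[1], best[2]

# Synthetic noise-free UH generated from a gamma pdf with k = 3, theta = 2
ts = [0.5 * i for i in range(1, 41)]
qs = [gamma_pdf(t, 3.0, 2.0) for t in ts]
k_hat, theta_hat = fit_gamma_uh(ts, qs, [2.0, 2.5, 3.0, 3.5], [1.5, 2.0, 2.5])
```

On noise-free synthetic data the search recovers the generating parameters exactly; with observed ordinates, the residual sum of squares is what the paper's two nonlinear optimizers minimize.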
Directory of Open Access Journals (Sweden)
Sang-Yeop Chung
2015-01-01
Insulating concrete is a multiphase material designed for reduced thermal conductivity, and the void distribution in concrete strongly affects its physical properties, such as mechanical response and heat conduction. Therefore, it is essential to develop a method for identifying the spatial distribution of voids. To examine the voids of insulating concrete specimens, micro-CT (computed tomography) images can be used effectively. The micro-CT images are binarized to visualize the void distribution and stacked to generate 3D specimen images. From the obtained images, the spatial distribution of the voids and the microscopic constituents inside the insulating concrete specimens can be identified. The void distribution in the material can be characterized using low-order probability functions such as the two-point correlation, lineal-path, and two-point cluster functions. It is confirmed that micro-CT images and low-order probability functions are effective in describing the relative degree of void clustering and void connectivity in insulating concrete.
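The two-point correlation function S2(r) is the probability that two pixels a distance r apart both lie in the void phase. A minimal sketch for horizontal separations on a binarized image; the toy image is illustrative:

```python
def two_point_correlation(img, r):
    """Two-point probability S2(r): the chance that two pixels a horizontal
    distance r apart both lie in the void phase (value 1) of a binary image."""
    hits = total = 0
    for row in img:
        for x in range(len(row) - r):
            total += 1
            if row[x] == 1 and row[x + r] == 1:
                hits += 1
    return hits / total

# Toy binarized micro-CT slice (1 = void, 0 = solid), purely illustrative
img = [
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
]
porosity = two_point_correlation(img, 0)  # S2(0) equals the void volume fraction
```

S2(0) recovers the void volume fraction, and the rate at which S2(r) decays toward the squared fraction reflects the degree of void clustering.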
Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen
2010-04-01
The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed.
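For a birth-death chain truncated at a finite ceiling, the LCD (quasi-stationary distribution) can be approximated numerically by repeatedly applying a small-time-step transition kernel, discarding absorbed mass, and renormalizing. This is an illustrative sketch with hypothetical rate functions, not one of the paper's approximating processes.

```python
def lcd_birth_death(birth, death, n_states, dt=0.001, tol=1e-10):
    """Approximate the limiting conditional distribution of a birth-death
    process on {1, ..., n_states} with absorption at 0: apply a small-step
    transition kernel, drop absorbed mass, renormalize, and iterate."""
    p = [1.0 / n_states] * n_states          # uniform start over sizes 1..n_states
    while True:
        q = [0.0] * n_states
        for i in range(n_states):
            s = i + 1                        # population size at index i
            q[i] += p[i] * (1.0 - (birth(s) + death(s)) * dt)
            if i + 1 < n_states:
                q[i + 1] += p[i] * birth(s) * dt
            if i > 0:                        # death from size 1 is absorbed
                q[i - 1] += p[i] * death(s) * dt
        z = sum(q)                           # renormalize the surviving mass
        q = [x / z for x in q]
        if max(abs(a - b) for a, b in zip(p, q)) < tol:
            return q
        p = q

# Hypothetical subcritical rates: constant birth, linear death
qsd = lcd_birth_death(lambda s: 0.2, lambda s: float(s), 6)
```

For death-dominated rates the conditioned mass concentrates near the smallest population sizes, consistent with the intuition that long-surviving trajectories hover just above the absorbing state.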
1980-03-01
Subjective probability estimates have been incorporated routinely into tactical intelligence communications. Research in the area of intelligence...analysis: Report on Phase I. Report FSC-71-5047. Gaithersburg, Md.: International Business Machines (IBM), Federal Systems Division, 1971. Kelly, C. W
Stefanescu, E. R.; Patra, A.; Sheridan, M. F.; Cordoba, G.
2012-04-01
event. Logistic regression - Here, we define A as a discrete r.v., while B is a continuous one. P(B) represents the probability of a flow ≥ h_critical at location B, while P(A) represents the probability of flow or non-flow at A. Bayes analysis - At this stage of the analysis we consider only the r.v. A, where P(A) represents the probability of a flow ≥ h_critical at location A. We are interested in observing how the probability of a flow ≥ h_critical at location A changes when data from the model are taken into consideration. We assume a Beta prior distribution for P(A) and compute P(A|data) using a maximum likelihood estimation (MLE) approach. Bayesian network for causal relationships - Here, we are interested in more than two critical locations, and we use a directed acyclic graph to incorporate the causal relationships between all the chosen locations. Marginal probabilities, along with the joint probability of an event, are computed based on the "causal links" between variables.
Anglewicz, Philip; Kohler, Hans-Peter
2009-01-01
In the absence of HIV testing, how do rural Malawians assess their HIV status? In this paper, we use a unique dataset that includes respondents' HIV status as well as their subjective likelihood of HIV infection. These data show that many rural Malawians overestimate their likelihood of current HIV infection. The discrepancy between actual and perceived status raises an important question: Why are so many wrong? We begin by identifying determinants of self-assessed HIV status, and then compar...
Goodness of fit of probability distributions for sightings as species approach extinction.
Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael
2009-04-01
Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
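The PPCC test for the uniform model reduces to the Pearson correlation between the ordered sighting times and the uniform plotting positions i/(n+1). A minimal sketch with illustrative sighting years:

```python
def ppcc_uniform(sightings):
    """Probability plot correlation coefficient for the uniform model: the
    Pearson correlation between ordered sighting times and the uniform
    plotting positions i / (n + 1)."""
    x = sorted(sightings)
    n = len(x)
    m = [i / (n + 1) for i in range(1, n + 1)]  # uniform quantiles on (0, 1)
    mx, mm = sum(x) / n, sum(m) / n
    cov = sum((a - mx) * (b - mm) for a, b in zip(x, m))
    vx = sum((a - mx) ** 2 for a in x)
    vm = sum((b - mm) ** 2 for b in m)
    return cov / (vx * vm) ** 0.5

# Evenly spaced sighting years are perfectly consistent with the uniform model
r_even = ppcc_uniform([1901, 1912, 1923, 1934, 1945])
```

A PPCC near 1 indicates the quantile plot is nearly linear and the model fits; the hypothesis test compares the observed PPCC to its null distribution under the candidate model.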
Cerniglia, M. C.; Douglass, A. R.; Rood, R. B.; Sparling, L. C.; Nielsen, J. E.
1999-01-01
We present a study of the distribution of ozone in the lowermost stratosphere with the goal of understanding the relative contribution to the observations of air of either distinctly tropospheric or stratospheric origin. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High [low] potential vorticity at 300 hPa suggests that the tropopause is low [high], and the identification of the two groups helps to account for dynamic variability. Conditional probability distribution functions are used to define the statistics of the mix from both observations and model simulations. Two data sources are chosen. First, several years of ozonesonde observations are used to exploit the high vertical resolution. Second, observations made by the Halogen Occultation Experiment (HALOE) on the Upper Atmosphere Research Satellite (UARS) are used to understand the impact on the results of the spatial limitations of the ozonesonde network. The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause (approximately 380 K). Despite the differences in spatial and temporal sampling, the probability distribution functions are similar for the two data sources. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. By using the model, possible mechanisms for the maintenance of the mix of air in the lowermost stratosphere are revealed. The relevance of the results to the assessment of the environmental impact of aircraft effluence is discussed.
Visualizing 2D Probability Distributions from Satellite Image-Derived Data
Kao, David; Dungan, Jennifer; Pang, Alex; Biegel, Bryan (Technical Monitor)
2002-01-01
Creating maps of biophysical and geophysical variables using Earth Observing System (EOS) satellite image data is an important component of Earth science. These 2D maps have a single value at every location and standard techniques are used to visualize them. Current tools fall short, however, when it is necessary to describe a distribution of values at each location. Distributions may represent a frequency of occurrence over time, frequency of occurrence from multiple runs of an ensemble forecast or possible values from an uncertainty model. 'Distribution data sets' are described, then a case study is presented to visualize such 2D distributions. Distribution data sets are different from multivariate data sets in the sense that the values are for a single variable instead of multiple variables. Our case study data consists of multiple realizations of percent forest cover, generated using a geostatistical technique that combines ground measurements and satellite imagery to model uncertainty about forest cover. We present several approaches for analyzing and visualizing such data sets. The first is a pixel-wise analysis of the probability density functions for the 2D image while the second is an analysis of features identified within the image. Such pixel-wise and feature-wise views will give Earth scientists a more complete understanding of distribution data sets.
For experimental determination of parameters of the law of time probability distribution of correct operation, the demarcation of failures by causes...shows practically no effect on reliability. Parameters of the law of probability distribution, determined by numerical values of dispersion and
Guo, L. M.; Zhu, H. B.; Zhang, N. X.
The probability density distribution of the traffic density is analyzed based on empirical data. It is found that the beta distribution fits the measured traffic density remarkably well. A modified traffic model is then proposed to simulate microscopic traffic flow, in which the probability density distribution of the traffic density is taken into account. The model also captures drivers' speed adaptation by accounting for differences in driving behavior and the dynamic headway. Along with flux-density diagrams, velocity evolution diagrams and spatial-temporal profiles of vehicles are also given. The synchronized flow phase and the wide moving jam phase are reproduced, which is a challenge for cellular automata traffic models. Furthermore, the phenomenon of high-speed car-following is exhibited, which has previously been observed in measured data. These results demonstrate the effectiveness of the proposed model in detecting the complicated dynamic phenomena of traffic flow.
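Fitting a beta distribution to traffic densities normalized into (0, 1) can be done quickly by the method of moments: with sample mean m and variance v, the shape parameters are a = m(m(1-m)/v - 1) and b = (1-m)(m(1-m)/v - 1). The estimator choice is an assumption for illustration; the abstract does not specify the fitting method.

```python
def beta_method_of_moments(samples):
    """Method-of-moments estimates of the beta shape parameters (a, b) from
    traffic densities normalized into (0, 1). Population variance is used."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common
```

For samples whose mean and variance match those of Beta(2, 2), the estimator recovers a = b = 2.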
A probability distribution of shape for the dental maxillary arch using digital images.
Rijal, Omar M; Abdullah, Norli A; Isa, Zakiah M; Noor, Norliza M; Tawfiq, Omar F
2012-01-01
Selected landmarks from each of 47 maxillary dental casts were used to define a Cartesian coordinate system from which the positions of selected teeth were determined on standardized digital images. The position of the i-th tooth was defined by a line of length l(i) joining the tooth to the origin, and the angle θ(i) of this line to the horizontal Cartesian axis. Four teeth, the central incisor, lateral incisor, canine, and first molar, were selected, and their positions were collectively used to represent the shape of the dental arch. A pilot study using clustering and principal component analysis strongly suggested the existence of 3 groups of arch shape. In this study, the homogeneity of the 3 groups was further investigated and confirmed by the Dunn and Davies-Bouldin validity indices. This was followed by an investigation of the probability distribution of these 3 groups. The main result of this study suggests 3 groups of multivariate (MV) normal distribution. The MV normal probability distribution of these groups may be used in further studies to investigate variation in arch shape, which is fundamental to the practice of prosthodontics and orthodontics.
Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z
2015-08-01
Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty.
Hubig, Michael; Muggenthaler, Holger; Mall, Gita
2014-05-01
Bayesian estimation applied to temperature-based death time estimation was recently introduced as the conditional probability distribution, or CPD, method by Biermann and Potente. The CPD method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g. no-alibi intervals of suspects) within the large true death time interval. In light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of the input error. Moreover, we observe a paradox: in cases in which the small no-alibi time interval lies at the boundary of the true death time interval, adjacent to the erroneous death time estimate, CPD-computed probabilities for that small no-alibi interval will increase with increasing input deviation; otherwise, the CPD-computed probabilities will decrease. We therefore advise against using the CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95%-confidence intervals of the estimates still overlap the true death time interval.
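The CPD idea can be sketched as a truncated-distribution computation: given a death time estimate modeled as N(mu, sigma) and the external bounds (last seen alive, found dead), the probability of a small no-alibi interval is the normal mass on that interval renormalized by the mass inside the bounds. This is an illustrative sketch, not the exact Biermann-Potente formulation, and the numbers are hypothetical.

```python
import math

def norm_cdf(x, mu, sigma):
    """Normal CDF evaluated at x for mean mu and standard deviation sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def cpd_interval(a, b, mu, sigma, t_last_seen, t_found):
    """Probability that death fell in [a, b], conditional on the death time
    estimate N(mu, sigma) and the external bounds (last seen alive, found
    dead). All times in hours; the bounds must bracket [a, b]."""
    bound_mass = norm_cdf(t_found, mu, sigma) - norm_cdf(t_last_seen, mu, sigma)
    return (norm_cdf(b, mu, sigma) - norm_cdf(a, mu, sigma)) / bound_mass

# Illustrative numbers: estimate centered in a 10-hour true death time interval
p_whole = cpd_interval(0.0, 10.0, 5.0, 2.0, 0.0, 10.0)  # the conditioning interval itself
p_half = cpd_interval(0.0, 5.0, 5.0, 2.0, 0.0, 10.0)    # symmetric half
```

Shifting mu in this sketch shows the paradox qualitatively: moving the estimate toward one boundary inflates the conditional probability of a no-alibi interval adjacent to that boundary.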
Kurugol, Sila; Freiman, Moti; Afacan, Onur; Perez-Rossello, Jeannette M; Callahan, Michael J; Warfield, Simon K
2016-08-01
Quantitative diffusion-weighted MR imaging (DW-MRI) of the body enables characterization of the tissue microenvironment by measuring variations in the mobility of water molecules. The diffusion signal decay model parameters are increasingly used to evaluate various diseases of abdominal organs such as the liver and spleen. However, previous signal decay models (i.e., the mono-exponential, bi-exponential intra-voxel incoherent motion (IVIM), and stretched exponential models) only provide insight into the average of the distribution of the signal decay rather than explicitly describing the entire range of diffusion scales. In this work, we propose a probability distribution model of incoherent motion that uses a mixture of Gamma distributions to fully characterize the multi-scale nature of diffusion within a voxel. Further, we improve the robustness of the distribution parameter estimates by integrating a spatial homogeneity prior into the probability distribution model of incoherent motion (SPIM) and by using the fusion bootstrap solver (FBM) to estimate the model parameters. We evaluated the improvement in quantitative DW-MRI analysis achieved with the SPIM model in terms of accuracy, precision, and reproducibility of parameter estimation in both simulated data and 68 abdominal in-vivo DW-MRIs. Our results show that the SPIM model not only substantially reduced parameter estimation errors (by up to 26%) but also significantly improved the robustness of the parameter estimates (paired Student's t-test, p < 0.0001), reducing the coefficient of variation (CV) of estimated parameters compared to previous models. In addition, the SPIM model improves the reproducibility of the parameter estimates for both intra-session (up to 47%) and inter-session (up to 30%) comparisons relative to previous models. Thus, the SPIM model has the potential to improve the accuracy, precision, and robustness of quantitative abdominal DW-MRI analysis for clinical applications.
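A Gamma distribution of diffusivities has a closed-form Laplace transform, so a Gamma-mixture model gives a signal decay S(b)/S0 = sum of w_i * (1 + b*theta_i)^(-k_i). The sketch below evaluates that decay; the weights, shapes, and scales are hypothetical, and the SPIM model's actual parameterization and spatial prior are not reproduced here.

```python
def gamma_mixture_signal(b, weights, shapes, scales):
    """DW-MRI signal decay S(b)/S0 for a mixture of Gamma distributions of
    diffusivities: each Gamma(k, theta) component contributes its Laplace
    transform (1 + b * theta) ** (-k). All parameter values are illustrative."""
    return sum(w * (1.0 + b * theta) ** (-k)
               for w, k, theta in zip(weights, shapes, scales))

# Hypothetical two-component model: slow and fast diffusion pools
s800 = gamma_mixture_signal(800.0, [0.7, 0.3], [5.0, 2.0], [3e-4, 5e-3])
```

At b = 0 the decay is 1 whenever the weights sum to 1, and the signal decreases monotonically with b, matching the qualitative behavior of the mono- and bi-exponential models it generalizes.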
Lee, Hansu; Choi, Youngeun
2014-05-01
This study determined the optimal statistical probability distributions for estimating maximum probability precipitation in the Republic of Korea and examined whether there were any distinct changes in distribution types and extreme precipitation characteristics. The generalized Pareto distribution and the three-parameter Burr distribution were the most frequently selected distributions for annual maximum series in the Republic of Korea. On a seasonal basis, the most frequently selected distributions were the three-parameter Dagum distribution for spring, the three-parameter Burr distribution for summer, the generalized Pareto distribution for autumn, and the three-parameter log-logistic distribution, generalized Pareto distribution, and log-Pearson type III distribution for winter. Maximum probability precipitation was derived from the selected optimal probability distributions and compared with that from the Ministry of Land, Transport and Maritime Affairs (MOLTMA). Maximum probability precipitation in this study was greater than that of MOLTMA as the duration time and return period increased. This difference was statistically significant when applying the Wilcoxon signed-rank test. Because of the different distributions, the longer the return period, the greater the estimated maximum probability precipitation. Annual maximum series from 1973 to 2012 showed that the median was highest in the south coastal region, but as the duration time increased, Seoul, Gyeonggido, and Gangwondo, located in the central part of Korea, had higher median values. The months of annual maximum series occurrence were concentrated between June and September. Typhoons affected the occurrence of annual maximum series in September. Seasonal maximum probability precipitation was greater in most of the south coastal region, and Seoul, Gyeonggido, and Gangwondo had greater maximum probability precipitation in summer. Gangwondo had greater maximum probability precipitation in autumn while Ulleung and Daegwallyeong had a greater one in
Discrete coherent states and probability distributions in finite-dimensional spaces
Energy Technology Data Exchange (ETDEWEB)
Galetti, D.; Marchiolli, M.A.
1995-06-01
Operator bases are discussed in connection with the construction of phase space representatives of operators in finite-dimensional spaces, and their properties are presented. It is also shown how these operator bases allow for the construction of a finite harmonic oscillator-like coherent state. Creation and annihilation operators for the finite-dimensional Fock space are discussed and their expressions in terms of the operator bases are explicitly written. The relevant finite-dimensional probability distributions are obtained, and their limiting behavior for an infinite-dimensional space is calculated and agrees with the well-known results. (author). 20 refs, 2 figs.
On the Meta Distribution of Coverage Probability in Uplink Cellular Networks
Elsawy, Hesham
2017-04-07
This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effect of the users' activity factor (p) and the path-loss compensation factor on uplink performance is analyzed. The results show that decreasing p and/or increasing the path-loss compensation factor reduces the CP variation around the spatially averaged value.
Probability distribution functions for intermittent scrape-off layer plasma fluctuations
Theodorsen, A.; Garcia, O. E.
2018-03-01
A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal, which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function, not the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas
2004-08-01
The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed by using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations on an annual and seasonal basis and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it is not accurate enough to represent some wind regimes, as in the case of La Ventosa, Mexico. The analysis of wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF. Otherwise, the capacity factor will be underestimated. (author)
Fitting: Subroutine to fit four-moment probability distributions to data
Energy Technology Data Exchange (ETDEWEB)
Winterstein, S.R.; Lange, C.H.; Kumar, S. [Stanford Univ., CA (United States)
1995-01-01
FITTING is a Fortran subroutine that constructs a smooth, generalized four-parameter probability distribution model. It is fit to the first four statistical moments of the random variable X (i.e., average values of X, X{sup 2}, X{sup 3}, and X{sup 4}) which can be calculated from data using the associated subroutine CALMOM. The generalized model is produced from a cubic distortion of the parent model, calibrated to match the first four moments of the data. This four-moment matching is intended to provide models that are more faithful to the data in the upper tail of the distribution. Examples are shown for two specific cases.
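The moment inputs that the CALMOM companion routine supplies can be sketched in a few lines; the function names and the conversion to standardized moments below are illustrative stand-ins, not the actual Fortran interface:

```python
import numpy as np

def calmom(x):
    """Compute the first four raw statistical moments of a sample,
    i.e. the averages of x, x**2, x**3 and x**4."""
    x = np.asarray(x, dtype=float)
    return [float(np.mean(x**k)) for k in (1, 2, 3, 4)]

def standardized_moments(x):
    """Convert raw moments to mean, standard deviation, skewness and
    kurtosis, the usual inputs to a four-moment distribution model."""
    m1, m2, m3, m4 = calmom(x)
    var = m2 - m1**2
    std = np.sqrt(var)
    skew = (m3 - 3 * m1 * m2 + 2 * m1**3) / std**3
    kurt = (m4 - 4 * m1 * m3 + 6 * m1**2 * m2 - 3 * m1**4) / var**2
    return m1, std, skew, kurt
```

A Gaussian parent has skewness 0 and kurtosis 3; the cubic distortion in FITTING is calibrated so the model reproduces the sample's departures from those values.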
Directory of Open Access Journals (Sweden)
Limin Wang
2015-06-01
Full Text Available As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions proposed and widely used for Bayesian networks are not appropriate for a Bayesian classifier, in which the class variable C is treated as a distinguished one. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of the joint probability distribution. By establishing the mapping relationship between conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically, rather than rigidly. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.
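The mutual-information building block behind a scoring function such as Sum_MI can be sketched as follows; this computes only the basic I(X; C) term from a joint count table and is not the paper's full criterion:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X; C) in nats from a joint count table
    (rows: attribute values, columns: class labels)."""
    p = joint / joint.sum()                     # joint probabilities
    px = p.sum(axis=1, keepdims=True)           # marginal over attribute
    pc = p.sum(axis=0, keepdims=True)           # marginal over class
    nz = p > 0                                  # skip zero cells in the log
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ pc)[nz])))
```

Independent attribute and class give I = 0; a deterministic relationship between two binary variables gives I = ln 2.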
Das, Jayajit; Mukherjee, Sayak; Hodge, Susan E
2015-07-01
A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
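The forward problem mentioned above (m = 2, n = 1) is easy to verify numerically; a minimal Monte Carlo check that X = Y1 + Y2 follows the triangular density:

```python
import numpy as np

rng = np.random.default_rng(0)

# Y1, Y2 independent Uniform[0, 1]; X = Y1 + Y2 has the triangular density
# f(x) = x on [0, 1] and f(x) = 2 - x on [1, 2].
y = rng.uniform(size=(2, 200_000))
x = y.sum(axis=0)

# Compare an empirical density histogram against the exact triangular PDF
hist, edges = np.histogram(x, bins=40, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
triangular = np.where(centers <= 1.0, centers, 2.0 - centers)
max_err = float(np.max(np.abs(hist - triangular)))
```

The inverse direction, recovering the joint distribution of (Y1, Y2) from the distribution of X alone, is the non-unique n > m situation the paper's MaxEnt approach addresses.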
Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.
Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon
2016-01-01
Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for estimating the significance of joint spike events seem to be inadequate.
Sasaki, Tomohiko; Kondo, Osamu
2016-03-01
In paleodemography, the Bayesian approach has been suggested to provide an effective means by which mortality profiles of past populations can be adequately estimated, and thus avoid problems of "age-mimicry" inherent in conventional approaches. In this study, we propose an application of the Gompertz model using an "informative" prior probability distribution by revising a recent example of the Bayesian approach based on an "uninformative" distribution. Life-table data of 134 human populations, including those of contemporary hunter-gatherers, were used to determine the Gompertz parameters of each population. In each population, we used both raw life-table data and the Gompertz parameters to calculate demographic values such as the mean life-span, to confirm the representativeness of the model. Then, the correlation between the two Gompertz parameters (the Strehler-Mildvan correlation) was re-established. We incorporated the correlation into the Bayesian approach as an "informative" prior probability distribution, and tested its effectiveness using simulated data. Our analyses showed that the mean life-span (≥ age 15) and the proportion of living persons aged over 45 were well-reproduced by the Gompertz model. The simulation showed that using the correlation as an informative prior provides a narrower estimation range in the Bayesian approach than does the uninformative prior. The Gompertz model can be assumed to accurately estimate the mean life-span and/or the proportion of old people in a population. We suggest that the Strehler-Mildvan correlation can be used as a useful constraint in demographic reconstructions of past human populations. © 2015 Wiley Periodicals, Inc.
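A minimal sketch of the Gompertz survival model used in such analyses; the parameter values below are illustrative, not estimates from the 134 populations:

```python
import numpy as np

def gompertz_survival(x, a, b):
    """Survival to x years past age 15 under the Gompertz hazard
    h(x) = a * exp(b * x)."""
    return np.exp(-(a / b) * np.expm1(b * x))

# Illustrative (made-up) parameter values
a, b = 0.02, 0.05

# Mean adult life span (expected years lived past age 15), obtained by
# trapezoidal integration of the survival function
x = np.linspace(0.0, 120.0, 12001)
s = gompertz_survival(x, a, b)
mean_lifespan_past_15 = float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(x)))

# Proportion of 15-year-olds surviving past age 45 (30 years later)
prop_over_45 = float(gompertz_survival(30.0, a, b))
```

The Strehler-Mildvan correlation constrains (a, b) jointly, which is what makes it usable as an informative prior over this two-parameter family.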
DEFF Research Database (Denmark)
Yura, Harold; Hanson, Steen Grüner
2012-01-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
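The two-step procedure described above can be sketched as follows, assuming a Gaussian-shaped target spectrum and an exponential target amplitude distribution (both choices are illustrative):

```python
import numpy as np
from scipy.special import ndtr  # standard normal CDF

rng = np.random.default_rng(1)
n = 256

# Step 1: color white Gaussian noise in the Fourier domain so that it
# acquires an assumed Gaussian-shaped target power spectral density.
white = rng.standard_normal((n, n))
f = np.fft.fftfreq(n)
kx, ky = np.meshgrid(f, f)
psd = np.exp(-(kx**2 + ky**2) / (2 * 0.05**2))   # illustrative target PSD
colored = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
colored = (colored - colored.mean()) / colored.std()

# Step 2: map the colored Gaussian field through the Gaussian CDF to
# Uniform(0, 1), then through the inverse CDF of the desired amplitude
# distribution (here Exponential(1)).
u = ndtr(colored)
target = -np.log1p(-u)
```

The pointwise (memoryless) transform in step 2 perturbs the spectrum somewhat, which is why the text describes the method as an engineering approach rather than an exact one.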
Directory of Open Access Journals (Sweden)
X. Shen
2018-01-01
In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated using the NanoMap method, with particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets used were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100–200 km. The difference in the horizontal spatial distribution of new particle source areas at the different sites was connected to typical meteorological conditions at the sites. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with a higher particle growth rate (GR) and new particle formation rate (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the dataset of PNSD measurements but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary in the further study of the spatial extent and the probability of NPF events. The spatial
Lee, Sunghee; Liu, Mingnan; Hu, Mengyao
2017-01-01
Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals’ time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying “I don’t know” item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research. PMID:28781381
Panaretos, John
1989-01-01
In this paper, a probability model leading to a Yule distribution is developed in the study of surname frequency data. This distribution, suitably truncated, is fitted to actual data as an alternative to the discrete Pareto distribution, with quite satisfactory results.
On the probability distribution of daily streamflow in the United States
Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.
2017-01-01
Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
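The construction of an empirical FDC and its comparison with a fitted distribution can be sketched as follows; the synthetic flows and the lognormal choice are illustrative (SciPy also provides the four-parameter kappa family, found to work best overall, as stats.kappa4):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Ten years of synthetic "daily streamflow"; the lognormal is one of the
# three-parameter families the study found adequate in some US regions.
flows = rng.lognormal(mean=2.0, sigma=1.0, size=3650)

# Empirical flow duration curve: flows sorted in decreasing order against
# their exceedance probabilities (Hazen plotting position).
sorted_flows = np.sort(flows)[::-1]
exceed_prob = (np.arange(1, flows.size + 1) - 0.5) / flows.size

# Modeled FDC: quantiles of a fitted lognormal at the same exceedance levels
shape, loc, scale = stats.lognorm.fit(flows, floc=0.0)
modeled = stats.lognorm.ppf(1.0 - exceed_prob, shape, loc=loc, scale=scale)
```

Plotting `modeled` against `sorted_flows` over `exceed_prob` is the standard visual check of how well a candidate distribution reproduces the FDC.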
Probability distribution of turbulence in curvilinear cross section mobile bed channel.
Sharma, Anurag; Kumar, Bimlesh
2016-01-01
The present study investigates the probability density functions (PDFs) of two-dimensional turbulent velocity fluctuations, Reynolds shear stress (RSS) and conditional RSSs in a threshold channel, obtained by using the Gram-Charlier (GC) series. The GC series expansion has been used up to moments of order four to include skewness and kurtosis. Experiments were carried out in a curvilinear cross section sand bed channel at threshold condition with a uniform sand size of d50 = 0.418 mm. The results show that the PDF distributions of turbulent velocity fluctuations and RSS calculated theoretically from the GC series expansion agree well with the PDFs obtained from the experimental data. The PDF distributions of conditional RSSs related to ejections and sweeps are well represented by the GC series exponential distribution, except for a slight departure of inward and outward interactions, which may be due to weaker events. This paper offers some new insights into the probabilistic mechanism of sediment transport, which can be helpful in sediment management and the design of curvilinear cross section mobile bed channels.
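A fourth-order Gram-Charlier density of the kind used here can be written down directly; this is the standard textbook form for a standardized variable, not the paper's conditional-RSS variant:

```python
import numpy as np

def gram_charlier_pdf(x, skew=0.0, excess_kurt=0.0):
    """Gram-Charlier series PDF up to fourth order: the standard normal
    density corrected by probabilists' Hermite polynomials weighted by
    the skewness and excess kurtosis."""
    phi = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
    he3 = x**3 - 3.0 * x            # He3(x)
    he4 = x**4 - 6.0 * x**2 + 3.0   # He4(x)
    return phi * (1.0 + skew / 6.0 * he3 + excess_kurt / 24.0 * he4)
```

Because the Hermite corrections integrate to zero against the Gaussian weight, the series still integrates to one; for large skewness or kurtosis the truncated series can, however, go slightly negative in the tails.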
On the probability distribution of daily streamflow in the United States
Directory of Open Access Journals (Sweden)
A. G. Blum
2017-06-01
Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng
2013-01-01
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate dust storm definition, the joint probability distribution of severe dust storms was established using the observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
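A bivariate joint return period of the kind described can be sketched with a Gumbel copula; the copula family and the AND-type return period formula are illustrative choices, not necessarily those of the paper:

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF C(u, v); theta >= 1, with theta = 1 giving
    independence and larger theta stronger upper-tail dependence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu=1.0):
    """Return period of the AND event {X > x and Y > y}, where u and v
    are the marginal non-exceedance probabilities of x and y, and mu is
    the mean number of events per year (illustrative formulation)."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (mu * p_and)
```

With independent margins (theta = 1) and u = v = 0.9, the AND probability is 0.01, i.e. a 100-event return period; positive dependence (theta > 1) makes the joint exceedance more likely and shortens the return period.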
Directory of Open Access Journals (Sweden)
Han Liwei
2014-07-01
Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. Then an improved algorithm for cloud probability distribution density based on a backward cloud generator was proposed. This was used to effectively convert parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such qualitative description was expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experiment results proved that the proposed algorithm was feasible; through it, we could reveal the changing regularity of the piezometric tube's water level and detect seepage damage in the dam body.
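The standard one-dimensional backward cloud generator can be sketched as follows, using the common moment-based estimators for {Ex, En, He}; the paper's improved algorithm refines this baseline:

```python
import numpy as np

def backward_cloud(samples):
    """Backward cloud generator: recover the cloud model's numerical
    characteristics {Ex, En, He} from quantitative data (cloud drops)."""
    x = np.asarray(samples, dtype=float)
    ex = float(x.mean())                                   # expectation Ex
    en = float(np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - ex)))  # entropy En
    s2 = float(x.var(ddof=1))                              # sample variance
    he = float(np.sqrt(max(s2 - en**2, 0.0)))              # hyper-entropy He
    return ex, en, he
```

For purely Gaussian data He should come out near zero, since the sample variance then matches En squared; a clearly positive He signals the extra dispersion the cloud model attributes to fuzziness of the concept.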
Bazant, Zdenek P; Le, Jia-Liang; Bazant, Martin Z
2009-07-14
The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low. The theory for the strength cdf of quasibrittle structures is refined by deriving it from the fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related, and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined.
Binomial moments of the distance distribution and the probability of undetected error
Energy Technology Data Exchange (ETDEWEB)
Barg, A. [Lucent Technologies, Murray Hill, NJ (United States). Bell Labs.; Ashikhmin, A. [Los Alamos National Lab., NM (United States)
1998-09-01
In [1] K.A.S. Abdel-Ghaffar derives a lower bound on the probability of undetected error for unrestricted codes. The proof relies implicitly on the binomial moments of the distance distribution of the code. The authors use the fact that these moments count the size of subcodes of the code to give a very simple proof of the bound in [1] by showing that it is essentially equivalent to the Singleton bound. They discuss some combinatorial connections revealed by this proof. They also discuss some improvements of this bound. Finally, they analyze asymptotics. They show that an upper bound on the undetected error exponent that corresponds to the bound of [1] improves known bounds on this function.
Molecular clouds have power-law probability distribution functions (not log-normal)
Alves, Joao; Lombardi, Marco; Lada, Charles
2015-08-01
We investigate the shape of the probability distribution of column densities (PDF) in molecular clouds. Through the use of low-noise, extinction-calibrated Planck-Herschel emission data for eight molecular clouds, we demonstrate that, contrary to common belief, the PDFs of molecular clouds are not described well by log-normal functions, but are instead power laws with exponents close to two and with breaks between AK ≃ 0.1 and 0.2 mag, close to the CO self-shielding limit and not far from the transition between molecular and atomic gas. Additionally, we argue that the intrinsic functional form of the PDF cannot be securely determined below AK ≃ 0.1 mag, limiting our ability to investigate more complex models for the shape of the cloud PDF.
Rupture of single receptor-ligand bonds: a new insight into probability distribution function.
Gupta, V K
2013-01-01
Single molecule force spectroscopy is widely used to determine kinetic parameters of dissociation by analyzing bond rupture data obtained via applying mechanical force to cells, capsules, and beads that are attached to an intermolecular bond. The current analysis assumes that the intermolecular bond force is equal to the externally applied mechanical force. We confirm that viscous drag alone or in combination with cellular deformation resulting in viscoelasticity modulates bond force so that the instantaneous intermolecular bond force is not equivalent to the applied force. The bond force modulation leads to bond rupture time and force histograms that differ from those predicted by probability distribution function (PDF) using the current approach. A new methodology that accounts for bond force modulation in obtaining PDF is presented. The predicted histograms from the new methodology are in excellent agreement with the respective histograms obtained from Monte Carlo simulation. Copyright © 2012 Elsevier B.V. All rights reserved.
Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.
2012-01-01
1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient r package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M
2017-02-01
Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the source PDFs. The metrics were evaluated and demonstrated correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
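The Jensen-Shannon machinery underlying such stability metrics can be sketched as follows; this simplified version measures each source's distance to the pooled mean distribution rather than projecting the paper's full simplex structure:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def source_outlyingness(pdfs):
    """For each source PDF (rows, on a common support), return its
    Jensen-Shannon distance (base 2, bounded in [0, 1]) to the pooled
    mean distribution -- a simplified stand-in for a latent central
    distribution."""
    pdfs = np.asarray(pdfs, dtype=float)
    central = pdfs.mean(axis=0)          # pooled "central" distribution
    return np.array([jensenshannon(p, central, base=2) for p in pdfs])
```

Identical sources score zero; maximally disjoint sources approach the upper bound, giving a bounded, comparable degree of dissimilarity per source.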
Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.
2016-12-01
Continuous over-exploitation of groundwater causes dramatic drawdown and leads to regional land subsidence in the Huairou Emergency Water Resources region, which is located in the upper-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of the strata's lithofacies in the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of the strata's lithofacies. The transition probability geostatistics approach provides potential for characterizing the distribution of heterogeneous lithofacies in the subsurface. Combining the thickness of the clay layer extracted from the simulation with the deformation field acquired from PS-InSAR technology, the influence of the strata's lithofacies on land subsidence can be analyzed quantitatively. The strata's lithofacies derived from borehole data were generalized into four categories and their probability distribution in the observed space was mined by using transition probability geostatistics, of which clay was the predominant compressible material. Geologically plausible realizations of the lithofacies distribution were produced, accounting for complex heterogeneity in the alluvial plain. At a particular probability level of more than 40 percent, the volume of clay defined was 55 percent of the total volume of the strata's lithofacies. This level, nearly equaling the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and uncompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Some similarities of patterns were indicated between the spatial distribution of the deformation field and the clay layer. In the area with
Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2014-01-01
Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
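The adolescent height-and-weight question in the abstract reduces to a conditional probability under a bivariate normal model: conditioning on height gives a univariate normal for weight with a shifted mean and reduced variance. A minimal sketch, with illustrative parameter values (not the actual SOCR dataset estimates):

```python
from scipy.stats import norm

# Assumed bivariate-normal parameters for (weight W, height H):
mu_h, sd_h = 65.0, 3.0      # height, inches
mu_w, sd_w = 130.0, 15.0    # weight, pounds
rho = 0.5                   # correlation

def p_weight_given_height(lo, hi, h):
    """P(lo <= W <= hi | H = h) for a bivariate normal (W, H)."""
    cond_mu = mu_w + rho * sd_w / sd_h * (h - mu_h)   # conditional mean
    cond_sd = sd_w * (1.0 - rho ** 2) ** 0.5          # conditional std dev
    return norm.cdf(hi, cond_mu, cond_sd) - norm.cdf(lo, cond_mu, cond_sd)

# P(120 <= weight <= 140 | height is average):
p = p_weight_given_height(120.0, 140.0, mu_h)
```

With these assumed parameters the conditional distribution at average height is N(130, 15·√(1−ρ²)), so the interval probability is just a difference of two normal CDF values.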
Liu, Zhaoyan; Vaughan, Mark A.; Winker, David M.; Hostetler, Chris A.; Poole, Lamont R.; Hlavka, Dennis; Hart, William; McGill, Matthew
2004-01-01
In this paper we describe the algorithm that will be used during the upcoming Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission for discriminating between clouds and aerosols detected in two-wavelength backscatter lidar profiles. We first analyze single-test and multiple-test classification approaches based on one-dimensional and multiple-dimensional probability density functions (PDFs) in the context of a two-class feature identification scheme. From these studies we derive an operational algorithm based on a set of 3-dimensional probability distribution functions characteristic of clouds and aerosols. A dataset acquired by the Cloud Physics Lidar (CPL) is used to test the algorithm. Comparisons are conducted between the CALIPSO algorithm results and the CPL data product. The results obtained show generally good agreement between the two methods. However, of a total of 228,264 layers analyzed, approximately 5.7% are classified as different types by the CALIPSO and CPL algorithms. This disparity is shown to be due largely to the misclassification of clouds as aerosols by the CPL algorithm. The use of 3-dimensional PDFs in the CALIPSO algorithm is found to significantly reduce this type of error. Dust presents a special case. Because the intrinsic scattering properties of dust layers can be very similar to those of clouds, additional algorithm testing was performed using an optically dense layer of Saharan dust measured during the Lidar In-space Technology Experiment (LITE). In general, the method is shown to distinguish reliably between dust layers and clouds. The relatively few erroneous classifications occurred most often in the LITE data, in those regions of the Saharan dust layer where the optical thickness was the highest.
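The core of a PDF-based two-class identification scheme can be sketched in one dimension: given class-conditional PDFs of a measured attribute, classify each layer by the larger posterior. This toy uses a single Gaussian attribute with made-up parameters; the operational CALIPSO scheme uses empirically derived 3-dimensional PDFs over layer-mean backscatter, colour ratio and altitude.

```python
from scipy.stats import norm

# Hypothetical 1-D class-conditional PDFs of one attribute (illustrative
# means/widths; the real algorithm uses empirical 3-D PDFs).
cloud_pdf = norm(loc=1.0, scale=0.3)
aerosol_pdf = norm(loc=0.4, scale=0.2)

def classify(x, p_cloud=0.5):
    """Two-class Bayes rule: pick the class with the larger posterior."""
    post_cloud = cloud_pdf.pdf(x) * p_cloud
    post_aerosol = aerosol_pdf.pdf(x) * (1.0 - p_cloud)
    return "cloud" if post_cloud > post_aerosol else "aerosol"
```

Moving to multi-dimensional PDFs replaces the scalar `pdf(x)` evaluations with lookups in multi-dimensional probability tables, which is what reduces the dust/cloud confusion discussed in the abstract.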
Singh, Nagendra Pratap; Srivastava, Rajeev
2016-06-01
Retinal blood vessel segmentation is a prominent task in the diagnosis of various retinal pathologies such as hypertension, diabetes, glaucoma, etc. In this paper, a novel matched filter approach with the Gumbel probability distribution function as its kernel is introduced to improve the performance of retinal blood vessel segmentation. Before applying the proposed matched filter, the input retinal images are pre-processed. During the pre-processing stage, principal component analysis (PCA) based grayscale conversion followed by contrast limited adaptive histogram equalization (CLAHE) is applied for better enhancement of the retinal image. After that, exhaustive experiments were conducted to select appropriate parameter values for designing the new matched filter. The post-processing steps after applying the proposed matched filter include entropy-based optimal thresholding and length filtering to obtain the segmented image. For evaluating the performance of the proposed approach, the quantitative performance measures of average accuracy, average true positive rate (ATPR), and average false positive rate (AFPR) are calculated. The respective values of these measures are 0.9522, 0.7594, 0.0292 for the DRIVE data set and 0.9270, 0.7939, 0.0624 for the STARE data set. To justify the effectiveness of the proposed approach, the receiver operating characteristic (ROC) curve is plotted and the average area under the curve (AUC) is calculated. The average AUC for the DRIVE and STARE data sets is 0.9287 and 0.9140, respectively. The experimental results confirm that the proposed approach performs better than other prominent Gaussian distribution function and Cauchy PDF based matched filter approaches. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
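A matched-filter kernel built from a probability density is just the density sampled on a grid and shifted to zero mean (so a flat background produces no response). This sketch builds a 1-D Gumbel-PDF kernel with assumed location/scale parameters; the paper selects its parameters experimentally and applies the kernel at multiple orientations in 2-D.

```python
import numpy as np

def gumbel_kernel(length=15, mu=0.0, beta=2.0):
    """1-D matched-filter kernel from the Gumbel PDF.

    `length`, `mu`, `beta` are illustrative assumptions, not the paper's
    experimentally tuned values.
    """
    x = np.arange(length) - length // 2
    z = (x - mu) / beta
    pdf = np.exp(-(z + np.exp(-z))) / beta     # Gumbel density
    return pdf - pdf.mean()                    # zero-mean kernel

k = gumbel_kernel()
```

Correlating the image with such a kernel (rotated over a set of orientations, taking the maximum response) yields the vessel-enhanced image that the entropy-based thresholding step then segments.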
Teaching time-series analysis. II. Wave height and water surface elevation probability distributions
Whitford, Dennis J.; Waters, Jennifer K.; Vieira, Mario E. C.
2001-04-01
This paper describes the second of a two-part series of pedagogical exercises to introduce students to methods of time-series analysis. While these exercises are focused on the analysis of wind generated surface gravity waves, they are cross-disciplinary in nature and can be applied to other fields dealing with random signal analysis. Two computer laboratory exercises are presented which enable students to understand many of the facets of random signal analysis with less difficulty and more understanding than standard classroom instruction alone. The first pedagogical exercise, described in the previous article, uses mathematical software on which the students execute the manual arithmetic operations of a finite Fourier analysis on a complex wave record. The results are then compared to those obtained by a fast Fourier transform. This article, the second of this two-part pedagogical series, addresses analysis of a complex sea using observed and theoretical wave height and water surface elevation probability distributions and wave spectra. These results are compared to a fast Fourier transform analysis, thus providing a link back to the first exercise.
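A common theoretical model compared against observed wave-height records in such exercises is the Rayleigh distribution, which holds for narrow-banded random seas. The sketch below simulates a stand-in "observed" record and compares empirical and theoretical exceedance probabilities; the scale parameter is an assumed illustrative value.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0                        # Rayleigh scale parameter (assumed)
H = rng.rayleigh(sigma, 5000)      # stand-in for an observed wave-height record

def rayleigh_exceedance(h, s=sigma):
    """Theoretical P(H > h) for a Rayleigh distribution with scale s."""
    return np.exp(-h ** 2 / (2.0 * s ** 2))

# Empirical vs theoretical exceedance at a few reference heights:
heights = (1.0, 2.0, 3.0)
empirical = {h: (H > h).mean() for h in heights}
theoretical = {h: rayleigh_exceedance(h) for h in heights}
```

In a classroom setting the same comparison would be run on the measured surface-elevation record after extracting individual wave heights by zero-crossing analysis.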
Characterizing the Lyα forest flux probability distribution function using Legendre polynomials
Cieplak, Agnieszka M.; Slosar, Anže
2017-10-01
The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.
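The key identity is that the n-th Legendre coefficient of a PDF on [-1, 1] is c_n = (2n+1)/2 · E[P_n(X)], i.e. a linear combination of the first n moments, so the coefficients can be estimated directly from samples. A minimal sketch with a toy sample (a rescaled Beta draw standing in for the flux field):

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(1)

# Toy "flux" sample mapped onto [-1, 1]; stands in for pixel flux values.
x = rng.beta(2.0, 5.0, 100_000) * 2.0 - 1.0

def legendre_coeffs(sample, nmax):
    """Estimate c_n = (2n+1)/2 * E[P_n(X)] for n = 0..nmax from samples."""
    coeffs = []
    for n in range(nmax + 1):
        Pn = legendre.Legendre.basis(n)          # n-th Legendre polynomial
        coeffs.append((2 * n + 1) / 2.0 * Pn(sample).mean())
    return np.array(coeffs)

c = legendre_coeffs(x, 5)
# p(x) is then approximated by sum_n c[n] * P_n(x).
```

Because each `c[n]` is a sample mean of a polynomial in the data, its response to additive pixel noise can be worked out analytically, which is what makes marginalisation over noise and mean flux tractable in the papers' approach.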
Cieplak, Agnieszka; Slosar, Anze
2018-01-01
The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small-scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which lead to possibly biased estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, this allows the coefficients to be measured in the presence of noise and gives a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured, with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars with high signal-to-noise produces more recoverable information.
Directory of Open Access Journals (Sweden)
Jinhua Xu
Full Text Available Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.
Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan
2017-08-01
We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ -1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃ 7 % and ≃ 10 % at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ -1 of the IGM temperature-density relation with a precision of +/- 8.6 % at z = 3 and +/- 6.1 % at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.
Boots, Nam Kyoo; Shahabuddin, Perwez
2001-01-01
This paper deals with estimating small tail probabilities of the steady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into
Directory of Open Access Journals (Sweden)
O. E. Gudkova
2014-01-01
Full Text Available The reliability of distribution systems for the electric supply of consumers is considered as a multi-criteria function. For this reason, while developing an algorithm for determining the optimum reliability level of distribution networks, it is necessary to take into account the probabilistic character of changes in the corresponding indices. A mathematical model and algorithm have been developed for determining the optimum reliability level of electric supply systems, taking into account probabilistic changes in the reliability indices of the component elements.
Lanzi, R. James; Vincent, Brett T.
1993-01-01
The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.
Directory of Open Access Journals (Sweden)
Mark William Perlin
2015-01-01
Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
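The law-of-large-numbers behaviour of CPI⁻¹ is easy to see in a sketch: at each locus the probability of inclusion is the squared sum of the frequencies of alleles present in the mixture, and CPI is the product over loci, so CPI⁻¹ grows with the locus count regardless of how informative the match actually is. The per-locus frequency sums below are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(2)

def cpi(included_freq_sums):
    """Combined probability of inclusion: product over loci of PI = f^2,
    where f is the total frequency of alleles included at that locus."""
    pi = np.asarray(included_freq_sums) ** 2
    return pi.prod()

# Hypothetical per-locus totals of included-allele frequencies at 13 STR loci:
f = rng.uniform(0.3, 0.6, 13)
cpi_13 = cpi(f)        # full 13-locus panel
cpi_6 = cpi(f[:6])     # same mixture, fewer tested loci
# CPI^-1 (the reported "inclusion" statistic) is larger for more loci:
stat_13, stat_6 = 1.0 / cpi_13, 1.0 / cpi_6
```

Since every per-locus PI is below 1, adding loci always multiplies CPI by a factor below 1, so the reported CPI⁻¹ statistic inflates with panel size independently of the mixture's actual identification information.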
Bellemare, C.; Kroger, S.; van Soest, A.H.O.
2005-01-01
We combine the choice data of proposers and responders in the ultimatum game, their expectations elicited in the form of subjective probability questions, and the choice data of proposers ("dictator") in a dictator game to estimate a structural model of decision making under uncertainty. We use a
Li, Yanjun; Tang, Xiaoying; Wang, Ancong; Tang, Hui
2017-09-01
Atrial fibrillation (AF) monitoring and diagnosis require automatic AF detection methods. In this paper, a novel image-based AF detection method was proposed. The map was constructed by plotting changes of RR intervals (ΔRR) into grid panes. First, the map was divided into grid panes with 20 ms fixed resolution in the y-axis and 15-60 s step length in the x-axis. Next, the blank pane ratio (BPR), the entropy and the probability density distribution were processed using a linear support-vector machine (LSVM) to classify AF and non-AF episodes. The performance was evaluated based on four public physiological databases. The Cohen's Kappa coefficients were 0.87, 0.91 and 0.64 at 50 s step length for the long-term AF database, the MIT-BIH AF database and the MIT-BIH arrhythmia database, respectively. Best results were achieved as follows: (1) an accuracy of 93.7%, a sensitivity of 95.1%, a specificity of 92.0% and a positive predictive value (PPV) of 93.5% were obtained for the long-term AF database at 60 s step length. (2) An accuracy of 95.9%, a sensitivity of 95.3%, a specificity of 96.3% and a PPV of 94.1% were obtained for the MIT-BIH AF database at 40 s step length. (3) An accuracy of 90.6%, a sensitivity of 94.5%, a specificity of 90.0% and a PPV of 55.0% were achieved for the MIT-BIH arrhythmia database at 60 s step length. (4) Both accuracy and specificity were 96.0% for the MIT-BIH normal sinus rhythm database at 40 s step length. In conclusion, the intuitive grid map of ΔRR intervals offers a new approach to achieving comparable performance with previously published AF detection methods.
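The two headline features are simple to compute once ΔRR values are binned into panes: the blank pane ratio (BPR) is the fraction of empty panes, and the entropy measures how spread the occupancy distribution is. This sketch uses a simplified 1-D pane layout with assumed bin settings (the paper's map is 2-D, with a time axis); the synthetic "sinus" and "AF" records are illustrative.

```python
import numpy as np

def grid_features(rr_ms, pane_ms=20, n_panes=40):
    """BPR and entropy of the pane-occupancy distribution of ΔRR values.
    Pane width and count are illustrative assumptions."""
    drr = np.diff(rr_ms)                              # ΔRR series
    half = pane_ms * n_panes / 2
    edges = np.arange(-half, half + pane_ms, pane_ms)
    counts, _ = np.histogram(drr, bins=edges)
    bpr = (counts == 0).mean()                        # blank pane ratio
    p = counts[counts > 0] / counts.sum()
    entropy = -(p * np.log2(p)).sum()                 # occupancy entropy, bits
    return bpr, entropy

rng = np.random.default_rng(3)
sinus = 800 + rng.normal(0, 10, 300)    # regular rhythm: narrow ΔRR spread
af = 800 + rng.normal(0, 120, 300)      # irregular rhythm: wide ΔRR spread
bpr_sinus, ent_sinus = grid_features(sinus)
bpr_af, ent_af = grid_features(af)
```

AF episodes scatter ΔRR over many panes (low BPR, high entropy), while sinus rhythm concentrates them (high BPR, low entropy), which is what the LSVM then separates.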
Radtke, T.; Fritzsche, S.
2008-11-01
, quantum information science has contributed to our understanding of quantum mechanics and has also provided new and efficient protocols, based on the use of entangled quantum states. To determine the behavior and entanglement of n-qubit quantum registers, symbolic and numerical simulations need to be applied in order to analyze how these quantum information protocols work and which role the entanglement plays hereby. Solution method: Using the computer algebra system Maple, we have developed a set of procedures that support the definition, manipulation and analysis of n-qubit quantum registers. These procedures also help to deal with (unitary) logic gates and (nonunitary) quantum operations that act upon the quantum registers. With the parameterization of various frequently-applied objects that are implemented in the present version, the program now facilitates a wider range of symbolic and numerical studies. All commands can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems, both in ideal and noisy quantum circuits. Reasons for new version: In the first version of the FEYNMAN program [1], we implemented the data structures and tools that are necessary to create, manipulate and to analyze the state of quantum registers. Later [2,3], support was added to deal with quantum operations (noisy channels) as an ingredient which is essential for studying the effects of decoherence. With the present extension, we add a number of parametrizations of objects frequently utilized in decoherence and entanglement studies, such as Hermitian and unitary matrices, probability distributions, or various kinds of quantum states. This extension therefore provides the basis, for example, for the optimization of a given function over the set of pure states or the simple generation of random objects. 
Running time: Most commands that act upon quantum registers with five or less qubits take ⩽10 seconds of processor time on a Pentium 4 processor
Lee, Moon Ho; Dudin, Alexander; Shaban, Alexy; Pokhrel, Subash Shree; Ma, Wen Ping
Formulae required for the accurate approximate calculation of transition probabilities of the embedded Markov chain for single-server queues of the GI/M/1, GI/M/1/K, M/G/1, M/G/1/K type with a heavy-tailed lognormal distribution of inter-arrival or service time are given.
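For the GI/M/1 case, the standard building block of the embedded chain is p_k, the probability that k customers complete service during one inter-arrival time: p_k = ∫ e^{-μt} (μt)^k / k! dA(t). A numerical sketch with a lognormal inter-arrival distribution (the service rate and lognormal parameters are assumed illustrative values):

```python
import numpy as np
from scipy import integrate, stats

mu = 1.0                                   # service rate (assumed)
A = stats.lognorm(s=1.5, scale=1.0)        # heavy-tailed lognormal inter-arrival

def p_served(k):
    """p_k = integral of Poisson(k; mu*t) against the inter-arrival density."""
    integrand = lambda t: stats.poisson.pmf(k, mu * t) * A.pdf(t)
    val, _ = integrate.quad(integrand, 0.0, 500.0, limit=500)
    return val

probs = [p_served(k) for k in range(30)]
```

The one-step transition probabilities of the embedded chain are then assembled from the `p_k` (e.g. moving from i+1 to i+1-k customers when k ≤ i+1 are served between arrivals); the cited work is about replacing this brute-force integration with accurate approximate formulae.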
Energy Technology Data Exchange (ETDEWEB)
O' Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-10-27
The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.
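The basic Monte Carlo idea behind such a PDF calculation can be sketched with a zero-dimensional branching process: each neutron reacts after an exponential lifetime and is either captured or induces fission producing new neutrons, and repeated histories build up the PDF of the neutron number at a census time. This toy is not the LANL code; all rates and multiplicities are assumed illustrative values.

```python
import numpy as np

rng = np.random.default_rng(4)

def history(t_final, rate=1.0, p_fission=0.4, nu=2):
    """One Monte Carlo history: follow every neutron from one source
    neutron at t=0; return how many survive to the census time t_final."""
    stack = [0.0]          # birth times of neutrons still to be followed
    count = 0
    while stack:
        t = stack.pop() + rng.exponential(1.0 / rate)   # reaction time
        if t >= t_final:
            count += 1                     # survives to the census time
        elif rng.random() < p_fission:
            stack.extend([t] * nu)         # fission: nu daughters born at t
        # otherwise: capture, the neutron disappears
    return count

counts = np.array([history(2.0) for _ in range(2000)])
pdf = np.bincount(counts) / counts.size    # empirical PDF of neutron number
```

Moments (mean, variance, ...) follow directly from the tallied `counts`; tallying fissions per history instead would give the cumulative fission-number PDF mentioned in the abstract.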
Boers, A M M; Berkhemer, O A; Slump, C H; van Zwam, W H; Roos, Y B W E M; van der Lugt, A; van Oostenbrugge, R J; Yoo, A J; Dippel, D W J; Marquering, H A; Majoie, C B L M
2017-05-01
Since proof emerged that IA treatment (IAT) is beneficial for patients with acute ischemic stroke, it has become the standard method of care. Despite these positive results, recovery to functional independence is established in only about one-third of treated patients. The effect of IAT is commonly assessed by functional outcome, whereas its effect on brain tissue salvage is considered a secondary outcome measure (at most). Because patient and treatment selection needs to be improved, understanding the treatment effect on brain tissue salvage is of utmost importance. Our aim is to introduce infarct probability maps to estimate the location and extent of tissue damage based on patient baseline characteristics and treatment type. Cerebral infarct probability maps were created by combining automatically segmented infarct distributions using follow-up CT images of 281 patients from the MR CLEAN trial. Comparison of infarct probability maps allows visualization and quantification of probable treatment effects. Treatment impact was calculated for 10 Alberta Stroke Program Early CT Score (ASPECTS) and 27 anatomical regions. The insular cortex had the highest infarct probability in both control and IAT populations (47.2% and 42.6%, respectively). Comparison showed significantly lower infarct probability in 4 ASPECTS and 17 anatomical regions in favor of IAT. Most salvaged tissue was found within the ASPECTS M2 region, which was 8.5% less likely to infarct. Probability maps intuitively visualize the topographic distribution of infarct probability due to treatment, which makes it a promising tool for estimating the effect of treatment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
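Once binary infarct segmentations are registered to a common space, the voxelwise probability map is simply the per-voxel mean of the masks, and region-level treatment impact is a difference of region-averaged probabilities between groups. A toy sketch with random stand-in masks (the shapes, probabilities and region are made up):

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-ins for 281 registered binary infarct masks on a toy 16x16 grid;
# each voxel is "infarcted" with probability 0.3 in this synthetic example.
masks = rng.random((281, 16, 16)) < 0.3

# Voxelwise infarct probability = fraction of patients infarcted per voxel.
prob_map = masks.mean(axis=0)

# Region-level summary (a hypothetical anatomical region):
region = np.zeros((16, 16), dtype=bool)
region[4:8, 4:8] = True
region_prob = prob_map[region].mean()
```

In the study, two such maps (control vs IAT) are built and compared per ASPECTS/anatomical region, so "treatment impact" is the drop in region probability from the control map to the treated map.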
Fry, J. N.
1985-01-01
Knowledge of the N-point correlation functions for all N allows one to invert and obtain the probability distribution of mass fluctuations in a fixed volume. A hierarchical sequence of higher-order correlations, with dimensionless amplitudes suggested by the BBGKY equations, is applied. The resulting distribution is significantly non-Gaussian, even for quite small mean square fluctuations. The qualitative and, to some degree, quantitative results are largely independent of the exact sequence of amplitudes. An ensemble of such models compared with N-body simulations fails in detail to account for the low-density frequency distribution.
Brosius, J
2015-01-01
This paper presents a completely new method for the calculation of expectations (and thus joint probability distributions) of structure factors or phase invariants. As an example, a first approximation of the expectation of the triplet invariant (up to a constant) is given and a complex number is obtained. Instead of considering the atomic vector positions or reciprocal vectors as the fundamental random variables, the method samples over all functions (distributions) with a given number of atoms and given Patterson function. The aim of this paper was to explore the feasibility of the method, so the easiest problem was chosen: the calculation of the expectation value of the triplet invariant in P1. Calculation of the joint probability distribution of the triplet is not performed here but will be done in the future.
Directory of Open Access Journals (Sweden)
Panpan Zhao
2017-05-01
Full Text Available This study investigates the sensitivity and uncertainty of hydrological drought frequencies and severity in the Weihe Basin, China during 1960–2012, by using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate model, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity at the three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, with an increase of the return period, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model hydrological drought frequency and severity. The results of this study can not only help drought investigations select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.
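Given fitted marginals, the copula supplies the joint probabilities from which "and"/"or" return periods are computed. A sketch with the Gumbel-Hougaard Archimedean copula, C(u, v) = exp(−[(−ln u)^θ + (−ln v)^θ]^{1/θ}); the dependence parameter θ and the marginal probabilities are assumed illustrative values.

```python
import numpy as np

def gumbel_copula(u, v, theta=2.5):
    """Gumbel-Hougaard copula; theta >= 1 controls upper-tail dependence
    (theta here is an assumed illustrative value)."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Marginal non-exceedance probabilities of a drought's duration D and
# severity S (from the fitted univariate distributions; values assumed):
u, v = 0.9, 0.9
p_and = 1.0 - u - v + gumbel_copula(u, v)   # P(D > d AND S > s)
p_or = 1.0 - gumbel_copula(u, v)            # P(D > d OR S > s)

# With on average one drought event per year, return periods in years:
T_and = 1.0 / p_and
T_or = 1.0 / p_or
```

The co-occurrence ("and") return period is always longer than the joint ("or") one, and both depend on the chosen copula family and θ, which is exactly the sensitivity the study quantifies.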
DEFF Research Database (Denmark)
Helles, Glennie; Fonseca, Rasmus
2009-01-01
residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary...
Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A
2016-04-01
A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence, even if very low, are represented and maintained.
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
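The idea behind such a formula can be sketched for the simplest case, a single-interval task with two equally likely classes: the optimal observer responds with whichever class has the higher likelihood, so the maximum proportion correct is Pc_max = 0.5 · ∫ max(p0(x), p1(x)) dx. The sketch below evaluates this on a dense grid for a Gaussian example (not the article's MATLAB code; d' = 1 is an assumed example value).

```python
import numpy as np
from scipy.stats import norm

# Class-conditional densities of the sensory variable (Gaussian example,
# equal variance, d' = 1 between the means).
x = np.linspace(-10.0, 11.0, 200_001)
dx = x[1] - x[0]
p0 = norm.pdf(x, 0.0, 1.0)
p1 = norm.pdf(x, 1.0, 1.0)

# Maximum-likelihood observer, equal priors:
# Pc_max = 0.5 * integral of the pointwise larger density.
pc_max = 0.5 * np.maximum(p0, p1).sum() * dx
```

For equal-variance Gaussians this reproduces the classical closed form Pc_max = Φ(d′/2); the same grid computation works unchanged for arbitrary (non-Gaussian) densities, which is the article's point, provided the sampling resolution is fine enough.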
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.
2018-01-01
This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameters probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of considered modes in our context. In this respect, a mode selection preprocessing step is proposed. It allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared. The first one is based on the context of the study whereas the second one is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.
Williams, Michael S; Cao, Yong; Ebel, Eric D
2013-07-15
Levels of pathogenic organisms in food and water have steadily declined in many parts of the world. A consequence of this reduction is that the proportion of samples that test positive for the most contaminated product-pathogen pairings has fallen to less than 0.1. While this is unequivocally beneficial to public health, datasets with very few enumerated samples present an analytical challenge because a large proportion of the observations are censored values. One application of particular interest to risk assessors is the fitting of a statistical distribution function to datasets collected at some point in the farm-to-table continuum. The fitted distribution forms an important component of an exposure assessment. A number of studies have compared different fitting methods and proposed lower limits on the proportion of samples where the organisms of interest are identified and enumerated, with the recommended lower limit of enumerated samples being 0.2. This recommendation may not be applicable to food safety risk assessments for a number of reasons, which include the development of new Bayesian fitting methods, the use of highly sensitive screening tests, and the generally larger sample sizes found in surveys of food commodities. This study evaluates the performance of a Markov chain Monte Carlo fitting method when used in conjunction with a screening test and enumeration of positive samples by the Most Probable Number technique. The results suggest that levels of contamination for common product-pathogen pairs, such as Salmonella on poultry carcasses, can be reliably estimated with the proposed fitting method and sample sizes in excess of 500 observations. The results do, however, demonstrate that simple guidelines for this application, such as a minimum proportion of positive samples, cannot be provided. Published by Elsevier B.V.
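The core difficulty described here — fitting a concentration distribution when many observations are censored at a detection limit — can be sketched with a simplified frequentist analogue of the paper's Bayesian fitting: a normal model on the log10 scale with a censored-data likelihood, maximized by grid search rather than MCMC. All parameter values are made up for illustration.

```python
import math
import random

random.seed(1)

# Simulate log10 concentrations; values below the limit of detection (LOD)
# are only known to be censored (hypothetical parameters for illustration).
MU, SIGMA, LOD, N = 0.5, 1.0, 0.0, 600
obs = [random.gauss(MU, SIGMA) for _ in range(N)]
data = [(x, x >= LOD) for x in obs]  # (value, detected?)

def norm_cdf(z):
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

def loglik(mu, sigma):
    ll = 0.0
    for x, detected in data:
        if detected:   # density contribution for an enumerated sample
            z = (x - mu) / sigma
            ll += -math.log(sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
        else:          # censored: probability of falling below the LOD
            ll += math.log(norm_cdf((LOD - mu) / sigma))
    return ll

# Coarse grid search for the censored-data maximum-likelihood estimate
best = max(((loglik(m / 20.0, s / 20.0), m / 20.0, s / 20.0)
            for m in range(-20, 41) for s in range(10, 41)),
           key=lambda t: t[0])
_, mu_hat, sigma_hat = best
```

Even with roughly a third of the observations censored, the censored likelihood recovers both parameters; a Bayesian MCMC treatment replaces the grid search with posterior sampling.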
DEFF Research Database (Denmark)
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
2010-01-01
Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
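The Metropolis algorithm mentioned above can be illustrated on a toy bimodal "posterior"; the target below is an arbitrary stand-in, not a geophysical inverse problem.

```python
import math
import random

random.seed(0)

# Unnormalized log-target: proportional to exp(-(x**2 - 1)**2), bimodal with
# modes near x = -1 and x = +1.
def log_target(x):
    return -(x * x - 1.0) ** 2

def metropolis(n_steps, step=0.8, x0=0.0):
    x, samples = x0, []
    for _ in range(n_steps):
        prop = x + random.uniform(-step, step)   # symmetric random-walk proposal
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop                              # accept; otherwise keep x
        samples.append(x)
    return samples

samples = metropolis(50_000)
mean = sum(samples) / len(samples)                     # symmetric target, so near 0
frac_positive = sum(s > 0 for s in samples) / len(samples)
```

Because the chain only evaluates the unnormalized target, the same loop works for arbitrarily complex priors and forward models, which is the appeal for non-linear inverse problems.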
Zhu, Jun; Eickhoff, Jens C; Kaiser, Mark S
2003-12-01
Beta-binomial models are widely used for overdispersed binomial data, with the binomial success probability modeled as following a beta distribution. The number of binary trials in each binomial is assumed to be nonrandom and unrelated to the success probability. In many behavioral studies, however, binomial observations demonstrate more complex structures. In this article, a general beta-binomial-Poisson mixture model is developed, to allow for a relation between the number of trials and the success probability for overdispersed binomial data. An EM algorithm is implemented to compute both the maximum likelihood estimates of the model parameters and the corresponding standard errors. For illustration, the methodology is applied to study the feeding behavior of green-backed herons in two southeastern Missouri streams.
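The beta-binomial setup described in this abstract can be sketched by simulation; the parameter values are illustrative, and the closed-form moment formulas (rather than the article's EM algorithm) are used to check the overdispersion.

```python
import random
import statistics

random.seed(2)

# Overdispersed binomial data: success probability p_i ~ Beta(a, b),
# then k_i ~ Binomial(n, p_i). Parameters are illustrative only.
a, b, n, m = 2.0, 3.0, 20, 5000

def beta_binomial_draw():
    p = random.betavariate(a, b)
    return sum(random.random() < p for _ in range(n))

ks = [beta_binomial_draw() for _ in range(m)]
mean_k = statistics.fmean(ks)
var_k = statistics.variance(ks)

# Theoretical beta-binomial moments
p_bar = a / (a + b)
rho = 1.0 / (a + b + 1.0)                       # overdispersion parameter
theo_mean = n * p_bar
theo_var = n * p_bar * (1.0 - p_bar) * (1.0 + (n - 1) * rho)
```

The sample variance far exceeds the plain binomial value n*p_bar*(1-p_bar), which is exactly the overdispersion that motivates the beta-binomial model.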
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
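The binomial probability function, mean, and standard deviation covered by such introductory texts can be computed directly:

```python
import math

# Binomial pmf: P(K = k) = C(n, k) * p**k * (1 - p)**(n - k)
def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

n, p = 10, 0.3
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

total = sum(pmf)                      # pmf sums to 1
mean = n * p                          # 3.0
sd = math.sqrt(n * p * (1.0 - p))     # sqrt(2.1)
```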
Zubdah-e-Noor; Athar, Haseeb
2014-01-01
In this paper, two general classes of distributions have been characterized through conditional expectation of power of difference of two record statistics. Further, some particular cases and examples are also discussed.
TOKUNAGA, EIJI
1990-01-01
The "geomorphic temperature" and the "partition function" of a landform unit are defined by assuming the canonical distribution of the potential energies of material forming the surface of the landform unit. These two quantities provide the variables analogous to the internal energy, the entropy, and the free energy in the thermodynamic system. The area-altitude distributions of the earth surface and the four Japanese main islands imply that landform units at the equilibrium state satisfy the...
On the probability distribution of stock returns in the Mike-Farmer model
Gu, G.-F.; Zhou, W.-X.
2009-02-01
Recently, Mike and Farmer constructed a very powerful and realistic behavioral model to mimic the dynamic process of stock price formation, based on the empirical regularities of order placement and cancelation in a purely order-driven market, which successfully reproduces the whole distribution of returns (not only the well-known power-law tails) together with several other important stylized facts. There are three key ingredients in the Mike-Farmer (MF) model: the long memory of order signs characterized by the Hurst index Hs, the distribution of relative order prices x in reference to the same best price described by a Student distribution (or Tsallis' q-Gaussian), and the dynamics of order cancelation. They showed that different values of the Hurst index Hs and of the degrees of freedom αx of the Student distribution can always produce power-law tails in the return distribution fr(r) with different tail exponents αr. In this paper, we study the origin of the power-law tails of the return distribution fr(r) in the MF model, based on extensive simulations with different combinations of the left part L(x) for x < 0 and the right part R(x) for x > 0 of fx(x). We find that power-law tails appear only when L(x) has a power-law tail, no matter whether R(x) has a power-law tail or not. In addition, we find that the distributions of returns in the MF model at different timescales can be well modeled by Student distributions, whose tail exponents are close to the well-known cubic law and increase with the timescale.
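The role of the Student distribution's degrees of freedom in producing heavy tails can be illustrated by simulation. This is not the MF model itself; df = 3 is chosen to echo the cubic law, and the comparison is against a unit Gaussian's tail mass.

```python
import math
import random

random.seed(3)

# Student-t variates via t = Z / sqrt(V/df), Z ~ N(0,1), V ~ chi-square(df).
# chi-square(df) is Gamma(df/2, scale=2).
def student_t(df):
    z = random.gauss(0.0, 1.0)
    v = random.gammavariate(df / 2.0, 2.0)
    return z / math.sqrt(v / df)

n = 200_000
returns = [student_t(3) for _ in range(n)]

# Two-sided tail frequency beyond |r| = 5: for t(3) this is about 1.5e-2,
# orders of magnitude above the Gaussian value of roughly 5.7e-7.
tail_frac = sum(abs(r) > 5.0 for r in returns) / n
```

The fat tail survives aggregation only slowly, which is consistent with the abstract's observation that tail exponents increase with the timescale.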
Ishikawa, Takeshi
1996-01-01
This article provides a method using probability papers for point and interval prediction of future order statistics, on the basis of Type II censored samples from the Weibull and Extreme-Value distributions. First, we propose a two-point predictor for the point prediction problem and study the problem of choosing plotting positions. Second, we give a method to construct prediction intervals for the future order statistics using the two-point predictor.
Ground impact probability distribution for small unmanned aircraft in ballistic descent
DEFF Research Database (Denmark)
La Cour-Harbo, Anders
2017-01-01
equally rapid. At the same time the trend in national and international regulations for unmanned aircraft is to take a risk-based approach, effectively requiring risk assessment for every flight operation. This work addresses the growing need for methods for quantitatively evaluating individual flights...... by modelling the consequences of a ballistic descent of an unmanned aircraft as a result of a major inflight incident. The presented model is a probability density function for the ground impact area based on a second order drag model with probabilistic assumptions on the least well-known parameters...
Holland, Frederic A., Jr.
2004-01-01
Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal
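The conventional three-point beta fit described above — mean and standard deviation taken from the minimum, most likely, and maximum values, with the standard deviation assumed to be one sixth of the range, as in the PERT convention — can be sketched as follows. The input values are made up, and this is the conventional method, not the NASA Glenn in-house variant.

```python
# Conventional (PERT-style) three-point fit of a four-parameter beta
# distribution on [lo, hi]: mean = (lo + 4*mode + hi)/6, sd = (hi - lo)/6,
# then invert the beta moment equations for the two shape parameters.
def beta_from_three_points(lo, mode, hi):
    mean = (lo + 4.0 * mode + hi) / 6.0
    sd = (hi - lo) / 6.0
    mu = (mean - lo) / (hi - lo)          # standardized mean on [0, 1]
    var = (sd / (hi - lo)) ** 2           # standardized variance
    common = mu * (1.0 - mu) / var - 1.0
    alpha, beta = mu * common, (1.0 - mu) * common
    return alpha, beta, mean, sd

# Hypothetical design parameter: min 10, most likely 14, max 22
alpha, beta, mean, sd = beta_from_three_points(10.0, 14.0, 22.0)
```

The recovered shape parameters reproduce the assumed standardized mean exactly, which is the consistency check that makes this inversion convenient in practice.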
Elizalde, E.; Gaztanaga, E.
1992-01-01
The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
Directory of Open Access Journals (Sweden)
Pradeep K. Goyal
2011-09-01
Full Text Available This paper presents a study of the probabilistic distribution of key cyclone parameters and of cyclonic wind speed, based on cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975–2007, with latitude/longitude and landfall locations, is used to map the cyclone tracks in the region of study. Statistical tests were performed to find a best-fit distribution to the track data for each cyclone parameter. These parameters include the central pressure difference, the radius of maximum wind speed, the translation velocity, and the track angle with respect to the site, and are used to generate digitally simulated cyclones using wind field simulation techniques. For this, different sets of values for all the key cyclone parameters are generated randomly from their probability distributions. Using these simulated values of the key cyclone parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and from the distributions of the key cyclone parameters published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful in cyclone disaster mitigation.
Multiparameter probability distributions for heavy rainfall modeling in extreme southern Brazil
Directory of Open Access Journals (Sweden)
Samuel Beskow
2015-09-01
New hydrological insights for the region: The Anderson–Darling and Filliben tests were the most restrictive in this study. Based on the Anderson–Darling test, it was found that the Kappa distribution presented the best performance, followed by the GEV. This finding provides evidence that these multiparameter distributions result, for the region of study, in greater accuracy for the generation of intensity–duration–frequency curves and the prediction of peak streamflows and design hydrographs. As a result, this finding can support the design of hydraulic structures and flood management in river basins.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think....... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
Smith, O. E.; Adelfang, S. I.
1981-01-01
A model of the largest gust amplitude and gust length is presented which uses the properties of the bivariate gamma distribution. The gust amplitude and length are strongly dependent on the filter function; the amplitude increases with altitude and is larger in winter than in summer.
The distribution of probability values in medical abstracts: an observational study.
Ginsel, Bastiaan; Aggarwal, Abhinav; Xuan, Wei; Harris, Ian
2015-11-26
A relatively high incidence of p values immediately below 0.05 (such as 0.047 or 0.04) compared to p values immediately above 0.05 (such as 0.051 or 0.06) has been noticed anecdotally in published medical abstracts. If p values immediately below 0.05 are over-represented, such a distribution may reflect the true underlying distribution of p values or may be due to error (a false distribution). If due to error, a consistent over-representation of p values immediately below 0.05 would be a systematic error due either to publication bias or to (overt or inadvertent) bias within studies. We searched the Medline 2012 database to identify abstracts containing a p value. Two thousand abstracts out of 80,649 were randomly selected. Two independent researchers extracted all p values. The p values were plotted and compared to a predicted curve. A chi-square test was used to test assumptions, and significance was set at 0.05. 2798 p value ranges and 3236 exact p values were reported. 4973 of these (82%) were significant (p < 0.05). The observed distribution of p values in reported medical abstracts provides evidence for systematic error in the reporting of p values. This may be due to publication bias, methodological errors (underpowering, selective reporting and selective analyses) or fraud.
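The comparison of counts immediately below versus immediately above 0.05 can be sketched with a chi-square test on two adjacent bins. The counts below are invented for illustration, not the study's data.

```python
import math

# Hypothetical counts of reported p values just below and just above 0.05
# (e.g. bins 0.040-0.050 vs 0.050-0.060). Under a smooth underlying
# distribution the two adjacent bins should hold roughly equal counts.
below, above = 380, 290
expected = (below + above) / 2.0

chi2 = ((below - expected) ** 2 / expected
        + (above - expected) ** 2 / expected)

# p value for chi-square with 1 degree of freedom via the error function:
# P(X > x) = erfc(sqrt(x/2)) for df = 1.
p_value = math.erfc(math.sqrt(chi2 / 2.0))
excess_significant = p_value < 0.05
```

A significant result on such a test is what would flag an over-representation of just-significant p values, consistent with the systematic error the abstract reports.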
Is extrapair mating random? On the probability distribution of extrapair young in avian broods
Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan
2007-01-01
A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometric, or Poisson). A review
Wilheit, Thomas T.; Chang, Alfred T. C.; Chiu, Long S.
1991-01-01
An algorithm for the estimation of monthly rain totals for 5 deg cells over the ocean from histograms of SSM/I brightness temperatures has been developed. There are three novel features to this algorithm. First, it uses knowledge of the form of the rainfall intensity probability density function to augment the measurements. Second, a linear combination of the 19.35 and 22.235 GHz channels has been employed to reduce the impact of variability of water vapor. Third, an objective technique has been developed to estimate the rain layer thickness from the 19.35- and 22.235-GHz brightness temperature histograms. Comparison with climatologies and the GATE radar observations suggests that the estimates are reasonable in spite of not having a beam-filling correction. By-products of the retrievals indicate that the SSM/I instrument noise level and calibration stability are quite good.
Linde, N.; Vrugt, J. A.
2009-04-01
Geophysical models are increasingly used in hydrological simulations and inversions, where they are typically treated as an artificial data source with known uncorrelated "data errors". The model appraisal problem in classical deterministic linear and non-linear inversion approaches based on linearization is often addressed by calculating model resolution and model covariance matrices. These measures offer only a limited potential to assign a more appropriate "data covariance matrix" for future hydrological applications, simply because the regularization operators used to construct a stable inverse solution bear a strong imprint on such estimates and because the non-linearity of the geophysical inverse problem is not explored. We present a parallelized Markov Chain Monte Carlo (MCMC) scheme to efficiently derive the posterior spatially distributed radar slowness and water content between boreholes given first-arrival traveltimes. This method is called DiffeRential Evolution Adaptive Metropolis (DREAM_ZS) with snooker updater and sampling from past states. Our inverse scheme does not impose any smoothness on the final solution, and uses uniform prior ranges of the parameters. The posterior distribution of radar slowness is converted into spatially distributed soil moisture values using a petrophysical relationship. To benchmark the performance of DREAM_ZS, we first apply our inverse method to a synthetic two-dimensional infiltration experiment using 9421 traveltimes contaminated with Gaussian errors and 80 different model parameters, corresponding to a model discretization of 0.3 m × 0.3 m. After this, the method is applied to field data acquired in the vadose zone during snowmelt. This work demonstrates that fully non-linear stochastic inversion can be applied with few limiting assumptions to a range of common two-dimensional tomographic geophysical problems. The main advantage of DREAM_ZS is that it provides a full view of the posterior distribution of spatially
Czech Academy of Sciences Publication Activity Database
Grim, Jiří
2017-01-01
Roč. 31, č. 9 (2017), č. článku 1750028. ISSN 0218-0014 R&D Projects: GA ČR GA17-18407S Institutional support: RVO:67985556 Keywords: multivariate statistics * product mixtures * naive Bayes models * EM algorithm * pattern recognition * neural networks * expert systems * image analysis Subject RIV: IN - Informatics, Computer Science Impact factor: 0.994, year: 2016 http://library.utia.cas.cz/separaty/2017/RO/grim-0475182.pdf
Nemeth, Noel
2013-01-01
Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software
Querci, F.; Kunde, V. G.; Querci, M.
1971-01-01
The basis and techniques are presented for generating opacity probability distribution functions for the CN molecule (red and violet systems) and the C2 molecule (Swan, Phillips, and Ballik-Ramsay systems), two of the more important diatomic molecules in the spectra of carbon stars, with a view to including these distribution functions in equilibrium model atmosphere calculations. Comparisons to the CO molecule are also shown. The computation of the monochromatic absorption coefficient uses the most recent molecular data, with revision of the oscillator strengths for some of the band systems. The total molecular stellar mass absorption coefficient is established through fifteen equations of molecular dissociation equilibrium to relate the distribution functions to each other on a per-gram-of-stellar-material basis.
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
Energy Technology Data Exchange (ETDEWEB)
Hannachi, A. [University of Reading, Department of Meteorology, Earley Gate, PO Box 243, Reading (United Kingdom)
2006-08-15
Robust tools are presented in this manuscript to assess changes in the probability density function (pdf) of climate variables. The approach is based on order statistics and aims at computing, along with their standard errors, changes in various quantiles and related statistics. The technique, which is nonparametric and simple to compute, is developed for both independent and dependent data. For autocorrelated data, serial correlation is addressed via Monte Carlo simulations using various autoregressive models. The ratio between the average standard errors, over several quantiles, of quantile estimates for correlated and independent data, is then computed. A simple scaling-law type relationship is found to hold between this ratio and the lag-1 autocorrelation. The approach has been applied to winter monthly Central England Temperature (CET) and North Atlantic Oscillation (NAO) time series from 1659 to 1999 to assess/quantify changes in various parameters of their pdf. For the CET, significant changes in median (or scale) and also in low and high quantiles are observed between various time slices, in particular between the pre- and post-industrial revolution. Observed changes in spread and also quartile skewness of the pdf, however, are not statistically significant (at 95% confidence level). For the NAO index we find mainly large significant changes in variance (or scale), yielding significant changes in low/high quantiles. Finally, the performance of the method compared to a few conventional approaches is discussed. (orig.)
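A bootstrap stand-in for the order-statistics approach — quantile estimates with standard errors from a single sample — can be sketched as follows. The data are a Gaussian toy sample, not the CET or NAO series, and the bootstrap replaces the paper's analytic standard errors.

```python
import random

random.seed(4)

# Nonparametric quantile estimate from sorted data
def quantile(xs, p):
    s = sorted(xs)
    return s[min(int(p * len(s)), len(s) - 1)]

data = [random.gauss(0.0, 1.0) for _ in range(2000)]
q90 = quantile(data, 0.9)          # true N(0,1) value is about 1.2816

# Bootstrap standard error of the 0.9-quantile estimate
boot = [quantile([random.choice(data) for _ in data], 0.9)
        for _ in range(200)]
m = sum(boot) / len(boot)
se_q90 = (sum((b - m) ** 2 for b in boot) / (len(boot) - 1)) ** 0.5
```

Comparing such quantile estimates, each with its standard error, between two time slices is the kind of change assessment the abstract describes; serially correlated data would inflate the standard error, as the paper quantifies.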
Directory of Open Access Journals (Sweden)
D Johan Kotze
Full Text Available Temporal variation in the detectability of a species can bias estimates of relative abundance if not handled correctly. For example, when effort varies in space and/or time it becomes necessary to take variation in detectability into account when data are analyzed. We demonstrate the importance of incorporating seasonality into the analysis of data with unequal sample sizes due to traps lost at a particular density of a species. A case study of count data was simulated using a spring-active carabid beetle. Traps were 'lost' randomly during high beetle activity in high-abundance sites and during low beetle activity in low-abundance sites. Five different models were fitted to datasets with different levels of loss. If sample sizes were unequal and a seasonality variable was not included in models that assumed the number of individuals was log-normally distributed, the models severely under- or overestimated the true effect size. Results did not improve when seasonality and number of trapping days were included in these models as offset terms, but only performed well when the response variable was specified as following a negative binomial distribution. Finally, if seasonal variation of a species is unknown, which is often the case, seasonality can be added as a free factor, resulting in well-performing negative binomial models. Based on these results we recommend that researchers (a) add sampling effort (number of trapping days in our example) to the models as an offset term; (b) if precise information is available on seasonal variation in detectability of a study object, add seasonality to the models as an offset term; (c) if information on seasonal variation in detectability is inadequate, add seasonality as a free factor; and (d) specify the response variable of count data as following a negative binomial or over-dispersed Poisson distribution.
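The effect described here — trap loss concentrated in high-activity weeks biasing naive abundance estimates, and its correction by a seasonality/effort offset — can be sketched with a toy Poisson simulation. The activity curve, rate, and trap numbers are invented, and a known activity curve stands in for the fitted seasonality term.

```python
import math
import random

random.seed(5)

def poisson(lam):
    # Knuth's method; adequate for the small means used here
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

# Hypothetical seasonal activity curve (10 weeks, spring-active species)
activity = [0.2, 0.6, 1.0, 1.0, 0.8, 0.5, 0.3, 0.2, 0.1, 0.1]
true_rate = 2.0       # mean catch per trap-week at peak activity 1.0
n_traps = 1000

# Traps 'lost' during the high-activity weeks, as in the simulation study
weeks_kept = [w for w in range(10) if activity[w] < 0.8]

counts = [poisson(true_rate * activity[w])
          for w in weeks_kept for _ in range(n_traps)]

# Naive per-trap-week average is biased low: the busy weeks are missing
naive = sum(counts) / len(counts)

# Offsetting by the summed seasonal activity recovers the underlying rate
offset_est = sum(counts) / (n_traps * sum(activity[w] for w in weeks_kept))
```

In a fitted model, the same correction enters as a log-offset (or a free seasonality factor) in a negative binomial regression, as the abstract recommends.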
Perrotta, A
2002-01-01
A Monte Carlo method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance, when different decay branchings, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example.
Using Perturbed Underdamped Langevin Dynamics to Efficiently Sample from Probability Distributions
Duncan, A. B.; Nüsken, N.; Pavliotis, G. A.
2017-12-01
In this paper we introduce and analyse Langevin samplers that consist of perturbations of the standard underdamped Langevin dynamics. The perturbed dynamics is such that its invariant measure is the same as that of the unperturbed dynamics. We show that appropriate choices of the perturbations can lead to samplers that have improved properties, at least in terms of reducing the asymptotic variance. We present a detailed analysis of the new Langevin sampler for Gaussian target distributions. Our theoretical results are supported by numerical experiments with non-Gaussian target measures.
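A discretization of the standard (unperturbed) underdamped Langevin dynamics for a Gaussian target can be sketched as follows; the perturbed samplers analysed in the paper are not reproduced here, and the step size and friction are illustrative choices.

```python
import math
import random

random.seed(6)

# Underdamped Langevin dynamics for the quadratic potential U(q) = q**2 / 2:
#   dq = p dt,  dp = (-q - gamma * p) dt + sqrt(2 * gamma) dW,
# whose invariant measure has q ~ N(0, 1). A small step keeps the
# discretization bias modest.
gamma, dt, n_steps = 1.0, 0.01, 200_000
q, p = 0.0, 0.0
qs = []
for i in range(n_steps):
    q += p * dt
    p += (-q - gamma * p) * dt + math.sqrt(2.0 * gamma * dt) * random.gauss(0.0, 1.0)
    if i % 5 == 0:          # thin the chain to reduce autocorrelation
        qs.append(q)

mean_q = sum(qs) / len(qs)
var_q = sum(x * x for x in qs) / len(qs) - mean_q ** 2
```

The paper's perturbations would modify the drift while leaving this invariant measure unchanged, with the aim of reducing the asymptotic variance of averages like `mean_q`.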
THE PROBABILITY DENSITY LOCATION OF LIGHT AIRCRAFT WITHIN AIRPORT AND THEIR DISTRIBUTION FROM RUNWAY
Directory of Open Access Journals (Sweden)
І. Государська
2011-04-01
Full Text Available Accident locations of light aircraft with take-off weight below 4 tonnes were collected. Statistical data were taken from the ADREP (ICAO), NTSB, ALPA, Airclaims, CAA-UK, ASRS and AIDS databases for a 30-year period. The dynamics of the accident rate are presented for light aircraft in the republics party to the Agreement on civil aviation (including Ukraine) for the period from 1982 to 2009. Distributions of the location of light aircraft accidents relative to the runway end were determined. Bivariate and three-dimensional distributions of light aircraft accident locations within airports were found
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available To produce probability distributions for regional climate change in surface temperature and precipitation, a probability distribution for global mean temperature increase has been combined with the probability distributions for the appropriate scaling variables, i.e. the changes in regional temperature/precipitation per degree global mean warming. Each scaling variable is assumed to be normally distributed. The uncertainty of the scaling relationship arises from systematic differences between the regional changes from global and regional climate model simulations and from natural variability. The contributions of these sources of uncertainty to the total variance of the scaling variable are estimated from simulated temperature and precipitation data in a suite of regional climate model experiments conducted within the framework of the EU-funded project PRUDENCE, using an Analysis Of Variance (ANOVA. For the area covered in the 2001–2004 EU-funded project SWURVE, five case study regions (CSRs are considered: NW England, the Rhine basin, Iberia, Jura lakes (Switzerland and Mauvoisin dam (Switzerland. The resulting regional climate changes for 2070–2099 vary quite significantly between CSRs, between seasons and between meteorological variables. For all CSRs, the expected warming in summer is higher than that expected for the other seasons. This summer warming is accompanied by a large decrease in precipitation. The uncertainty of the scaling ratios for temperature and precipitation is relatively large in summer because of the differences between regional climate models. Differences between the spatial climate-change patterns of global climate model simulations make significant contributions to the uncertainty of the scaling ratio for temperature. However, no meaningful contribution could be found for the scaling ratio for precipitation due to the small number of global climate models in the PRUDENCE project and natural variability, which is
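The pattern-scaling combination described above — a global-mean warming distribution multiplied by a normally distributed regional scaling variable — can be sketched by Monte Carlo. All numbers are illustrative, not PRUDENCE/SWURVE results.

```python
import random

random.seed(7)

# Global-mean warming and a regional scaling variable, both normal
# (hypothetical means and spreads, in K and K per K of global warming)
n = 100_000
global_dT = [random.gauss(3.0, 0.7) for _ in range(n)]
scaling = [random.gauss(1.3, 0.25) for _ in range(n)]

# Regional change = global change * scaling; the product of two normals is
# not normal, so Monte Carlo is a simple way to get its quantiles
regional_dT = [g * s for g, s in zip(global_dT, scaling)]
mean_regional = sum(regional_dT) / n
p95 = sorted(regional_dT)[int(0.95 * n)]
```

The resulting distribution directly provides the probabilistic regional projections (for example, the 95th percentile of summer warming) that the abstract describes.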
Interaction of surface cracks subjected to non-uniform distributions of stress
Coules, H.E.
2017-01-01
Closely-spaced surface cracks in structures interact with each other when subjected to load. The degree of interaction depends strongly on the distribution of stress that is applied. In pressure boundary components, thermal shock, residual stress and global bending can all cause load distributions that are non-uniform through the wall thickness. A wide range of crack pairs subject to various non-uniform stress distributions have been modelled using finite element analysis. Cracks sometimes in...
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining a prioritization scheme for preventive maintenance strategies, based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novelty issues involve, among others, considering sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements.
Lodder, W J; Schijven, J F; Rutjes, S A; de Roda Husman, A M; Teunis, P F M
2015-05-15
Numerous studies have reported quantitative data on viruses in surface waters generated using different methodologies. In the current study, the impact of the use of either cell culture-based or molecular-based methods in quantitative microbial risk assessment was assessed. Previously and newly generated data on the presence of infectious human enteroviruses (HEV) and enterovirus and parechovirus RNA were used to estimate distributions of virus concentrations in surface waters. Because techniques for the detection of infectious human parechoviruses (HPeV) in surface waters were not available, a 'Parallelogram Approach' was used to estimate their concentrations based on the ratio of infectious HEV to HEV RNA. The obtained virus concentrations were then used to estimate the probability of exposure for children during recreation in such virus-contaminated surface waters. Human enterovirus cell culture/PCR ratios ranged from 2.3 × 10⁻³ to 0.28. This broad range of ratios indicates that care should be taken in assuming a fixed ratio for assessing the risk with PCR-based virus concentrations. The probabilities of exposure to both enteroviruses and parechoviruses were calculated, using our Parallelogram Approach for the calculation of infectious parechoviruses. For both viruses it was observed that the detection method significantly influenced the probability of exposure. Based on the calculated culture data, PCR data, and the ingestion volume, it was estimated that the mean probabilities of exposure, of recreating children, to surface water containing viruses were 0.087 (infectious enteroviruses), 0.71 (enterovirus particles), 0.28 (parechovirus particles) and 0.025 (calculated infectious parechoviruses) per recreation event. The mean probabilities of exposure of children recreating in surface water from which drinking water is produced to infectious enteroviruses were estimated for nine locations and varied between 1.5 × 10⁻⁴ and 0.09 per recreation event. In this study
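A minimal sketch of the ratio-based 'Parallelogram Approach' and the resulting exposure probability, assuming a Poisson single-hit exposure model. Every concentration and ingestion volume below is a made-up placeholder, not a value from the study:

```python
import math

# Invented illustrative concentrations (per litre of surface water):
hev_infectious = 0.02   # infectious HEV, culture-based
hev_rna = 0.40          # HEV RNA, PCR-based
hpev_rna = 0.25         # HPeV RNA, PCR-based

ratio = hev_infectious / hev_rna         # culture/PCR ratio for HEV
hpev_infectious = hpev_rna * ratio       # 'parallelogram' estimate for HPeV

volume = 0.05                            # litres ingested per recreation event (invented)
dose = hpev_infectious * volume          # expected number of infectious viruses ingested
p_exposure = 1.0 - math.exp(-dose)       # P(ingest at least one infectious virus)
print(p_exposure)
```

The Poisson exposure model is a common assumption in quantitative microbial risk assessment; the abstract itself does not specify the dose model used.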
Wei, Xiaoyan; Chen, Baoguo; Liang, Lijuan; Dunlap, Susan
2015-01-01
Three experiments were conducted to investigate the distributive effect when producing subject-verb agreement in English as a second language (L2) when the participant's first language either does or does not require subject-verb agreement. Both Chinese-English and Uygur-English bilinguals were included in Experiment 1. Chinese has no required subject-verb agreement, whereas Uygur does. Results showed that the distributive effect was observed in Uygur-English bilinguals but not in Chinese-English bilinguals, indicating that this particular first language (L1) syntactic feature is one significant factor affecting the distributive effect in the production of subject-verb agreement in L2. Experiment 2 further investigated the matter by choosing Chinese-English participants with higher L2 proficiency. Still, no distributive effect was observed, suggesting that the absence of distributive effect in Chinese-English bilinguals in Experiment 1 was not due to low proficiency in the target language. Experiment 3 changed the way the stimuli were presented, highlighting the singular or distributive nature of the subject noun phrases, and the distributive effect was observed in Chinese-English bilinguals. Altogether, the results show that the L1 syntactic feature of subject-verb agreement is one significant factor affecting the distributive effect in the production of subject-verb agreement in L2. More specifically, distributive effects rarely occur in L2 when L1 has no requirement on subject-verb agreement, whereas distributive effects are more likely to occur in L2 when the L1 also has required subject-verb agreement.
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Directory of Open Access Journals (Sweden)
Haokun Jin
2017-05-01
Stochastic distribution control (SDC) systems are a group of systems in which the output considered is the measured probability density function (PDF) of the system output whilst subject to a normal crisp input. The purpose of active fault tolerant control of such systems is to use the fault estimation information and other measured information to make the output PDF still track the given distribution when the objective PDF is known. However, if the target PDF is unavailable, the PDF tracking operation will be impossible. Minimum entropy control of the system output can be considered as an alternative strategy. The mean represents the center location of the stochastic variable, and it is reasonable that the minimum entropy fault tolerant controller be designed subject to a mean constraint. In this paper, using the rational square-root B-spline model for the shape control of the system output probability density function (PDF), a nonlinear adaptive observer based fault diagnosis algorithm is proposed to diagnose the fault. Through controller reconfiguration, the system entropy subject to the mean restriction can still be minimized when a fault occurs. An illustrative example is utilized to demonstrate the use of the minimum entropy fault tolerant control algorithms.
DEFF Research Database (Denmark)
Andersen, Birgit; Felding, Ulrik Ascanius; Krarup, Christian
2012-01-01
Triple stimulation technique (TST) has previously shown that transcranial magnetic stimulation (TMS) fails to activate a proportion of spinal motoneurons (MNs) during motor fatigue. The TST response depression without attenuation of the conventional motor evoked potential suggested an increased probability of repetitive spinal MN activation during exercise, even if some MNs failed to discharge in response to the brain stimulus. Here we used a modified TST (quadruple stimulation, QuadS, and quintuple stimulation, QuintS) to examine the influence of fatiguing exercise on second and third MN discharges after … the muscle is fatigued. Repetitive MN firing may provide an adaptive mechanism to maintain motor unit activation and task performance during sustained voluntary activity.
Jacobsen, J L; Saleur, H
2008-02-29
We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L > 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy, S_VB = N_c ln 2 ≃ (4 ln 2/π²) ln L, disproving a recent conjecture that this should be related to the von Neumann entropy and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.
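A quick numerical check of the two asymptotic prefactors contrasted in the abstract, the exact 4 ln 2/π² versus the conjectured 1/3:

```python
import math

# Prefactor of the valence-bond entanglement entropy, S_VB ~ (4 ln 2 / pi^2) ln L,
# compared with the conjectured von Neumann prefactor 1/3.
exact = 4 * math.log(2) / math.pi ** 2
conjectured = 1 / 3
print(round(exact, 4), round(conjectured, 4))  # the two coefficients differ
```

The numerical gap between the coefficients (about 0.281 versus 0.333) is what rules out the conjectured identification with the von Neumann entropy.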
Spiesberger, John L
2013-02-01
The hypothesis tested is that internal gravity waves limit the coherent integration time of sound at 1346 km in the Pacific Ocean at 133 Hz and a pulse resolution of 0.06 s. Six months of continuous transmissions at about 18 min intervals are examined. The source and receiver are mounted on the bottom of the ocean with timing governed by atomic clocks. Measured variability is only due to fluctuations in the ocean. A model for the propagation of sound through fluctuating internal waves is run without any tuning with data. Excellent resemblance is found between the model's and the data's probability distributions of integration time up to five hours.
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Ben Issaid, Chaouki
2017-07-28
When assessing the performance of free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
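Importance sampling with exponential twisting can be illustrated on a much simpler rare event than the FSO outage considered in the paper: for a standard Gaussian, twisting by θ amounts to a mean shift, with likelihood ratio exp(−θz + θ²/2). The toy estimate of P(Z > 5) below is only a sketch of the general technique, not the paper's estimator:

```python
import math
import random

random.seed(1)

theta = 5.0            # twist parameter chosen at the rare-event threshold
n = 200_000
acc = 0.0
for _ in range(n):
    z = random.gauss(theta, 1.0)                    # sample from the twisted law N(theta, 1)
    if z > 5.0:
        acc += math.exp(-theta * z + 0.5 * theta ** 2)  # reweight by the likelihood ratio
p_hat = acc / n

p_exact = 0.5 * math.erfc(5.0 / math.sqrt(2.0))     # exact Gaussian tail for comparison
print(p_hat, p_exact)  # both about 2.9e-7
```

A naive Monte Carlo estimate with the same n would almost surely return 0 here, which is precisely the motivation for twisting.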
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Santillán, David; Mosquera, Juan-Carlos; Cueto-Felgueroso, Luis
2017-11-01
Hydraulic fracture trajectories in rocks and other materials are highly affected by spatial heterogeneity in their mechanical properties. Understanding the complexity and structure of fluid-driven fractures and their deviation from the predictions of homogenized theories is a practical problem in engineering and geoscience. We conduct a Monte Carlo simulation study to characterize the influence of heterogeneous mechanical properties on the trajectories of hydraulic fractures propagating in elastic media. We generate a large number of random fields of mechanical properties and simulate pressure-driven fracture propagation using a phase-field model. We model the mechanical response of the material as that of an elastic isotropic material with heterogeneous Young modulus and Griffith energy release rate, assuming that fractures propagate in the toughness-dominated regime. Our study shows that the variance and the spatial covariance of the mechanical properties are controlling factors in the tortuousness of the fracture paths. We characterize the deviation of fracture paths from the homogeneous case statistically, and conclude that the maximum deviation grows linearly with the distance from the injection point. Additionally, fracture path deviations seem to be normally distributed, suggesting that fracture propagation in the toughness-dominated regime may be described as a random walk.
Nonuniversal power law scaling in the probability distribution of scientific citations.
Peterson, George J; Pressé, Steve; Dill, Ken A
2010-09-14
We develop a model for the distribution of scientific citations. The model involves a dual mechanism: in the direct mechanism, the author of a new paper finds an old paper A and cites it. In the indirect mechanism, the author of a new paper finds an old paper A only via the reference list of a newer intermediary paper B, which has previously cited A. By comparison to citation databases, we find that papers having few citations are cited mainly by the direct mechanism. Papers already having many citations ("classics") are cited mainly by the indirect mechanism. The indirect mechanism gives a power-law tail. The "tipping point" at which a paper becomes a classic is about 25 citations for papers published in the Institute for Scientific Information (ISI) Web of Science database in 1981, 31 for Physical Review D papers published from 1975-1994, and 37 for all publications from a list of high h-index chemists assembled in 2007. The power-law exponent is not universal. Individuals who are highly cited have a systematically smaller exponent than individuals who are less cited.
Wiegmann, Douglas A.
2005-01-01
The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potential hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested … judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.
Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili
2017-01-01
Carotid plaque echogenicity is associated with the risk of cardiovascular events. Gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for three types of plaques is 78.4% (kappa value, κ = 0.673), when the AUCPDC is used for classifier training, whereas GSM is 64.8% (κ = 0.460). The receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluation of plaque echogenicity and predicting cardiovascular events in patients with plaques.
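A hypothetical sketch of the AUCPDC feature as described: accumulate the gray-level histogram of the plaque region into a cumulative distribution and integrate it. Darker (echolucent) regions concentrate probability mass at low gray levels, so their cumulative curve rises early and the area under it is larger. The function name and the toy pixel data are illustrative, not the authors' implementation:

```python
def aucpdc(pixels, levels=256):
    """Area under the cumulative probability distribution of gray levels, in [0, 1]."""
    counts = [0] * levels
    for p in pixels:
        counts[p] += 1
    total = len(pixels)
    cum, area = 0, 0.0
    for c in counts:
        cum += c
        area += cum / total          # integrate the cumulative curve level by level
    return area / levels             # normalize by the gray-level range

dark = [20] * 100                    # echolucent-like region (invented)
bright = [220] * 100                 # echo-rich-like region (invented)
print(aucpdc(dark), aucpdc(bright))  # the dark region gives the larger value
```

Unlike the gray-scale median, this integral uses the whole pixel distribution, which is the property the abstract credits for the improved classification.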
Directory of Open Access Journals (Sweden)
Jesús Vega Encabo
2015-11-01
In this paper, I claim that subjectivity is a way of being that is constituted through a set of practices in which the self is subject to the dangers of fictionalizing and plotting her life and self-image. I examine some ways of becoming subject through narratives and through theatrical performance before others. Through these practices, a real and active subjectivity is revealed, capable of self-knowledge and self-transformation.
Representing Uncertainty by Probability and Possibility
DEFF Research Database (Denmark)
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain…
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables
Tran, Duong; Andrews, Mark; Ratilal, Purnima
2012-12-01
The probability distribution of ocean-acoustic broadband signal energy after saturated multipath propagation is derived using coherence theory. The frequency components obtained from Fourier decomposition of a broadband signal are each assumed to be fully saturated with energy spectral density that obey the exponential distribution with 5.6 dB standard deviation and unity scintillation index. When the signal bandwidth and measurement time are larger than the correlation bandwidth and correlation time, respectively, of its energy spectral density components, the broadband signal energy obtained by integrating the energy spectral density across the signal bandwidth then follows the Gamma distribution with a standard deviation smaller than 5.6 dB and a scintillation index less than unity. The theory is verified with broadband transmissions in the Gulf of Maine shallow water waveguide in the 300 to 1200 Hz frequency range. The standard deviations of received broadband signal energies range from 2.7 to 4.6 dB for effective bandwidths up to 42 Hz, while the standard deviations of individual energy spectral density components are roughly 5.6 dB. The energy spectral density correlation bandwidths of the received broadband signals are found to be larger for signals with higher center frequencies and are roughly 10% of each center frequency.
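The saturation statistics described above can be reproduced in miniature: summing k independent, exponentially distributed spectral components (scintillation index 1, roughly 5.6 dB standard deviation each) yields a Gamma-distributed broadband energy whose scintillation index drops to about 1/k. The value of k below is an arbitrary stand-in for the bandwidth-to-correlation-bandwidth ratio, not a value from the Gulf of Maine experiment:

```python
import random
import statistics

random.seed(2)

k = 8          # number of independent spectral components integrated (illustrative)
n = 50_000     # number of simulated broadband transmissions

# Broadband energy = sum of k exponential energy spectral density components.
energies = [sum(random.expovariate(1.0) for _ in range(k)) for _ in range(n)]

mean = statistics.fmean(energies)
var = statistics.pvariance(energies)
scintillation_index = var / mean ** 2
print(round(scintillation_index, 3))  # about 1/k
```

The reduction of the scintillation index below unity is exactly the signature the paper verifies for received broadband signal energies.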
Probability distribution functions
Sousa, Paulo Baltarejo; Ferreira, Luís Lino
2007-01-01
This technical report describes the PDFs which have been implemented to model the behaviours of certain parameters of the Repeater-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (RHW2PNetSim) and Bridge-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (BHW2PNetSim).
Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu
2013-01-04
Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
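A toy version of a binomial scoring function in the spirit of the model described (this is not ProVerB's actual score): if a random peak matches a theoretical fragment with probability p, the chance of k or more matches among n peaks by luck alone is a binomial tail, and −log₁₀ of that tail can serve as an identification score. All parameter values below are invented:

```python
import math

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): probability of k or more chance matches."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Invented example: 12 matched peaks out of 40, with a 5% chance-match probability.
tail = binom_tail(n=40, k=12, p=0.05)
score = -math.log10(tail)
print(round(score, 2))  # larger score = less likely to be a chance identification
```

Incorporating peak intensities, as the abstract emphasizes, would weight the matches rather than merely counting them; the sketch above only shows the counting backbone.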
Shabbir, A; Verdoolaege, G; Kardaun, O J W F; Noterdaeme, J M
2014-11-01
Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.
Directory of Open Access Journals (Sweden)
Chung-Ho Su
2010-12-01
To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize-entropy-principle approach and the cumulative-probability-distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life
IGM Constraints from the SDSS-III/BOSS DR9 Lyα Forest Transmission Probability Distribution Function
Lee, Khee-Gan; Hennawi, Joseph F.; Spergel, David N.; Weinberg, David H.; Hogg, David W.; Viel, Matteo; Bolton, James S.; Bailey, Stephen; Pieri, Matthew M.; Carithers, William; Schlegel, David J.; Lundgren, Britt; Palanque-Delabrouille, Nathalie; Suzuki, Nao; Schneider, Donald P.; Yèche, Christophe
2015-02-01
The Lyα forest transmission probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T₀, where T(Δ) = T₀ Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ≈ −2 is required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T₀ are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.
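The temperature-density relation quoted above, T(Δ) = T₀ Δ^(γ−1), is simple enough to state as code; γ = 1.6 is the best-fitting slope reported, while the value of T₀ here is an illustrative placeholder (the abstract notes that T₀ itself is not well constrained):

```python
def igm_temperature(delta, t0=15000.0, gamma=1.6):
    """IGM temperature (K) at overdensity delta, for the power-law relation T = T0 * delta**(gamma - 1)."""
    return t0 * delta ** (gamma - 1.0)

print(igm_temperature(1.0))   # at mean density (delta = 1), T = T0 by construction
print(igm_temperature(4.0))   # overdense gas is hotter whenever gamma > 1
```

An "inverted" relation (γ < 1) would make overdense gas cooler than the mean, which is the scenario the measurement disfavors at over 4σ.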
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
Chen, M.; Kumar, A.
2013-12-01
In prediction of atmospheric seasonal mean climate variability, the signal-to-noise ratio provides a classic measure of predictability. The signal component is the atmospheric response to the slowly evolving boundary conditions such as ENSO SST, or to the smaller spread from initial conditions for shorter lead forecasts in an initialized prediction system. The noise component results from the internally generated variability of atmospheric states around the ensemble mean. A high signal-to-noise ratio leads to high prediction skill and predictability. Statistically, the signal can be quantified by the mean shift of the atmospheric states from their climatology, the noise by the spread of the probability distribution function (PDF) of atmospheric variability, and the predictability by the relative displacement of the PDF for the atmospheric variable from its climatological distribution. Therefore, it is essential for understanding the predictability to know whether there is a change, and how significant it is, in the PDF spread of an atmospheric variable due to changes in external forcing (e.g., ENSO SST; CO2 etc.) through the years. These issues are the focus of this study. Specifically, by using 31 years (1982-2012) of seasonal hindcast data from NCEP Climate Forecast System version 2 (CFSv2), we analyzed the variations of the PDF spread for the seasonal variability of precipitation and near surface land temperature associated with changes in external forcing. The results include (1) its year to year variations for target seasons; (2) its seasonality and geographic dependence; and (3) its lead time dependence in the initialized prediction system.
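The signal, noise, and signal-to-noise measures described above can be sketched for a toy ensemble; the climatology and ensemble values below are invented for illustration:

```python
import statistics

climatology = 0.0                                  # climatological seasonal mean (invented)
ensemble = [1.1, 0.8, 1.4, 0.9, 1.3, 1.0, 1.2, 0.7]  # ensemble of seasonal forecasts (invented)

signal = statistics.fmean(ensemble) - climatology  # mean shift from climatology
noise = statistics.stdev(ensemble)                 # spread of the PDF around the ensemble mean
snr = signal / noise                               # signal-to-noise ratio
print(round(snr, 2))
```

A change in external forcing that widens the ensemble spread lowers this ratio even when the mean shift is unchanged, which is why the study focuses on the PDF spread itself.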
Fat distribution of overweight persons in relation to morbidity and subjective health
Seidell, J C; Bakx, J C; De Boer, E; Deurenberg, P.; Hautvast, J.G.A.J.
1985-01-01
The association between fat distribution, morbidity and subjective health was studied in 95 overweight adult men and 210 overweight adult women. Retrospective morbidity data were taken from a continuous morbidity registration made by general practitioners over a period of up to 17 years. In
Energy Technology Data Exchange (ETDEWEB)
Tierney, M.S.
1991-11-01
The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; these preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." Calculation of the WIPP system's overall probability distribution is illustrated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features.
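A scenario-based Monte Carlo estimate of an exceedance probability can be sketched as follows; the scenario probability and release distribution are hypothetical stand-ins, not WIPP values:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
p_intrusion = 0.1                      # hypothetical scenario probability
occurs = rng.random(n) < p_intrusion
# release magnitude given intrusion: lognormal; zero otherwise (illustrative)
release = np.where(occurs, rng.lognormal(0.0, 1.0, size=n), 0.0)

def ccdf(samples, r):
    """Complementary cumulative distribution: P(release > r)."""
    return float(np.mean(samples > r))

p_exceed = ccdf(release, 1.0)          # ~ p_intrusion * P(lognormal > 1) = 0.05
```

Evaluating `ccdf` over a grid of release levels yields the overall (complementary cumulative) probability distribution that the standard asks for.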
Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L
2013-02-01
The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality, as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status; the membership of water quality in the various output fuzzy sets or classes is reported with percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated to stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. Copyright © 2012 Elsevier Ltd. All rights reserved.
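The kernel-density Monte Carlo step can be sketched as below, assuming `scipy` is available; the monitoring data and the 4.0 mg/L threshold are synthetic placeholders, not Cauca River values:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# one year of monitoring data for a single variable (synthetic, mg/L)
observations = rng.normal(6.5, 1.2, size=200)

kde = gaussian_kde(observations)             # non-parametric fit to the histogram
mc_inputs = kde.resample(5000, seed=2)[0]    # Monte Carlo draws for the fuzzy index

# each draw would feed the fuzzy inference system; here we just summarise
p_below_4 = float(np.mean(mc_inputs < 4.0))
```

In the hybrid model, each resampled input vector is pushed through the fuzzy inference system, so the index itself acquires a distribution rather than a single value.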
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
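A minimal sketch of the standard-normal probabilities and confidence-interval arithmetic the series builds toward (the sample values are made up):

```python
import numpy as np
from scipy.stats import norm

# area under the standard normal within +/- 1.96 (the familiar 95%)
p_within = norm.cdf(1.96) - norm.cdf(-1.96)

# normal-approximation 95% confidence interval for a sample mean (toy data)
sample = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.2])
mean = sample.mean()
se = sample.std(ddof=1) / np.sqrt(len(sample))
ci = (mean - 1.96 * se, mean + 1.96 * se)
```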
Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K
2016-01-01
Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
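The fit-and-compare-by-AIC procedure can be sketched with `scipy.stats`; the data here are synthetic draws (chosen so the mean lands near 27 days), not the Tokyo case data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic incubation periods (days); mean exp(3.1 + 0.65**2 / 2) ~ 27 days
data = rng.lognormal(mean=3.1, sigma=0.65, size=98)

candidates = {
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(data, floc=0)        # MLE, location fixed at zero
    loglik = np.sum(dist.logpdf(data, *params))
    k = len(params) - 1                    # free parameters (loc was fixed)
    aic[name] = 2 * k - 2 * loglik

best = min(aic, key=aic.get)               # smallest AIC wins
```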
Chrystal, A.; Heikoop, J. M.; Davis, P.; Syme, J.; Hagerty, S.; Perkins, G.; Larson, T. E.; Longmire, P.; Fessenden, J. E.
2010-12-01
Elevated nitrate (NO3-) concentrations in drinking water pose a health risk to the public. The dual stable isotopic signatures of δ15N and δ18O in NO3- in surface- and groundwater are often used to identify and distinguish among sources of NO3- (e.g., sewage, fertilizer, atmospheric deposition). In oxic groundwaters where no denitrification is occurring, direct calculations of mixing fractions using a mass balance approach can be performed if three or fewer sources of NO3- are present, and if the stable isotope ratios of the source terms are defined. There are several limitations to this approach. First, direct calculations of mixing fractions are not possible when four or more NO3- sources may be present. Simple mixing calculations also rely upon treating source isotopic compositions as a single value; however these sources themselves exhibit ranges in stable isotope ratios. More information can be gained by using a probabilistic approach to account for the range and distribution of stable isotope ratios in each source. Fitting probability density functions (PDFs) to the isotopic compositions for each source term reveals that some values within a given isotopic range are more likely to occur than others. We compiled a data set of dual isotopes in NO3- sources by combining our measurements with data collected through extensive literature review. We fit each source term with a PDF, and show a new method to probabilistically solve multiple component mixing scenarios with source isotopic composition uncertainty. This method is based on a modified use of a tri-linear diagram. First, source term PDFs are sampled numerous times using a variation of stratified random sampling, Latin Hypercube Sampling. For each set of sampled source isotopic compositions, a reference point is generated close to the measured groundwater sample isotopic composition. This point is used as a vertex to form all possible triangles between all pairs of sampled source isotopic compositions
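The Latin Hypercube sampling of source-term PDFs can be sketched as follows; the source means and spreads are hypothetical illustrative values, not the compiled literature data, and normal PDFs are assumed for simplicity:

```python
import numpy as np
from scipy.stats import norm, qmc

# hypothetical (mean, sd) pairs for d15N and d18O of three NO3- sources
sources = {
    "sewage":      {"d15N": (10.0, 3.0), "d18O": (2.0, 2.0)},
    "fertilizer":  {"d15N": (0.0, 2.0),  "d18O": (0.0, 2.0)},
    "atmospheric": {"d15N": (2.0, 3.0),  "d18O": (60.0, 10.0)},
}

sampler = qmc.LatinHypercube(d=2, seed=0)
n = 1000
samples = {}
for name, pars in sources.items():
    u = sampler.random(n)                      # stratified uniforms in [0, 1)^2
    d15n = norm.ppf(u[:, 0], *pars["d15N"])    # inverse-CDF transform per axis
    d18o = norm.ppf(u[:, 1], *pars["d18O"])
    samples[name] = np.column_stack([d15n, d18o])
```

Each set of sampled source compositions would then define the triangle vertices used in the tri-linear mixing calculation.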
DEFF Research Database (Denmark)
Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik
2003-01-01
The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis...... in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity...... of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse....
Directory of Open Access Journals (Sweden)
Bogdan Ozga-Zielinski
2016-06-01
New hydrological insights for the region: The results indicated that the 2D normal probability distribution model gives a better probabilistic description of snowmelt floods characterized by the 2-dimensional random variable (Qmax,f, Vf) compared to the elliptical Gaussian copula and Archimedean 1-parameter Gumbel–Hougaard copula models, in particular from the viewpoint of probability of exceedance as well as complexity and time of computation. Nevertheless, the copula approach offers a new perspective in estimating the 2D probability distribution for multidimensional random variables. Results showed that the 2D model for snowmelt floods built using the Gumbel–Hougaard copula is much better than the model built using the Gaussian copula.
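The Gumbel–Hougaard copula itself is simple to write down; a minimal sketch of evaluating it and a joint exceedance probability follows (the marginal probabilities and theta are arbitrary examples, not fitted values):

```python
import numpy as np

def gumbel_hougaard_cdf(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1, theta = 1 is independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(X > x, Y > y) from marginal non-exceedance probs u = F(x), v = G(y)."""
    return 1.0 - u - v + gumbel_hougaard_cdf(u, v, theta)

# e.g. joint exceedance of the marginal 10-year peak flow and flood volume
p = joint_exceedance(0.9, 0.9, theta=2.0)
```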
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Energy Technology Data Exchange (ETDEWEB)
Hewson, Alex C [Department of Mathematics, Imperial College, London SW7 2AZ (United Kingdom); Bauer, Johannes [Max-Planck Institute for Solid State Research, Heisenbergstrasse 1, 70569 Stuttgart (Germany)
2010-03-24
We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density rho(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.
Soni, Sanjeev; Tyagi, Himanshu; Taylor, Robert A; Kumar, Amod
2014-07-01
This study investigates the effect of the distribution of nanoparticles delivered to a skin tumour on the thermal ablation conditions attained during thermal therapy. The ultimate aim is to define a distribution of nanoparticles, as well as a combination of other therapeutic parameters, to attain thermal ablation temperatures (50-60 °C) within the whole tumour region. Three different cases of nanoparticle distributions are analysed under controlled conditions for all other parameters, viz. irradiation intensity and duration, and volume fraction of nanoparticles. Results show that distribution of nanoparticles into only the periphery of the tumour resulted in the desired thermal ablation temperature in the whole tumour. For the tumour size considered in this study, an irradiation intensity of 1.25 W/cm(2) for a duration of 300 s and a nanoparticle volume fraction of 0.001% was optimal to attain a temperature of ≥53 °C within the whole tumour region. It is concluded that distribution of nanoparticles in the peripheral region of the tumour, along with a controlled combination of other parameters, seems favourable and provides a promising pathway for thermal ablation of a tumour subjected to nanoparticle-assisted thermal therapy. Copyright © 2014 Elsevier Ltd. All rights reserved.
Lifetime distributions from tracking individual BC3H1 cells subjected to yessotoxin
Directory of Open Access Journals (Sweden)
Monica Suarez Korsnes
2015-10-01
Full Text Available This work shows examples of lifetime distributions for individual BC3H1 cells after start of exposure to the marine toxin yessotoxin (YTX in an experimental dish. The present tracking of many single cells from time-lapse microscopy data demonstrates the complexity in individual cell fate, which can be masked in aggregate properties. This contribution also demonstrates the general practicality of cell tracking. It can serve as a conceptually simple and non-intrusive method for high throughput early analysis of cytotoxic effects, to assess early and late time points relevant for further analyses, or to assess variability and sub-populations of interest. The present examples of lifetime distributions seem partly to reflect different cell death modalities. Differences between cell lifetime distributions derived from populations in different experimental dishes can potentially provide measures of inter-cellular influence. Such outcomes may help to understand tumor-cell resistance to drug therapy and to predict the probability of metastasis.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Sivrikaya, A; Akil, M; Bicer, M; Kilic, M; Baltaci, A K; Mogulkoc, R
2013-01-01
The present study aims to explore how selenium supplementation affects the element distribution in the liver tissue of rats subjected to strenuous swimming exercise. Thirty-two Sprague-Dawley male rats were equally divided into four groups: Group 1, normal control group. Group 2, selenium-supplemented, non-swimming (0.6 mg/kg/day sodium selenite) group. Group 3, swimming, no supplementation group. Group 4, swimming, selenium-supplemented (0.6 mg/kg/day sodium selenite) group. After one month, the animals were decapitated and liver tissue samples were collected to determine the levels of lead, cobalt, boron, molybdenum, chromium, sulfur, magnesium, sodium, potassium, phosphorus, copper, iron, zinc and selenium. The chromium, molybdenum, iron, sodium and potassium values were higher in the swimming groups, relative to controls. Group 3 had significantly lower lead levels (p < 0.05). The highest selenium and zinc values were obtained in Group 2, and those of Group 4 were higher than in Groups 1 and 3. Group 1 had higher selenium and zinc levels than Group 3. The results of the present study demonstrated that selenium-supplemented rats subjected to strenuous swimming exercise had a distinct element distribution in liver tissue. Also, selenium supplementation offsets the decrease in zinc levels in rats subjected to vigorous swimming (Tab. 3, Ref. 20).
Mancini, Matteo; Giulietti, Giovanni; Dowell, Nicholas; Spanò, Barbara; Harrison, Neil; Bozzali, Marco; Cercignani, Mara
2017-09-14
Microstructural imaging and connectomics are two research areas that hold great potential for investigating brain structure and function. Combining these two approaches can lead to a better and more complete characterization of the brain as a network. The aim of this work is characterizing the connectome from a novel perspective using the myelination measure given by the g-ratio. The g-ratio is the ratio of the inner to the outer diameters of a myelinated axon, whose aggregated value can now be estimated in vivo using MRI. In two different datasets of healthy subjects, we reconstructed the structural connectome and then used the g-ratio estimated from diffusion and magnetization transfer data to characterize the network structure. Significant characteristics of g-ratio weighted graphs emerged. First, the g-ratio distribution across the edges of the graph did not show the power-law distribution observed using the number of streamlines as a weight. Second, connections involving regions related to motor and sensory functions were the highest in myelin content. We also observed significant differences in terms of the hub structure and the rich-club organization suggesting that connections involving hub regions present higher myelination than peripheral connections. Taken together, these findings offer a characterization of g-ratio distribution across the connectome in healthy subjects and lay the foundations for further investigating plasticity and pathology using a similar approach. Copyright © 2017. Published by Elsevier Inc.
Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley
2010-01-01
Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Directory of Open Access Journals (Sweden)
Luis Vicente Chamorro Marcillllo
2013-06-01
Full Text Available Engineering, within its academic and application forms, as well as any formal research work, requires the use of statistics, and every inferential statistical analysis requires the use of values of probability distribution functions that are generally available in tables. Generally, management of those tables poses physical problems (wasteful transport and wasteful consultation) and operational problems (incomplete lists and limited accuracy). The study, “Probability distribution function values in mobile phones”, permitted determining – through a needs survey applied to students involved in statistics studies at Universidad de Nariño – that the best known and most used values correspond to the Chi-Square, Binomial, Student’s t, and Standard Normal distributions. Similarly, it showed users’ interest in having the values in question within an alternative means to correct, at least in part, the problems presented by “the famous tables”. To try to contribute to the solution, we built software for mobile phones that immediately and dynamically provides the values of the most commonly used probability distribution functions.
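The table-replacement idea reduces to direct calls to distribution functions; a sketch with `scipy.stats` for the four distributions the survey identified (the arguments are ordinary textbook examples, not taken from the paper):

```python
from scipy import stats

# values usually read off printed tables, computed directly instead:
chi2_crit = stats.chi2.ppf(0.95, df=10)      # chi-square critical value, 10 df
binom_p = stats.binom.cdf(3, n=10, p=0.5)    # P(X <= 3) for X ~ Binomial(10, 0.5)
t_crit = stats.t.ppf(0.975, df=15)           # two-sided 5% Student's t, 15 df
z_tail = 1 - stats.norm.cdf(1.645)           # upper-tail standard normal area
```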
Dodds, W J; Stef, J J; Hogan, W J; Hoke, S E; Stewart, E T; Arndorfer, R C
1975-09-01
This study was designed to determine the radial profile of peristaltic pressure waves in the esophageal body of normal subjects and patients with esophageal diverticulum. We used a manometric assembly featuring four radial side-hole recording orifices oriented at equidistant 90 degree angles. Each recording catheter was infused with water at a rate (6.1 ml per min) which provided high fidelity pressure recording. In normal subjects, the radially recorded peristaltic pressure complexes were similar in peak amplitude and wave form. The range of pressure differences between the four radial recordings averaged 9.0 +/- 4 SD mm Hg, with a range of less than or equal to 25 mm Hg occurring in 99% of observations. These variations in pressure amplitude showed no consistent spatial orientation. In 5 of the 6 patients with esophageal diverticulum, the range of radial peristaltic pressure differences exceeded 25 mm Hg in the region of the diverticulum, the lowest pressure occurring at the recording orifice facing the diverticulum mouth. In occasional peristaltic sequences, abnormal wave forms featuring abrupt onsets or offsets were observed. These bizarre wave forms were probably caused by oral-aboral diverticulum movement relative to the recording sensor during peristalsis. Two patients had abnormally high peristaltic pressure amplitudes, greater than 250 mm Hg. This latter finding introduces the possibility that hypertensive peristaltic contractions may contribute to diverticulum production in some patients.
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables and the Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Perkins, Neil J; Schisterman, Enrique F; Vexler, Albert
2013-07-01
Biomarkers are of ever-increasing importance to clinical practice and epidemiologic research. Multiple biomarkers are often measured per patient. Measurement of true biomarker levels is limited by laboratory precision, specifically measuring relatively low, or high, biomarker levels resulting in undetectable levels below, or above, a limit of detection (LOD). Ignoring these missing observations or replacing them with a constant are methods commonly used although they have been shown to lead to biased estimates of several parameters of interest, including the area under the receiver operating characteristic (ROC) curve and regression coefficients. We developed asymptotically consistent, efficient estimators, via maximum likelihood techniques, for the mean vector and covariance matrix of multivariate normally distributed biomarkers affected by LOD. We also developed an approximation for the Fisher information and covariance matrix for our maximum likelihood estimates (MLEs). We apply these results to an ROC curve setting, generating an MLE for the area under the curve for the best linear combination of multiple biomarkers and an accompanying confidence interval. Point and confidence interval estimates are scrutinized by simulation study, with bias, root mean square error, and coverage probability displaying behavior consistent with MLEs. An example using three polychlorinated biphenyls to classify women with and without endometriosis illustrates how the underlying distribution of multiple biomarkers with LOD can be assessed and display increased discriminatory ability over naïve methods. Properly addressing LODs can lead to optimal biomarker combinations with increased discriminatory ability that may have been ignored because of measurement obstacles. Published by Elsevier Inc.
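A simplified univariate version of LOD-aware maximum likelihood can be sketched by adding a censoring term P(X &lt; LOD) to the likelihood; the multivariate mean-vector/covariance case treated in the paper is more involved, and all values below are synthetic:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(5)
true_mu, true_sd, lod = 1.0, 1.0, 0.5
x = rng.normal(true_mu, true_sd, size=500)
below = x < lod                    # observations lost below the limit of detection
obs = x[~below]

def neg_loglik(params):
    mu, log_sd = params
    sd = np.exp(log_sd)            # parameterise log(sd) to keep sd positive
    ll = stats.norm.logpdf(obs, mu, sd).sum()            # detected values
    ll += below.sum() * stats.norm.logcdf(lod, mu, sd)   # censored: P(X < LOD)
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sd_hat = res.x[0], float(np.exp(res.x[1]))
```

Unlike substituting a constant for non-detects, this estimator remains consistent as the censored fraction grows.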
Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with
Directory of Open Access Journals (Sweden)
Ching-Tai Chen
Full Text Available Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted
Imprecise Probability Methods for Weapons UQ
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Nitsche, Ludwig C.; Nitsche, Johannes M.; Brenner, Howard
1988-01-01
The sedimentation and diffusion of a nonneutrally buoyant Brownian particle in vertical fluid-filled cylinder of finite length which is instantaneously inverted at regular intervals are investigated analytically. A one-dimensional convective-diffusive equation is derived to describe the temporal and spatial evolution of the probability density; a periodicity condition is formulated; the applicability of Fredholm theory is established; and the parameter-space regions are determined within which the existence and uniqueness of solutions are guaranteed. Numerical results for sample problems are presented graphically and briefly characterized.
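The one-dimensional convective-diffusive equation for the probability density can be integrated with a simple explicit flux-form scheme; the parameters below are arbitrary illustrative choices, and the periodic inversion of the cylinder is omitted:

```python
import numpy as np

# dP/dt = D d2P/dx2 - U dP/dx on [0, 1], zero-flux (reflecting) walls;
# P is the probability density of the particle's vertical position
nx, D, U = 101, 1e-3, 0.05
dx = 1.0 / (nx - 1)
dt = 0.2 * dx**2 / D              # respect the explicit diffusive stability limit
P = np.zeros(nx)
P[nx // 2] = 1.0 / dx             # start localised at mid-column

for _ in range(2000):
    # face fluxes: diffusive (Fickian) plus convective (sedimentation drift)
    flux = -D * np.diff(P) / dx + U * 0.5 * (P[:-1] + P[1:])
    flux = np.concatenate(([0.0], flux, [0.0]))   # no flux through the walls
    P = P - dt * np.diff(flux) / dx

mass = P.sum() * dx               # total probability, conserved by the flux form
```

Alternating the sign of U at regular intervals would mimic the inversion cycle whose periodic solutions the paper analyses.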
Optimal Service Distribution in WSN Service System Subject to Data Security Constraints
Directory of Open Access Journals (Sweden)
Zhao Wu
2014-08-01
Full Text Available Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of WSA because these SNs can be of different and independent specifications. By the optimal service partition into the ASs and their distribution among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm of optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. The experimental analysis is presented to demonstrate the feasibility of the suggested algorithm.
Optimal service distribution in WSN service system subject to data security constraints.
Wu, Zhao; Xiong, Naixue; Huang, Yannong; Gu, Qiong
2014-08-04
Services composition technology provides a flexible approach to building Wireless Sensor Network (WSN) Service Applications (WSA) in a service oriented tasking system for WSN. Maintaining the data security of WSA is one of the most important goals in sensor network research. In this paper, we consider a WSN service oriented tasking system in which the WSN Services Broker (WSB), as the resource management center, can map the service request from user into a set of atom-services (AS) and send them to some independent sensor nodes (SN) for parallel execution. The distribution of ASs among these SNs affects the data security as well as the reliability and performance of WSA because these SNs can be of different and independent specifications. By the optimal service partition into the ASs and their distribution among SNs, the WSB can provide the maximum possible service reliability and/or expected performance subject to data security constraints. This paper proposes an algorithm of optimal service partition and distribution based on the universal generating function (UGF) and the genetic algorithm (GA) approach. The experimental analysis is presented to demonstrate the feasibility of the suggested algorithm.
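The universal generating function at the heart of the proposed algorithm can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: the node rates, failure probabilities, and demand level below are hypothetical, and the sketch shows only the UGF composition step that the GA would invoke when evaluating a candidate AS-to-SN distribution.

```python
from itertools import product

# A u-function maps a performance value to its probability.
def compose(u1, u2, op):
    """UGF composition: combine two performance distributions with `op`."""
    out = {}
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

# Two sensor nodes executing atom-services in parallel: each either
# delivers its processing rate or fails (rate 0). Values are hypothetical.
sn1 = {4.0: 0.9, 0.0: 0.1}   # rate 4 with prob 0.9, failed with prob 0.1
sn2 = {3.0: 0.8, 0.0: 0.2}

# Parallel ASs: total rate is the sum of the surviving nodes' rates.
system = compose(sn1, sn2, lambda a, b: a + b)

# Service reliability: probability that total rate meets a demand of 5.
reliability = sum(p for g, p in system.items() if g >= 5.0)
print(system, reliability)
```

The same `compose` call with other operators (e.g. `min` for a series stage) lets the GA score arbitrary partition/distribution candidates without enumerating system states by hand.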
Directory of Open Access Journals (Sweden)
Laktineh Imad
2010-04-01
This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
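A crude simulated stand-in for such simultaneous intervals (an assumption-laden sketch, not the authors' exact construction) calibrates pointwise order-statistic intervals until whole simulated standard normal samples fall entirely inside them with probability about 1-α:

```python
import numpy as np

rng = np.random.default_rng(0)

def simultaneous_envelope(n, alpha=0.05, n_sim=5000):
    """Per-order-statistic intervals for a standardized normal probability
    plot, calibrated by simulation so that a standard normal sample of
    size n falls entirely inside them with probability ~ 1 - alpha."""
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    lo_b, hi_b = alpha / n, alpha        # bracket for the pointwise level beta
    for _ in range(30):                  # bisect on beta
        beta = 0.5 * (lo_b + hi_b)
        lower = np.quantile(sims, beta / 2, axis=0)
        upper = np.quantile(sims, 1 - beta / 2, axis=0)
        inside = np.all((sims >= lower) & (sims <= upper), axis=1).mean()
        if inside > 1 - alpha:
            lo_b = beta                  # envelope too wide: narrow it
        else:
            hi_b = beta
    return lower, upper

lower, upper = simultaneous_envelope(n=50)
sample = np.sort(rng.standard_normal(50))
print("sample entirely inside envelope:", bool(np.all((sample >= lower) & (sample <= upper))))
```

The objective rule is then exactly the one described in the abstract: accept normality if and only if every sorted (standardized) observation lies inside its interval.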
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain interest and joy in learning the subject.
Directory of Open Access Journals (Sweden)
Dolićanin-Đekić Diana
2013-01-01
This paper investigates the possibility of distinguishing between the effects of radiation coming from two or more radioactive isotopes, by using methods of statistical mathematics. The procedure uses a mixed distribution of an additive type. Mathematical treatment is demonstrated herein on an example of analysis of composite radiation from two radioactive sources.
Directory of Open Access Journals (Sweden)
J.W. Love
2017-04-01
Where FEC data were obtained with less sensitive counting techniques (i.e. McMaster 30 or 15 epg), zero-inflated distributions and their associated central tendency were the most appropriate and would be recommended to use, i.e. the arithmetic group mean divided by the proportion of non-zero counts present; otherwise apparent anthelmintic efficacy could be misrepresented.
DEFF Research Database (Denmark)
Dimitrov, Nikolay Krasimirov
2016-01-01
extrapolation techniques: the Weibull, Gumbel and Pareto distributions and a double-exponential asymptotic extreme value function based on the ACER method. For the successful implementation of a fully automated extrapolation process, we have developed a procedure for automatic identification of tail threshold...
Bhattacharyya, Pratip; Chakrabarti, Bikas K.
2008-01-01
We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
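The quantity can be checked numerically. The sketch below (an illustration, not the authors' derivation) estimates ⟨r_n⟩ by Monte Carlo in a large box and compares it with the standard Poisson-process formula ⟨r_n⟩ = Γ(n + 1/D) / [Γ(n) (ρ c_D)^{1/D}], where c_D = π^{D/2}/Γ(D/2 + 1) is the volume of the unit D-ball:

```python
import numpy as np
from math import gamma, pi

rng = np.random.default_rng(42)

def mean_nth_neighbour_distance(n, D=2, density=1.0, trials=4000, box=20.0):
    """Monte Carlo estimate of <r_n>: mean distance from the centre of a
    D-dimensional box to its n-th nearest point, with points thrown
    uniformly at the given density (box large enough that edge effects
    at the centre are negligible)."""
    n_points = int(density * box**D)
    centre = np.full(D, box / 2.0)
    dists = []
    for _ in range(trials):
        pts = rng.uniform(0.0, box, size=(n_points, D))
        r = np.sort(np.linalg.norm(pts - centre, axis=1))
        dists.append(r[n - 1])
    return float(np.mean(dists))

def exact_poisson(n, D=2, density=1.0):
    """<r_n> for a Poisson point process of the given density in D dims."""
    c_D = pi ** (D / 2) / gamma(D / 2 + 1)
    return gamma(n + 1 / D) / (gamma(n) * (c_D * density) ** (1 / D))

est = mean_nth_neighbour_distance(n=1)
print(est, exact_poisson(1))   # in 2-D at unit density the exact value is 0.5
```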
Laze, Kuenda
2016-08-01
Modelling of land use may be improved by incorporating the results of species distribution modelling, and species distribution modelling may in turn be upgraded if a process-based variable such as forest cover change or the accessibility of forest from human settlements is included. This work presents the results of spatially explicit analyses of the changes in forest cover from 2000 to 2007 using the method of Geographically Weighted Regression (GWR), and of the species distributions for the protected species Lynx lynx martinoi and Ursus arctos using Generalized Linear Models (GLMs). The methodological approach is to search separately for a parsimonious model of forest cover change and of species distribution for the entire territory of Albania. The findings of this work show that modelling of land change and of species distribution is indeed improved, as indicated by better corrected Akaike Information Criterion scores in model selection. These results provide evidence of the effects of process-based variables on species distribution modelling and on the performance of species distribution modelling, and give an example of the incorporation of estimated probabilities of species occurrence in land change modelling.
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are widely used in the logistics and scheduling literature, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the provision of a solution methodology to compute an input schedule that yields the minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. This dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used for single-source pipeline problems is introduced; it can also be implemented in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than optimal values and
Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.
1982-01-01
A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluations of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.
Ben Issaid, Chaouki
2016-06-01
The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter, essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a desirable property characterizing the robustness of importance sampling schemes, for both identically and non-identically independent distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
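The core idea of such a left-tail importance sampling scheme can be sketched as follows. This is a simplified stand-in, not the paper's exact estimator: the shape parameters, the threshold, and the proposal (the same Gamma factors with their scale shrunk by a factor s, a simple form of mean shifting) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Estimate P(sum_{i=1..L} A_i*B_i < gamma_th), where each Gamma-Gamma
# variate is A*B with A ~ Gamma(a, scale=1/a), B ~ Gamma(b, scale=1/b).
L, a, b = 2, 2.0, 2.0
gamma_th = 0.1        # left-tail threshold (hypothetical operating point)
N = 200_000
s = 0.3               # mean-shift factor: proposal scales shrunk by s < 1

def log_w(x, shape, scale_t, scale_p):
    """Log likelihood ratio target/proposal for one Gamma variate
    under a change of scale parameter."""
    return shape * np.log(scale_p / scale_t) - x / scale_t + x / scale_p

def is_estimate():
    # proposal: A' ~ Gamma(a, scale=s/a), B' ~ Gamma(b, scale=s/b)
    A = rng.gamma(a, s / a, size=(N, L))
    B = rng.gamma(b, s / b, size=(N, L))
    lw = (log_w(A, a, 1 / a, s / a) + log_w(B, b, 1 / b, s / b)).sum(axis=1)
    hit = (A * B).sum(axis=1) < gamma_th
    return float(np.mean(np.exp(lw) * hit))

def naive_estimate():
    A = rng.gamma(a, 1 / a, size=(N, L))
    B = rng.gamma(b, 1 / b, size=(N, L))
    return float(np.mean((A * B).sum(axis=1) < gamma_th))

p_is = is_estimate()
p_mc = naive_estimate()
print(p_is, p_mc)
```

The proposal concentrates samples in the rare left-tail region, and the likelihood-ratio weights undo the change of measure; as the abstract notes, naive Monte Carlo becomes useless once the target probability is far below 1/N.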
Mobility power flow analysis of an L-shaped plate structure subjected to distributed loading
Cuschieri, J. M.; Cimmerman, B.
1990-01-01
An analytical investigation based on the Mobility Power Flow (MPF) method is presented for the determination of the vibrational response and power flow for two coupled flat plate structures in an L-shaped configuration, subjected to distributed excitation. The principle of the MPF method consists of dividing the global structure into a series of subsystems coupled together using mobility functions. Each separate subsystem is analyzed independently to determine the structural mobility functions for the junction and excitation locations. The mobility functions, together with the characteristics of the junction between the subsystems, are then used to determine the response of the global structure and the MPF. In the coupled plate structure considered here, MPF expressions are derived for distributed mechanical excitation that is independent of the structure's response. With some modifications, however, a similar approach can treat excitation by an acoustic plane wave; these modifications are necessary because the forces (acoustic pressure) acting on the structure then depend on the response of the structure, owing to the presence of the scattered pressure.
Directory of Open Access Journals (Sweden)
Frolov Aleksey
2017-01-01
The creation of an innovation environment is examined in the context of the interaction of economic agents in the creation and consumption of innovative value, based on an infrastructure approach. The problem of the complexity of collecting heterogeneous data on the formation and distribution of innovative value, given the dynamic nature of the research object and its environment, is formulated. An information model providing a subject-independent representation of data on innovation value flows is proposed, which makes it possible to automate the processes of data collection and analysis while minimizing time costs. The article was prepared in the course of research work within the framework of the project part of the state task in the field of scientific activity, in accordance with assignment 26.2758.2017/4.6 for 2017-2019, on the topic “System for analyzing the formation and distribution of the value of innovative products based on the infrastructure concept”.
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
Directory of Open Access Journals (Sweden)
Simon van Mourik
2014-06-01
Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample, is supplied as Supplemental Information.
van Mourik, Simon; Ter Braak, Cajo; Stigter, Hans; Molenaar, Jaap
2014-01-01
Multi-parameter models in systems biology are typically 'sloppy': some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample, is supplied as Supplemental Information.
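As a minimal sketch of this workflow (not the authors' Matlab code; the toy exponential-decay model, the data, and all parameter values below are invented for illustration), a random-walk Metropolis sampler can supply the parameter ensemble, which is then pushed through the model to obtain a per-prediction uncertainty band:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical toy model y = a*exp(-b*t) with Gaussian measurement noise.
t = np.linspace(0, 4, 25)
a_true, b_true, sigma = 2.0, 0.7, 0.1
y_obs = a_true * np.exp(-b_true * t) + sigma * rng.normal(size=t.size)

def log_post(theta):
    """Log posterior with a flat prior on the positive quadrant."""
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf
    resid = y_obs - a * np.exp(-b * t)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis: a simple stand-in for the DE-MC sampler.
chain = np.empty((20_000, 2))
theta = np.array([1.0, 1.0])
lp = log_post(theta)
for i in range(chain.shape[0]):
    prop = theta + 0.05 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain[i] = theta
sample = chain[5_000::10]          # discard burn-in, thin the chain

# Per-prediction uncertainty: push the whole ensemble through the model.
pred = sample[:, 0] * np.exp(-sample[:, 1] * 6.0)   # prediction at t* = 6
lo, hi = np.percentile(pred, [2.5, 97.5])
print(f"95% prediction band at t*=6: [{lo:.3f}, {hi:.3f}]")
```

The key point of the abstract is visible here: the width of the band depends on which prediction is asked for (which t*), so uncertainty must be assessed per prediction rather than read off from parameter uncertainties alone.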
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Gellert, A R; Lewis, C A; Langford, J A; Tolfree, S E; Rudd, R M
1985-10-01
The overall and regional clearance of an inhaled isotope labelled solute from the lungs was examined on the basis of a 15 minute period of data collection, for which a technique was developed that does not require intravenous injection to correct for blood-tissue background activity. The technique was applied to 52 normal subjects (31 non-smokers and 21 smokers) and to 37 patients with asbestosis (21 non-smokers and 16 smokers). In normal smokers solute clearance was faster in the upper and middle zones, with a mean ratio of T1/2 LB (half time solute clearance from lungs to blood) in the upper two thirds to the lower one third of the lungs of 0.66 (0.28-1.33), compared with 1.24 (0.43-2.77) in normal non-smokers (p less than 0.002). In patients with asbestosis solute clearance was accelerated throughout the lungs even though radiographic abnormalities were limited to lower or lower to middle zones. Regional distribution of clearance was not affected by posture in normal subjects. Overall solute clearance was significantly faster in normal current smokers and in patients with asbestosis than in normal non-smokers (p less than 0.001 for each comparison). Among patients with asbestosis, smokers had faster overall clearance than non-smokers (p less than 0.01). Among normal non-smokers T1/2 LB was not significantly different between those who had never smoked and ex-smokers. Regional abnormalities in pulmonary epithelial permeability may offer insight into the pathogenesis of interstitial lung diseases and smoking related disorders.
Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.
2016-01-01
Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to more than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impact of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, where deviations increase up to 50% of estimated risk levels.
Directory of Open Access Journals (Sweden)
John T. Rees
2000-12-01
A pandeid hydrozoan new to California, Amphinema sp., was collected in 1998 as a hydroid living on the non-indigenous bryozoan, Watersipora subtorquata, attached to floats in Bodega Harbor, 80 km north of San Francisco Bay. The hydroid was cultured in the laboratory and the medusae it released were raised to maturity. No species name could be assigned because, although the hydroid colony structure and morphology of the polyp most closely resemble descriptions of Amphinema rugosum, the immature and adult medusae best resemble A. dinema. These two described species are known from widely spaced locations worldwide, including Europe (the British Isles and the Mediterranean), New England, the Caribbean, east Africa, India, Japan and China, implying that they may be transported easily between sites by human activities. Such widespread distributions of both species, coupled with the notable absence of Amphinema sp. from Bodega Harbor during a number of previous field surveys in the 1970s, strongly suggest that Amphinema sp. has been introduced from elsewhere into Bodega Harbor during the past 25 years. Two additional species of Amphinema medusae present on the west coast of North America are discussed.
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
Directory of Open Access Journals (Sweden)
Gian Paolo Beretta
2008-08-01
A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager reciprocity theorem and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
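As a sketch in standard notation (not necessarily the paper's exact form), a steepest-entropy-ascent rate equation for a discrete distribution $p_j$ subject to linear constraints $\langle C_k\rangle=\sum_j c_k^j\,p_j$ is often written

```latex
\frac{dp_j}{dt} \;=\; \frac{1}{\tau}\, p_j\!\left(-\ln p_j \;-\; \sum_k \lambda_k\, c_k^j\right),
```

where $\tau$ sets the relaxation time and the multipliers $\lambda_k$ (including one for normalization, with $c_0^j \equiv 1$) are determined at each instant by requiring $d\langle C_k\rangle/dt = 0$. The entropy $S = -\sum_j p_j \ln p_j$ then increases monotonically along the trajectory until the maximum entropy distribution compatible with the constraints is reached.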
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Ben Issaid, Chaouki
2017-01-26
The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter, essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose robust importance sampling schemes that efficiently evaluate the outage probability of diversity receivers over Gamma-Gamma fading channels. The proposed estimators satisfy the well-known bounded relative error criterion for both maximum ratio combining and equal gain combining cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
Introduction to probability with statistical applications
Schay, Géza
2016-01-01
Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
Ozgul, Guler; Seyhan, Ekrem Cengiz; Özgül, Mehmet Akif; Günlüoğlu, Mehmet Zeki
2017-03-01
Chronic obstructive pulmonary disease (COPD) increases the risk of cardiovascular disease (CVD). Red blood cell distribution width (RDW) is accepted as a powerful predictor of outcomes in patients with CVD. We aimed to study RDW in patients with COPD and to compare this measurement with clinical, echocardiographic, nutritional and laboratory status; secondly, we aimed to determine the effect of smoking on RDW values in healthy subjects. One hundred and seventy-five patients with stable COPD and 210 healthy controls were enrolled in the study. Demographic, clinical, nutritional, echocardiographic and laboratory characteristics, including RDW values, were recorded and compared. RDW values were higher in the COPD group than in controls (15±2.3% vs. 13.8±2.5%, p<0.001). In COPD patients, RDW levels positively correlated with CRP levels (r=0.27, P<.001), albumin levels (r=0.23, P=.04), right ventricular dysfunction (RVD) (r=0.24, P=.001), pulmonary hypertension (PAH) (r=0.1, P=.02), and the presence of CVD (r=0.24, P=.02). Multivariable logistic regression suggested that the presence of CVD (4.3; 95% CI: 1.3 to 11; P=.01) and the presence of RVD (3.1; 95% CI: 1.7 to 8.3; P=.02) were independently related to elevated RDW levels in COPD patients. In the healthy population, correlation analysis showed only a significant correlation between RDW and cigarette smoking years (r=0.57, P<.001). RDW is independently associated with CVD and RVD in patients with COPD. In the healthy population, RDW is also associated with smoking status. Copyright © 2016 SEPAR. Publicado por Elsevier España, S.L.U. All rights reserved.
Pérez-Sánchez, Julio; Senent-Aparicio, Javier
2017-08-01
Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin, located in eastern Spain and marked by the strong seasonality of these latitudes. A daily precipitation data set was utilized for 29 weather stations during a period of 20 years (1993-2013). Furthermore, four sets of dry spell lengths (complete series, monthly maxima, seasonal maxima, and annual maxima) are used and fitted, for all the weather stations, with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good fit for all the weather stations, with the Wakeby distribution giving the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probability for return periods of 2, 5, 10, and 25 years reveal a northeast-southeast gradient, with lengthening periods of rainfall below 0.1 mm in the eastern third of the basin, in the proximity of the Mediterranean slope.
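A sketch of this fit-and-rank procedure with scipy (the annual-maximum series is synthetic; Wakeby is not available in scipy.stats, so a few of the other candidate families stand in, and note that K-S p-values computed with fitted parameters are known to be optimistic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical annual-maximum dry spell lengths (days) for one station;
# a real series would come from the daily precipitation records.
annual_max = rng.gumbel(loc=35, scale=8, size=21)

# Fit several candidate families and rank them by the K-S p-value.
candidates = {
    "genextreme": stats.genextreme,
    "gumbel_r": stats.gumbel_r,
    "weibull_min": stats.weibull_min,
    "lognorm": stats.lognorm,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(annual_max)
    results[name] = stats.kstest(annual_max, name, args=params).pvalue

best = max(results, key=results.get)
print(sorted(results.items(), key=lambda kv: -kv[1]))

# Dry spell length for each return period T from the best-fitting family.
dist = candidates[best]
params = dist.fit(annual_max)
for T in (2, 5, 10, 25):
    print(T, "yr:", round(float(dist.ppf(1 - 1 / T, *params)), 1), "days")
```

The `ppf(1 - 1/T)` quantiles are what the return-period maps in the abstract would be built from, station by station.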
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System's Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines at 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the 380 V bus voltage of an industrial consumer sensitive to such sags was monitored. Different levels of DG were then inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were repeated. A Monte Carlo stochastic simulation study was performed to obtain, for each DG level, the sag probability curves and the probability density of sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.
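As a rough illustration of the Monte Carlo step, the sketch below estimates the average annual number of sags per voltage class at a monitored bus. The fault rate and the uniform residual-voltage model are entirely assumed stand-ins for the network short-circuit calculations that ANAFAS performs in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 10000          # simulated years
faults_per_year = 12     # assumed average fault rate on the monitored system

# Assumed residual-voltage model at the consumer bus: each fault leaves a
# per-unit voltage drawn uniformly (a stand-in for a real short-circuit study)
n_faults = rng.poisson(faults_per_year, size=n_years)
classes = {'severe (<0.4 pu)': 0, 'moderate (0.4-0.7 pu)': 0, 'shallow (0.7-0.9 pu)': 0}
for n in n_faults:
    v = rng.uniform(0.1, 1.0, size=n)   # residual voltage in pu
    classes['severe (<0.4 pu)'] += np.sum(v < 0.4)
    classes['moderate (0.4-0.7 pu)'] += np.sum((v >= 0.4) & (v < 0.7))
    classes['shallow (0.7-0.9 pu)'] += np.sum((v >= 0.7) & (v < 0.9))

avg_per_year = {k: c / n_years for k, c in classes.items()}
print(avg_per_year)
```

The per-class averages play the role of the paper's "average number of sags according to each class that the consumer bus may be subjected to annually".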
Directory of Open Access Journals (Sweden)
BLAGA IRINA
2014-03-01
The maximum amounts of rainfall are usually characterized by high intensity, and their effects on the substrate are revealed at slope level by the deepening of existing forms of torrential erosion, by the formation of new ones, and by landslide processes. For the 1971-2000 period, the highest rainfall amounts falling in 24, 48 and 72 hours were extracted and analyzed for the weather stations in the hilly area of Cluj County (Cluj-Napoca, Dej, Huedin and Turda), on the basis of which the variation and the spatial and temporal distribution of precipitation were analyzed. The annual probability of exceedance of maximum rainfall amounts falling in short time intervals (24, 48 and 72 hours), based on thresholds and class values, was determined using climatological practices and the Hyfran program facilities.
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables), and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and that the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and that the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Energy Technology Data Exchange (ETDEWEB)
van Milligen, B. Ph. [Asociacion EURATOM-CIEMAT; Sanchez, Raul [Universidad Carlos III, Madrid, Spain; Carreras, Benjamin A [ORNL; Lynch, Vickie E [ORNL; LaBombard, Brian [Massachusetts Institute of Technology (MIT); Pedrosa, M. A. [EURATOM-CIEMAT, Madrid, Spain; Hidalgo, Carlos [EURATOM-CIEMAT, Madrid, Spain; Goncalves, B. [EURATOM IST Assoc., Lisbon, Portugal; Balbín, Rosa [EURATOM-CIEMAT, Madrid, Spain
2005-05-01
Plasma density fluctuations and electrostatic turbulent fluxes measured at the scrape-off layer of the Alcator C-Mod tokamak [ B. LaBombard, R. L. Boivin, M. Greenwald, J. Hughes, B. Lipschultz, D. Mossessian, C. S. Pitcher, J. L. Terry, and S. J. Zweben, Phys. Plasmas 8, 2107 (2001) ], the Wendelstein 7-Advanced Stellarator [ H. Renner, E. Anabitarte, E. Ascasibar et al., Plasma Phys. Controlled Fusion 31, 1579 (1989) ], and the TJ-II stellarator [ C. Alejaldre, J. Alonso, J. Botija et al., Fusion Technol. 17, 131 (1990) ] are shown to obey a non-Gaussian but apparently universal (i.e., not dependent on device and discharge parameters) probability density function (pdf). The fact that a specific shape acts as an attractor for the pdf seems to suggest that emergent behavior and self-regulation are relevant concepts for these fluctuations. This shape closely resembles the so-called Bramwell, Holdsworth, and Pinton distribution, which does not have any free parameters.
Directory of Open Access Journals (Sweden)
Juan Miguel Astorga Gómez
2015-06-01
In this article, the grounding resistance in urban electrical distribution networks of the city of Copiapó, Chile, is studied by means of probability distributions, with the aim of assessing the performance of the grounding grid design currently used in these networks. The study is based on a sample of forty-three grounding resistance measurements. The main descriptive statistics of the field measurements are presented, three continuous probability distribution models are fitted to the sample, and the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) are used to select the distribution that best represents the behaviour of the data. Finally, using the selected model, some probabilities for the grounding resistance are calculated and the main conclusions of the work are presented.
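A minimal sketch of the AIC/BIC selection step, assuming a hypothetical sample of 43 grounding resistance measurements and three candidate models (the abstract does not say which three distributions were fitted, so lognormal, gamma and normal are assumptions here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical sample of 43 grounding-resistance measurements (ohms)
r = stats.lognorm.rvs(s=0.5, scale=10, size=43, random_state=rng)

# Candidate continuous models (assumed; the paper fits three unnamed models)
candidates = {'lognorm': stats.lognorm, 'gamma': stats.gamma, 'norm': stats.norm}
scores = {}
for name, dist in candidates.items():
    params = dist.fit(r)
    loglik = np.sum(dist.logpdf(r, *params))
    k = len(params)                       # number of fitted parameters
    aic = 2 * k - 2 * loglik
    bic = k * np.log(len(r)) - 2 * loglik
    scores[name] = (aic, bic, params)

best = min(scores, key=lambda n: scores[n][0])    # smallest AIC wins
# Example probability under the chosen model: P(resistance > 20 ohms)
p_gt_20 = candidates[best].sf(20, *scores[best][2])
print(best, p_gt_20)
```

BIC can be read off the same table; when AIC and BIC disagree, BIC's stronger penalty favours the model with fewer parameters.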
Meads, C; Auguste, P; Davenport, C; Małysiak, S; Sundar, S; Kowalska, M; Zapalska, A; Guest, P; Thangaratinam, S; Martin-Hirsch, P; Borowiack, E; Barton, P; Roberts, T; Khan, K
2013-03-01
Cancer of the uterine cervix is a common cause of mortality in women. After initial treatment women may be symptom free, but the cancer may recur within a few years. It is uncertain whether it is more clinically effective to survey asymptomatic women for signs of recurrence or to await symptoms or signs before using imaging. This project compared the diagnostic accuracy of imaging using positron emission tomography/computerised tomography (PET-CT) with that of imaging using CT or magnetic resonance imaging (MRI) alone and evaluated the cost-effectiveness of adding PET-CT as an adjunct to standard practice. Standard systematic review methods were used to obtain and evaluate relevant test accuracy and effectiveness studies. Databases searched included MEDLINE, EMBASE, Science Citation Index and The Cochrane Library. All databases were searched from inception to May 2010. Study quality was assessed using appropriately modified Quality Assessment of Diagnostic Accuracy Studies (QUADAS) criteria. Included were any studies of PET-CT, MRI or CT compared with the reference standard of histopathological findings or clinical follow-up in symptomatic women suspected of having recurrent or persistent cervical cancer and in asymptomatic women a minimum of 3 months after completion of primary treatment. Subjective elicitation of expert opinion was used to supplement diagnostic information needed for the economic evaluation. The effectiveness of treatment with chemotherapy, radiotherapy, chemoradiotherapy, radical hysterectomy and pelvic exenteration was systematically reviewed. Meta-analysis was carried out in RevMan 5.1 (The Cochrane Collaboration, The Nordic Cochrane Centre, Copenhagen, Denmark) and Stata version 11 (StataCorp LP, College Station, Texas, USA). A Markov model was developed to compare the relative cost-effectiveness using TreeAge Pro software version 2011 (TreeAge Software Inc., Evanston, IL, USA). For the diagnostic review, a total of 7524 citations were
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Nonequilibrium nuclear spin distribution function in quantum dots subject to periodic pulses
Jäschke, Natalie; Fischer, Andreas; Evers, Eiko; Belykh, Vasilii V.; Greilich, Alex; Bayer, Manfred; Anders, Frithjof B.
2017-11-01
Electron spin dephasing in a singly charged semiconductor quantum dot can be partially suppressed by periodic laser pulsing. We propose a semiclassical approach describing the decoherence of the electron spin polarization governed by the hyperfine interaction with the nuclear spins as well as the probabilistic nature of photon absorption. We use the steady-state Floquet condition to analytically derive two subclasses of resonance conditions that accurately predict the peak locations in the part of the Overhauser field distribution projected onto the direction of the external magnetic field. As a consequence of the periodic pulsing, a nonequilibrium distribution develops as a function of time. Numerical simulation of the coupled dynamics reveals the influence of the hyperfine coupling constant distribution on the evolution of the electron spin polarization before the next laser pulse. Experimental indications are provided for both subclasses of resonance conditions.
Structure and Spatial Distribution of Ge Nanocrystals Subjected to Fast Neutron Irradiation
Directory of Open Access Journals (Sweden)
Alexander N. Ionov
2011-07-01
The influence of fast neutron irradiation on the structure and spatial distribution of Ge nanocrystals (NC) embedded in an amorphous SiO2 matrix has been studied. The investigation was conducted by means of laser Raman scattering (RS), high-resolution transmission electron microscopy (HR-TEM) and X-ray photoelectron spectroscopy (XPS). Irradiation of Ge-NC samples with a high dose of fast neutrons led to a partial destruction of the nanocrystals. Full reconstruction of crystallinity was achieved after annealing the radiation damage at 800 °C, which resulted in full restoration of the RS spectrum. HR-TEM images show, however, that the spatial distribution of Ge-NC changed as a result of irradiation and annealing: a sharp decrease in NC concentration towards the SiO2 surface was observed. This was accompanied by XPS detection of Ge oxides and elemental Ge within both the surface and subsurface regions.
Beach, Thomas G; Adler, Charles H; Sue, Lucia I; Vedders, Linda; Lue, Lihfen; White III, Charles L; Akiyama, Haru; Caviness, John N; Shill, Holly A; Sabbagh, Marwan N; Walker, Douglas G
2010-06-01
A sensitive immunohistochemical method for phosphorylated alpha-synuclein was used to stain sets of sections of spinal cord and tissue from 41 different sites in the bodies of 92 subjects, including 23 normal elderly, 7 with incidental Lewy body disease (ILBD), 17 with Parkinson's disease (PD), 9 with dementia with Lewy bodies (DLB), 19 with Alzheimer's disease with Lewy bodies (ADLB) and 17 with Alzheimer's disease with no Lewy bodies (ADNLB). The relative densities and frequencies of occurrence of phosphorylated alpha-synuclein histopathology (PASH) were tabulated and correlated with diagnostic category. The greatest densities and frequencies of PASH occurred in the spinal cord, followed by the paraspinal sympathetic ganglia, the vagus nerve, the gastrointestinal tract and endocrine organs. The frequency of PASH within other organs and tissue types was much lower. Spinal cord and peripheral PASH was most common in subjects with PD and DLB, where it appears likely that it is universally widespread. Subjects with ILBD had lesser densities of PASH within all regions, but had frequent involvement of the spinal cord and paraspinal sympathetic ganglia, with less-frequent involvement of end-organs. Subjects with ADLB had infrequent involvement of the spinal cord and paraspinal sympathetic ganglia with rare involvement of end-organs. Within the gastrointestinal tract, there was a rostrocaudal gradient of decreasing PASH frequency and density, with the lower esophagus and submandibular gland having the greatest involvement and the colon and rectum the lowest.
Corbi, Alberto; Burgos, Daniel
2017-01-01
This paper presents how virtual containers enhance the implementation of STEAM (science, technology, engineering, arts, and math) subjects as Open Educational Resources (OER). The publication initially summarizes the limitations of delivering open rich learning contents and corresponding assignments to students in college level STEAM areas. The…
The relation between postural stability and weight distribution in healthy subjects.
Anker, L.C.; Weerdesteijn, V.G.M.; Nes, I.J.W. van; Nienhuis, B.; Straatman, H.; Geurts, A.C.H.
2008-01-01
Knowledge of the effects of leg-loading asymmetry on postural control and control asymmetry during quiet upright standing in healthy young and middle-aged subjects is necessary before these relationships in patients with lateralized disorders can be assessed and understood. A posturographic
Bacterial populations were examined in a simulated chloraminated drinking water distribution system (i.e. loop). The loop (BW-AB-I) received chlorinated municipal water (BW-C) amended with ammonia (2 mg/L monochloramine). After six years of continuous operation, the operational ...
I. Heinonen (Ilkka); M.J. Savolainen (Markku); C. Han (Chunlei); J. Kemppainen (Jukka); V. Oikonen (Vesa); M. Luotolahti (Matti); D.J.G.M. Duncker (Dirk); D. Merkus (Daphne); J. Knuuti (Juhani); K.K. Kalliokoski (Kari)
2013-01-01
Pulmonary blood flow (PBF) is an important determinant of endurance sports performance, yet studies investigating adaptations of the pulmonary circulation in athletes are scarce. In the present study, we investigated PBF, its distribution, and heterogeneity at baseline and during
Crema, Enrico R; Habu, Junko; Kobayashi, Kenichi; Madella, Marco
2016-01-01
Recent advances in the use of summed probability distributions (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes.
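The core SPD construction can be sketched by summing one probability density per date on a calendar grid. The version below substitutes unit-mass Gaussians for real calibrated densities (actual work would calibrate each 14C date against a curve such as IntCal), so every number is illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical calibrated 14C dates: (median cal BP, 1-sigma error in years)
dates = np.column_stack([rng.uniform(3000, 7000, 200), rng.uniform(30, 80, 200)])

grid = np.arange(2500, 7500)             # calendar years BP, 1-year steps
spd = np.zeros_like(grid, dtype=float)
for mu, sigma in dates:
    # Each date contributes a unit-mass Gaussian (a stand-in for a genuine
    # calibrated probability density)
    pdf = np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    spd += pdf

spd /= len(dates)                        # normalise so total mass is ~1
mass = spd.sum()                         # grid step is 1 year
print(mass)
```

The permutation test of the paper then compares region-specific SPDs against SPDs built from randomly reshuffled region labels to locate significant divergences.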
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
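The idea of a hyper-ensemble (an optimal local linear combination of member forecasts, allowing bias removal) can be sketched with a least-squares fit. The signal and the three members below are synthetic stand-ins, not the ocean, atmospheric and wave models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
truth = np.sin(np.linspace(0, 20, n))            # stand-in drift-velocity signal

# Three imperfect "models": biased, noisy, and phase-shifted versions of truth
members = np.column_stack([
    truth + 0.3,                                  # biased member
    truth + rng.normal(0, 0.4, n),                # noisy member
    np.roll(truth, 5),                            # phase-shifted member
])

# Hyper-ensemble: least-squares weights (plus an intercept for bias removal)
# fitted on a training window, then applied to the forecast window
X = np.column_stack([members[:400], np.ones(400)])
w, *_ = np.linalg.lstsq(X, truth[:400], rcond=None)
Xf = np.column_stack([members[400:], np.ones(100)])
hyper = Xf @ w

rmse_best_member = min(np.sqrt(np.mean((members[400:, j] - truth[400:])**2))
                       for j in range(3))
rmse_hyper = np.sqrt(np.mean((hyper - truth[400:])**2))
print(rmse_hyper, rmse_best_member)
```

Because the intercept absorbs the constant bias of the first member, the combination beats every individual member on the held-out window.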
Directory of Open Access Journals (Sweden)
Sean S Downey
Analysis of the proportion of immature skeletons recovered from European prehistoric cemeteries has shown that the transition to agriculture after 9000 BP triggered a long-term increase in human fertility. Here we compare the largest analysis of European cemeteries to date with an independent line of evidence, the summed calibrated date probability distribution (SCDPD) of radiocarbon dates from archaeological sites. Our cemetery reanalysis confirms increased growth rates after the introduction of agriculture; the radiocarbon analysis also shows this pattern, and a significant correlation between both lines of evidence confirms the demographic validity of SCDPDs. We analyze the areal extent of Neolithic enclosures and demographic data from ethnographically known farming and foraging societies, and we estimate differences in population levels at individual sites. We find little effect on the overall shape and precision of the SCDPD, and we observe a small increase in the correlation with the cemetery trends. The SCDPD analysis supports the hypothesis that the transition to agriculture dramatically increased demographic growth, but that it was followed within centuries by a general pattern of collapse, even after accounting for higher settlement densities during the Neolithic. The study supports the unique contribution of SCDPDs as a valid proxy for the demographic patterns associated with early agriculture.
Straus, D. M.
2016-12-01
The goals of this research are to: (a) identify features of the probability distribution function (pdf) of pentad precipitation over the continental US (CONUS) that are controlled by the configuration of the large-scale fields, including both tails of the pdf, hence droughts and floods, and the overall shape of the pdf, e.g. skewness and kurtosis; (b) estimate the changes in the properties of the pdf controlled by the large-scale in a future climate. We first describe the significant dependence of the observed precipitation pdf conditioned on circulation regimes over CONUS. The regime states, and the number of regimes, are obtained by a method that assures a high degree of significance, and a high degree of pattern correlation between the states in a regime and its average. The regime-conditioned pdfs yield information on times scales from intra-seasonal to inter-annual. We then apply this method to atmospheric simulations run with the EC-Earth version 3 model for historical sea-surface temperatures (SST) and future (RCP8.5 CMIP5 scenario) estimates of SST, at resolutions T255 and T799, to understand what dynamically controlled changes in the precipitation pdf can be expected in a future climate.
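A toy illustration of a regime-conditioned pdf, with two assumed circulation regimes and Gamma-distributed pentad precipitation whose tail and skewness depend on the regime. The actual study derives its regimes from observed large-scale fields by a significance-tested clustering method, not by construction as here.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000
# Hypothetical regime label and pentad precipitation whose whole
# distribution, not just its mean, depends on the regime
regime = rng.integers(0, 2, n)                    # 0 = dry regime, 1 = wet regime
precip = np.where(regime == 1,
                  rng.gamma(shape=1.5, scale=8.0, size=n),   # wet: heavy tail
                  rng.gamma(shape=3.0, scale=2.0, size=n))   # dry: lighter tail

# Regime-conditioned pdf properties: flood-tail probability and skewness
p_flood = [float(np.mean(precip[regime == k] > 30)) for k in (0, 1)]
skew = [float(((x - x.mean())**3).mean() / x.std()**3)
        for x in (precip[regime == 0], precip[regime == 1])]
print(p_flood, skew)
```

Conditioning on the regime changes both tails and shape, which is exactly the kind of pdf information the study extracts from regime states over CONUS.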
Directory of Open Access Journals (Sweden)
Enrico R Crema
Full Text Available Recent advances in the use of summed probability distribution (SPD of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights point of divergences in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes.
Sugita, Takashi; Tajima, Mami; Takashima, Masako; Amaya, Misato; Saito, Masuyoshi; Tsuboi, Ryoji; Nishikawa, Akemi
2004-01-01
Over the last few years, new Malassezia species have been found regularly in Japanese subjects. We isolated another new Malassezia species from a Japanese patient with seborrheic dermatitis (SD) and named it M. yamatoensis. In its physiological characteristics, including Tween utilization, M. yamatoensis is similar to M. furfur and M. dermatis, but it is distinguished by its growth temperature. To examine the distribution of the microorganism in the skin of patients with SD and atopic dermatitis (AD) and of healthy subjects, we applied transparent dressings to the skin and detected M. yamatoensis DNA using a non-culture-based method consisting of nested PCR with specific primers. M. yamatoensis DNA was detected in 3 of 31 SD patients (9.7%), 5 of 36 AD patients (13.9%), and 1 of 22 healthy subjects (4.6%). Therefore, M. yamatoensis is a rare member of the cutaneous microflora.
Spatial distribution of nanoparticles in PWR nanofluid coolant subjected to local nucleate boiling
Energy Technology Data Exchange (ETDEWEB)
Mirghaffari, Reza; Jahanfarnia, Gholamreza [Islamic Azad Univ., Tehran (Iran, Islamic Republic of). Dept. of Nuclear Engineering
2016-12-15
Nanofluids have shown promise as an alternative PWR reactor coolant or as a safety-system coolant to cover the core in the event of a loss-of-coolant accident. The nanoparticle distribution and neutronic parameters are strongly affected by local boiling of the nanofluid coolant. The main goal of this study was the physical-mathematical modeling of the nanoparticle distribution during nucleate boiling of nanofluids within the viscous sublayer. Nanoparticle concentration, especially near heat transfer surfaces, plays a significant role in the enhancement of the thermal conductivity of nanofluids and in the prediction of CHF, Hide Out and Return phenomena. By solving the convection-diffusion equation for the liquid phase near the heating surface and in the bulk stream, the effect of heat flux on the distribution of nanoparticles was studied. The steady-state mass conservation equations for liquid, vapor and nanoparticles were written for flow boiling within the viscous sublayer adjacent to the fuel cladding surface. The derived differential equations were discretized by the finite difference method and solved numerically. It was found that increasing the surface heat flux increases the concentration of nanoparticles.
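A minimal 1-D finite difference sketch in the spirit of the model: steady drift-diffusion of nanoparticles across the viscous sublayer, with a zero-net-flux condition at the wall and a fixed bulk concentration at the outer edge. All parameter values are assumed for illustration; the analytical steady state is the exponential wall build-up c(x) proportional to exp(-v x / D), used here as a check.

```python
import numpy as np

# Assumed parameters (illustrative only, not the paper's values)
D = 1e-9        # Brownian diffusivity, m^2/s
v = 2e-6        # drift velocity toward the wall, m/s
L = 1e-3        # sublayer thickness, m
c_bulk = 1.0    # bulk concentration (normalised)

n = 201
x = np.linspace(0, L, n)
h = x[1] - x[0]

# Steady drift-diffusion D c'' + v c' = 0, discretized with central differences
A = np.zeros((n, n)); b = np.zeros(n)
for i in range(1, n - 1):
    A[i, i-1] = D / h**2 - v / (2*h)
    A[i, i]   = -2 * D / h**2
    A[i, i+1] = D / h**2 + v / (2*h)
# Wall (x=0): zero net flux, D c'(0) + v c(0) = 0 (one-sided difference)
A[0, 0] = v - D / h;  A[0, 1] = D / h
# Outer edge: fixed bulk concentration
A[-1, -1] = 1.0; b[-1] = c_bulk

c = np.linalg.solve(A, b)
exact = c_bulk * np.exp(-v * (x - L) / D)   # analytical steady state
print(c[0], exact[0])
```

With these numbers the concentration at the wall is e^2 (about 7.4) times the bulk value, the kind of near-wall enrichment the paper links to thermal conductivity enhancement.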
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present; in particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations... In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures, which reveals the computational intuition lying behind the mathematics.
Gellert, A R; Lewis, C. A.; Langford, J A; Tolfree, S E; Rudd, R. M.
1985-01-01
The overall and regional clearance of an inhaled isotope labelled solute from the lungs was examined on the basis of a 15 minute period of data collection, for which a technique was developed that does not require intravenous injection to correct for blood-tissue background activity. The technique was applied to 52 normal subjects (31 non-smokers and 21 smokers) and to 37 patients with asbestosis (21 non-smokers and 16 smokers). In normal smokers solute clearance was faster in the upper and m...
Javaherizadeh, Hazhir; Khademvatan, Shahram; Soltani, Shahrzad; Torabizadeh, Mehdi; Yousefi, Elham
2014-01-01
Some studies suggest that Blastocystis hominis is a potentially pathogenic protozoan; in one study, B. hominis contributed to anaemia in children aged 8-10 years. Our aim was to compare haematological indices in cases with B. hominis infection and healthy controls. From 2001 to 2012, 97,600 stool examinations were performed in 4 university hospitals, and parasites were observed in 46,200 specimens. Subjects with a complete laboratory investigation (complete blood count (CBC), ferritin, total iron binding capacity (TIBC) and serum iron) and B. hominis infection were included as the case group; of these, 6,851 cases had only B. hominis infection. The control group comprised 3,615 subjects without parasite infestation. Age, haemoglobin (Hb), serum iron, TIBC, white blood cell count (WBC), platelet count (PLT), mean corpuscular volume (MCV), haematocrit (HCT) and erythrocyte sedimentation rate (ESR) were recorded for cases and controls. SPSS software version 13.0 was used for analysis, with the independent-sample t-test and χ² test used for comparisons. The ESR level was significantly higher in cases with B. hominis infection than in controls, and further indices differed in cases with B. hominis infection compared to controls. Occult blood was positive in 0.93% of cases and in none of the controls.
Vitola, Jaime; Pozo, Francesc; Tibaduiza, Diego A; Anaya, Maribel
2017-05-31
Structural health monitoring (SHM) is a very important area in a wide spectrum of fields and engineering applications. With an SHM system, it is possible to reduce the number of unnecessary inspection tasks, the associated risk and the maintenance cost in a wide range of structures during their lifetime. One of the problems in the detection and classification of damage is the constant change in operational and environmental conditions. Small changes in these conditions can be interpreted by the SHM system as damage even though the structure is healthy. Several applications for monitoring of structures have been developed and reported in the literature, and some of them include temperature compensation techniques. In real applications, however, digital processing technologies have proven their value by: (i) offering a very interesting way to acquire information from the structures under test; (ii) applying methodologies to provide a robust analysis; and (iii) performing a damage identification with practically useful accuracy. This work shows the implementation of an SHM system based on the use of piezoelectric (PZT) sensors for inspecting a structure subjected to temperature changes. The methodology includes the use of multivariate analysis, sensor data fusion and machine learning approaches. The methodology is tested and evaluated with aluminum and composite structures that are subjected to temperature variations. Results show that damage can be detected and classified in all of the cases in spite of the temperature changes.
Energy Technology Data Exchange (ETDEWEB)
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
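The paper's weighted estimators are not reproduced here, but the general idea of extrapolating a tail probability beyond the data from the shape of the tail can be sketched as follows. The exponential tail model, the threshold, and all numbers below are illustrative assumptions, not the authors' estimators:

```python
import math
import random
import statistics

def tail_prob(data, x, threshold):
    """Estimate P(X > x) for x far in the tail by fitting an exponential
    to the excesses over a high threshold (a simple sketch of tail
    extrapolation, not the estimators compared in the study)."""
    excesses = [v - threshold for v in data if v > threshold]
    p_u = len(excesses) / len(data)      # empirical P(X > threshold)
    beta = statistics.mean(excesses)     # exponential scale of the tail
    return p_u * math.exp(-(x - threshold) / beta)

random.seed(0)
data = [random.expovariate(1.0) for _ in range(10_000)]
est = tail_prob(data, x=8.0, threshold=4.0)
# for a unit exponential the true value is exp(-8), about 3.4e-4
```

The estimate combines an empirical probability of exceeding the threshold with a parametric extrapolation beyond it, which is exactly the trade-off the Monte Carlo comparisons in the paper are about.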
Guo, Zhenyuan; Yang, Shaofu; Wang, Jun
2016-12-01
This paper presents theoretical results on global exponential synchronization of multiple memristive neural networks in the presence of external noise by means of two types of distributed pinning control. The multiple memristive neural networks are coupled in a general structure via a nonlinear function, which consists of a linear diffusive term and a discontinuous sign term. A pinning impulsive control law is introduced in the coupled system to synchronize all neural networks. Sufficient conditions are derived for ascertaining global exponential synchronization in mean square. In addition, a pinning adaptive control law is developed to achieve global exponential synchronization in mean square. Both pinning control laws utilize only partial state information received from the neighborhood of the controlled neural network. Simulation results are presented to substantiate the theoretical results. Copyright © 2016 Elsevier Ltd. All rights reserved.
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving …
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
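The gradient relation stated in the abstract can be illustrated with the simplest CPGF, the multinomial-logit logsum; the utility values below are made-up numbers used only for a numerical check:

```python
import math

def logsum(u):
    # CPGF of the multinomial logit: G(u) = log(sum_j exp(u_j)),
    # computed stably by shifting with the maximum
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs(u):
    # choice probabilities are the gradient of the CPGF
    g = logsum(u)
    return [math.exp(x - g) for x in u]

u = [1.0, 0.5, -0.2]   # illustrative utilities
p = choice_probs(u)

# verify the gradient claim by forward finite differences
eps = 1e-6
grad = [(logsum([x + eps * (i == j) for j, x in enumerate(u)]) - logsum(u)) / eps
        for i in range(len(u))]
```

The finite-difference gradient of the logsum matches the softmax probabilities, which is the logit instance of the general CPGF result.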
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Training-Load Distribution in Endurance Runners: Objective Versus Subjective Assessment.
Manzi, Vincenzo; Bovenzi, Antonio; Castagna, Carlo; Sinibaldi Salimei, Paola; Volterrani, Maurizio; Iellamo, Ferdinando
2015-11-01
To assess the distribution of exercise intensity in long-distance recreational athletes (LDRs) preparing for a marathon and to test the hypothesis that individual perception of effort could provide training responses similar to those provided by standardized training methodologies. Seven LDRs (age 36.5 ± 3.8 y) were followed during a 5-mo training period culminating with a city marathon. Heart rate at 2.0 and 4.0 mmol/L and maximal heart rate were used to establish 3 intensity training zones. Internal training load (TL) was assessed by training zones and TRIMPi methods. These were compared with the session-rating-of-perceived-exertion (RPE) method. Total time spent in zone 1 was higher than in zones 2 and 3 (76.3% ± 6.4%, 17.3% ± 5.8%, and 6.3% ± 0.9%, respectively; P = .000 for both, ES = 0.98, ES = 0.99). TL quantified by session-RPE provided the same result. The comparison between session-RPE and training-zones-based methods showed no significant difference at the lowest intensity (P = .07, ES = 0.25). A significant correlation was observed between TL RPE and TL TRIMPi at both individual and group levels (r = .79, P training time is spent at low intensity and that this is associated with improved performances. Session-RPE is an easy-to-use training method that provides responses similar to those obtained with standardized training methodologies.
Distribution of genetic polymorphism of aldehyde dehydrogenase-2 (ALDH2 in Indonesian subjects
Directory of Open Access Journals (Sweden)
Septelia I. Wanandi
2002-09-01
Aldehyde dehydrogenase (ALDH) plays a pivotal role in alcohol metabolism. Decreased activity of the ALDH enzyme has more influence on hypersensitivity to alcohol than that of alcohol dehydrogenase. The ALDH enzyme exists in several isozymes. Among these isozymes, ALDH2 is a major isozyme that has a very high affinity for acetaldehyde. Recent studies suggested that the deficiency of ALDH2 may be inherited. Functional polymorphism of the ALDH2 gene has been observed in a nucleotide of the 487th codon. In the atypical gene, this codon consists of the AAA nucleotides for lysine, instead of GAA for glutamic acid in the wild-type gene. In this study, we have analyzed the genetic polymorphism of the ALDH2 gene among 100 Indonesian students using genomic DNA extracted from hair roots. Polymerase chain reaction (PCR) and restriction fragment length polymorphism (RFLP) methods were performed for this purpose. Three oligonucleotide primers were designed for two-step PCR. The reverse primer R was intentionally constructed not to be 100% complementary to the template strand, to generate a restriction site for Eco RI within the variable nucleotide in the PCR product of the ALDH2 gene. This study indicates that 70 subjects (70%) have wild-type, 29 (29%) atypical heterozygote and only 1 (1%) atypical homozygote ALDH2 alleles. Conclusively, the atypical ALDH2 allele frequency in Indonesians (31/200) is higher than in Caucasoids (only about 5-10%), but less than in Mongoloids (40-50%). This may be due to the diverse ethnicities of the Indonesian population. (Med J Indones 2002; 11: 135-42) Keywords: alcohol hypersensitivity, genetic polymorphism, aldehyde dehydrogenase-2 (ALDH2) gene
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Zhao, Liang; Li, Na; Yang, Harry
2011-02-01
Drug kinetics in humans has been studied from both deterministic and stochastic perspectives. However, little research has been done to systematically determine the probability for a drug molecule to follow a specific traveling route. Recently a method was developed to estimate this probability and the probability density function of residence time in linear systems. In this paper, we provide a rigorous proof of the main results of the previous paper and extend the method to nonlinear multi-compartment systems. A novel concept of compartment expansion is introduced to facilitate the development of our method. This formulation resolves computational difficulties associated with nonlinear systems, allowing for direct estimation of the probability intensity coefficients, and subsequently the transition probability and probability density function of the residence time. With such expansion of the methodology, it becomes both practical and feasible to apply it in the real-world drug development where drug disposition patterns are often nonlinear. The method can be used to estimate drug exposure at any site of interest, thus may help us to gain a better understanding of the impact of drug exposure on efficacy and safety.
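As a toy illustration of route probabilities (not the paper's method, and with made-up rate constants), consider a linear two-compartment model where a molecule in the central compartment is either eliminated or transferred to a peripheral compartment that always returns it:

```python
# assumed first-order rate constants (1/h); purely illustrative values
k10 = 0.3   # elimination from the central compartment
k12 = 0.2   # transfer from central to peripheral

# for competing exponential clocks, the next jump is each event with
# probability proportional to its rate
p_exit = k10 / (k10 + k12)       # next jump is elimination
p_transfer = k12 / (k10 + k12)   # next jump is transfer to peripheral

def route_prob(m):
    # probability of the route with exactly m peripheral detours before
    # elimination (the peripheral compartment always returns the molecule)
    return p_transfer ** m * p_exit

# the route probabilities over all m form a geometric series summing to 1
total = sum(route_prob(m) for m in range(200))
```

Enumerating routes of a jump chain in this way is the discrete skeleton of the transition-probability computation; the paper's contribution is extending such calculations to nonlinear systems via compartment expansion.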
Probability and statistics: selected problems
Machado, J.A. Tenreiro; Pinto, Carla M. A.
2014-01-01
Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, followed by a review of probability. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.
Plicht, J. van der; Harakeh, M.N.; van der Woude, Adriaan; David, P.; Debrus, J.; Janszen, H.; Schulze, J.
1980-01-01
The fission decay channel of 232Th and 238U has been investigated, using the (α, α’f) reaction at 120 MeV bombarding energy. The angular distributions of the fission fragments and the fission probabilities up to around 15 MeV excitation have been measured. No evidence for the fission decay of the
Dickstein, D L; Pullman, M Y; Fernandez, C; Short, J A; Kostakoglu, L; Knesaurek, K; Soleimani, L; Jordan, B D; Gordon, W A; Dams-O'Connor, K; Delman, B N; Wong, E; Tang, C Y; DeKosky, S T; Stone, J R; Cantu, R C; Sano, M; Hof, P R; Gandy, S
2016-01-01
Chronic traumatic encephalopathy (CTE) is a neurodegenerative disorder most commonly associated with repetitive traumatic brain injury (TBI) and characterized by the presence of neurofibrillary tangles of tau protein, known as a tauopathy. Currently, the diagnosis of CTE can only be definitively established postmortem. However, a new positron emission tomography (PET) ligand, [18F]T807/AV1451, may provide the antemortem detection of tau aggregates, and thus various tauopathies, including CTE. Our goal was to examine [18F]T807/AV1451 retention in athletes with neuropsychiatric symptoms associated with a history of multiple concussions. Here we report a 39-year-old retired National Football League player who suffered 22 concussions and manifested progressive neuropsychiatric symptoms. Emotional lability and irritability were the chief complaints. Serial neuropsychological exams revealed a decline in executive functioning, processing speed and fine motor skills. Naming was below average but other cognitive functions were preserved. Structural analysis of longitudinally acquired magnetic resonance imaging scans revealed cortical thinning in the left frontal and lateral temporal areas, as well as volume loss in the basal ganglia. PET with [18F]florbetapir was negative for amyloidosis. The [18F]T807/AV1451 PET showed multifocal areas of retention at the cortical gray matter–white matter junction, a distribution considered pathognomonic for CTE. [18F]T807/AV1451 standard uptake value (SUV) analysis showed increased uptake (SUVr⩾1.1) in bilateral cingulate, occipital, and orbitofrontal cortices, and several temporal areas. Although definitive identification of the neuropathological basis for [18F]T807/AV1451 retention requires postmortem correlation, our data suggest that [18F]T807/AV1451 tauopathy imaging may be a promising tool to detect and diagnose CTE-related tauopathy in living subjects. PMID:27676441
Anke, Audny; Damsgård, Elin; Røe, Cecilie
2013-03-01
To investigate levels of life satisfaction in subjects with long-term musculoskeletal pain in relation to pain characteristics and coping. Cross-sectional study. A total of 232 (42%) respondents answered self-report questionnaires regarding life satisfaction, self-efficacy, sense of coherence, pain distribution and pain intensity at rest and during activity. Levels of life satisfaction and scores for sense of coherence were low. Pain intensity at rest was negatively correlated with global life satisfaction. This result was also obtained in multiple regression analyses together with the coping factors. The life satisfaction domains activities of daily living/contacts were negatively correlated with pain intensity during activity, and the domains work/economy were negatively correlated with pain distribution. Pain was not associated with satisfaction with family life, partner relationship or sexual life. Younger age, being married/cohabitant and being female were protective for some domains. Clinically meaningful subgroups with regard to adaptation were identified by cluster analysis, and the highest level of coping was found in the adaptive cluster with high life satisfaction/low pain intensity at rest. Long-term pain is related to low levels of life satisfaction, and pain intensity and distribution influence satisfaction in different domains. Pain intensity is negatively associated with coping. The results support efforts to reduce pain, together with strengthening active coping processes and addressing individual needs.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen
2002-01-01
This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.
Directory of Open Access Journals (Sweden)
Arun Kumar Gupta
2014-01-01
The present paper deals with the free transverse vibration of an orthotropic thin trapezoidal plate of parabolically varying thickness in the x-direction, subjected to a linear temperature distribution in the x-direction, through a numerical method. The deflection function is defined by the product of the equations of the prescribed continuous piecewise boundary shape. The Rayleigh-Ritz method is used to evaluate the fundamental frequencies. The equations of motion, governing the free transverse vibrations of orthotropic thin trapezoidal plates, are derived with the CSCS boundary condition. Frequencies corresponding to the first two modes of vibration are calculated for the orthotropic thin trapezoidal plate having CSCS edges for different values of thermal gradient, taper constant, and aspect ratio. The proposed method is applied to solve an orthotropic thin trapezoidal plate of variable thickness with CSCS boundary conditions. Results are shown in figures for different values of thermal gradient, taper constant, and aspect ratio for the first two modes of vibration.
Drury, Nicholas J; Ellis, Benjamin J; Weiss, Jeffrey A; McMahon, Patrick J; Debski, Richard E
2011-02-24
The anterior-inferior glenohumeral capsule is the primary passive stabilizer to the glenohumeral joint during anterior dislocation. Physical examinations following dislocation are crucial for proper diagnosis of capsule pathology; however, they are not standardized for joint position which may lead to misdiagnoses and poor outcomes. To suggest joint positions for physical examinations where the stability provided by the capsule may be consistent among patients, the objective of this study was to evaluate the distribution of maximum principal strain on the anterior-inferior capsule using two validated subject-specific finite element models of the glenohumeral joint at clinically relevant joint positions. The joint positions with 25 N anterior load applied at 60° of glenohumeral abduction and 10°, 20°, 30° and 40° of external rotation resulted in distributions of strain that were similar between shoulders (r² ≥ 0.7). Furthermore, those positions with 20-40° of external rotation resulted in capsule strains on the glenoid side of the anterior band of the inferior glenohumeral ligament that were significantly greater than in all other capsule regions. These findings suggest that anterior stability provided by the anterior-inferior capsule may be consistent among subjects at joint positions with 60° of glenohumeral abduction and a mid-range (20-40°) of external rotation, and that the glenoid side has the greatest contribution to stability at these joint positions. Therefore, it may be possible to establish standard joint positions for physical examinations that clinicians can use to effectively diagnose pathology in the anterior-inferior capsule following dislocation and lead to improved outcomes. Copyright © 2010 Elsevier Ltd. All rights reserved.
Verma, Sushma; Nongpiur, Monisha E; Oo, Hnin H; Atalay, Eray; Goh, David; Wong, Tina T; Perera, Shamira A; Aung, Tin
2017-10-01
We previously identified three distinct subgroups of patients with primary angle closure glaucoma (PACG) based on anterior segment optical coherence tomography (ASOCT) imaging. Group 1 was characterized by a large iris area with deepest anterior chambers, group 2 by a large lens vault (LV) and shallow anterior chamber depth (ACD), and group 3 displayed intermediate values across iris area, LV, and ACD. The purpose of the present study was to determine the distribution of plateau iris in these subgroups using ultrasound biomicroscopy (UBM) features. UBM images of the 210 subjects who were previously enrolled for the ASOCT subgrouping analysis and had undergone laser peripheral iridotomy were assessed and graded by a single glaucoma fellowship trained clinician. Plateau iris was defined as the presence of all the following UBM criteria in at least two quadrants: anteriorly directed ciliary body, absent ciliary sulcus, iris angulation, flat iris plane, and iridoangle touch. Of 210 subjects, 23 were excluded due to poor-quality images. Based on standardized UBM criteria, the overall prevalence of plateau iris was 36.9% (n = 187). The proportion of plateau iris was similar across the three groups (subgroup 1:35.4% (n = 29); subgroup 2:39.0% (n = 32); subgroup 3:34.8% (n = 8), P = 0.87). On multiple logistic regression analysis, iris thickness at 750 μm from the scleral spur (IT750) was the only variable associated with plateau iris (odds ratio: 1.5/100 μm increase in iris thickness [IT], P = 0.04). The proportion of plateau iris was similar across the three ASOCT-based PACG subgroups and more than one-third of subjects with PACG were diagnosed with plateau iris based on standardized UBM criteria. In addition, we noted that eyes with increased peripheral IT have an increased likelihood of plateau iris.
Liquefaction Probability Curves for Surficial Geologic Units
Holzer, T. L.; Noce, T. E.; Bennett, M. J.
2009-12-01
Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range from 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographic specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
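The 3-parameter logistic form mentioned above is easy to evaluate directly. The parameter values in this sketch are placeholders, not fitted values from the study:

```python
import math

def liquefaction_probability(lpi, a=0.9, b=5.0, c=1.0):
    """3-parameter logistic curve P(LPI) = a / (1 + exp(-(LPI - b)/c)).
    a: upper asymptote, b: midpoint, c: spread -- all illustrative."""
    return a / (1.0 + math.exp(-(lpi - b) / c))

# probability rises monotonically with the liquefaction potential index
probs = [liquefaction_probability(lpi) for lpi in (0.0, 5.0, 10.0, 15.0)]
```

With such a parametric fit, a probability can be read off for any LPI value, which is what makes the curves convenient for building probabilistic liquefaction hazard maps from surficial geologic maps.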
Carpena, Pedro; Bernaola-Galván, Pedro A; Carretero-Campos, Concepción; Coronado, Ana V
2016-11-01
Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.
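A minimal numerical check of the "random expectation" discussed above, using the geometric law that the authors note is only the large-sequence approximation (the alphabet and symbol frequency here are invented for illustration):

```python
import random
from collections import Counter

def gap_distribution(seq, symbol):
    # empirical distribution of distances between consecutive occurrences
    pos = [i for i, s in enumerate(seq) if s == symbol]
    gaps = [b - a for a, b in zip(pos, pos[1:])]
    return {g: c / len(gaps) for g, c in Counter(gaps).items()}

random.seed(0)
p = 0.2   # illustrative symbol frequency
seq = random.choices("ax", weights=[p, 1 - p], k=100_000)
emp = gap_distribution(seq, "a")

# for a long i.i.d. sequence the gaps are geometric: P(gap = g) = p (1-p)**(g-1)
geo = {g: p * (1 - p) ** (g - 1) for g in range(1, 6)}
```

For a long sequence like this one the empirical gaps match the geometric law closely; the point of the paper is that for short sequences and rare symbols this approximation breaks down and the exact finite-size distribution is needed.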
Structural Minimax Probability Machine.
Gu, Bin; Sun, Xingming; Sheng, Victor S
2017-07-01
Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
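The inflation effect is easy to see in a Monte Carlo sketch for a normal (location-scale) risk factor with a plug-in threshold; the sample size and nominal level below are arbitrary choices, not values from the article:

```python
import random
import statistics

random.seed(1)
nominal = 0.01                                    # target failure probability
z = statistics.NormalDist().inv_cdf(1 - nominal)  # exact quantile if parameters were known
n = 20                                            # small estimation sample

trials, failures = 20_000, 0
for _ in range(trials):
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]
    # plug-in control: set the threshold from estimated parameters
    threshold = statistics.mean(sample) + z * statistics.stdev(sample)
    if random.gauss(0.0, 1.0) > threshold:        # future risk factor
        failures += 1

# parameter error inflates the realized failure frequency above the nominal level
realized = failures / trials
```

Because the normal family is location-scale, the realized frequency here does not depend on the true mean and variance, which is the key fact the article exploits to compute failure probabilities exactly.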
Energy Technology Data Exchange (ETDEWEB)
Hernandez, R.; Miller, W.H.; Moore, C.B. (Department of Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley Laboratory, Berkeley, California 94720 (United States)); Polik, W.F. (Department of Chemistry, Hope College, Holland, Michigan 49423 (United States))
1993-07-15
A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D2CO → D2 + CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states.
Krueger, Ute; Schimmelpfeng, Katja
2013-03-01
A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered as a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This rests on the underlying assumption that each caller decides independently whether or not to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus, these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate, and the Weibull distribution as more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
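The Erlang-C model mentioned at the end gives the probability that an arriving call has to wait, under the M/M/c assumptions (Poisson arrivals, exponential service). A minimal sketch, with hypothetical dispatch-center numbers rather than the paper's data:

```python
import math

def erlang_c_wait_probability(arrival_rate, service_rate, agents):
    """Erlang-C: probability an arriving call must wait, for Poisson
    arrivals, exponential service times and `agents` parallel servers."""
    a = arrival_rate / service_rate  # offered load in Erlangs
    rho = a / agents                 # agent utilization; stable only if rho < 1
    if rho >= 1.0:
        return 1.0
    top = a ** agents / math.factorial(agents) / (1.0 - rho)
    bottom = sum(a ** k / math.factorial(k) for k in range(agents)) + top
    return top / bottom

# hypothetical center: 30 calls/hour, 6-minute mean handling time -> 3 Erlangs
p4 = erlang_c_wait_probability(30.0, 10.0, 4)  # 4 dispatchers on duty
p6 = erlang_c_wait_probability(30.0, 10.0, 6)  # 6 dispatchers on duty
# adding two dispatchers cuts the waiting probability from ~51% to ~10%
```

Staffing levels are then chosen as the smallest number of agents whose waiting probability (or waiting-time quantile) meets the service target.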
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Yokogawa, Shinji
2017-07-01
In this study, a simple method of statistical parameter estimation is proposed for a lifetime distribution that has three parameters due to defect clustering in the middle-of-line and back-end-of-line. A two-step procedure provides effective estimates of the distribution parameters for time-dependent dielectric breakdown. In the first step, a clustering parameter of the distribution, which is one of the shape parameters, is estimated by a linearization treatment of the plotted data on the proposed chart. Then, in the second step, the shape and scale parameters are estimated by calculating a slope and an intercept, respectively. The statistical accuracy of the estimates is evaluated using the Monte-Carlo simulation technique and the mean squared error of the estimates.
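The slope-and-intercept step is the classic linearized probability plot: for a two-parameter Weibull lifetime, ln(-ln(1-F)) is linear in ln(t) with slope equal to the shape and intercept determined by the scale. A sketch of that step on synthetic data (the clustering parameter and the proposed chart are specific to the paper and are not reproduced here; this shows only the generic Weibull case):

```python
import math
import random

def weibull_plot_fit(samples):
    """Estimate Weibull shape and scale by least squares on the
    linearized plot ln(-ln(1-F)) = shape*ln(t) - shape*ln(scale),
    using (i + 0.5)/n plotting positions for F."""
    t = sorted(samples)
    n = len(t)
    xs = [math.log(ti) for ti in t]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    shape = slope
    scale = math.exp(-intercept / slope)
    return shape, scale

# synthetic lifetimes from Weibull(shape=2, scale=100), via inverse transform
rng = random.Random(42)
data = [100.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(500)]
shape_hat, scale_hat = weibull_plot_fit(data)
# the fit recovers the generating parameters to within sampling error
```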
Probability density fittings of corrosion test-data: Implications on C 6 ...
Indian Academy of Sciences (India)
In this study, corrosion test-data of steel-rebar in concrete were subjected to the fittings of the Normal, Gumbel and the Weibull probability distribution functions. This was done to investigate the suitability of the results of the fitted test-data, by these distributions, for modelling the effectiveness of C6H15NO3, triethanolamine ...
Considerations on a posteriori probability
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available In this first paper of 1911 relating to the sex ratio at birth, Gini repurposed Laplace's rule of succession in a Bayesian version. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from statistical data.
Directory of Open Access Journals (Sweden)
Bruno Teixeira Ribeiro
2007-10-01
Full Text Available Probabilistic studies involving climatic variables are extremely important for farming, construction, tourism and transportation, among other activities. Seeking to contribute to the planning of irrigated agriculture, this work compared probability distributions fitted to the ten-day and monthly historical rainfall series and estimated the probable rainfall for the municipality of Barbacena, Minas Gerais State, Brazil. The months of December, January and February were studied over the period from 1942 to 2003, constituting historical series with 62 years of observations. Daily rainfall depths were totalled over monthly and ten-day periods, and the two-parameter log-Normal, three-parameter log-Normal and Gamma distributions were fitted. The adequacy of the distributions in the periods studied was assessed with the chi-square (χ²) test at the 5% significance level. The probable rainfall was estimated for each period using the distribution with the lowest χ² value, at exceedance probability levels of 75, 90 and 98%. The Gamma distribution fitted the data best. The study of probable rainfall is a good tool to support decision making in irrigation planning and use.
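"Probable rainfall at 75% exceedance" is simply the depth exceeded in 75% of years, i.e. the 25th percentile of the fitted distribution. A minimal stdlib sketch using empirical quantiles in place of the fitted Gamma distribution (the January totals below are invented for illustration, not Barbacena data):

```python
import statistics

def probable_rainfall(series_mm, exceedance_pct):
    """Rainfall depth expected to be met or exceeded in `exceedance_pct`%
    of years: the (100 - exceedance_pct)th percentile of the series.
    (Empirical quantiles stand in here for the fitted Gamma distribution.)"""
    qs = statistics.quantiles(series_mm, n=100, method='inclusive')
    return qs[100 - exceedance_pct - 1]

# hypothetical January totals (mm) for a short illustrative series
january = [120, 95, 210, 180, 75, 160, 140, 230, 110, 190,
           85, 175, 205, 130, 150, 100, 165, 145, 220, 90]
p75 = probable_rainfall(january, 75)  # depth exceeded in ~75% of years
p90 = probable_rainfall(january, 90)  # depth exceeded in ~90% of years
# higher exceedance levels give lower (more conservative) design depths
```

Irrigation is then sized to cover crop demand minus the probable rainfall at the chosen exceedance level.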
Wang, Bo; Li, Hongxia; Cao, Xueyuan; Zhu, Xiaojun; Gan, Zhongxue
2017-04-01
With the rapid development of energy networks, various forms of renewable energy resources are being absorbed into them. Because of the inherent random behaviour of renewable resources, introducing them into the energy network can destroy the stability of the grid. Proper energy storage is required to reduce the uncertain fluctuation from the renewable energy resources. For a concrete model, this paper presents an explicit method to determine suitable capacities of the energy storages in consideration of the economics of the storage, grid losses and the probabilities of bus voltage violation, for situations where wind-power generation is injected into the power network. Furthermore, the influence of the correlation between different wind farms on the optimal storage capacity can also be studied by this method.
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to sparse systems, in which the number of balls is smaller than the number of boxes (e.g., energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
Hallas, Jesper; Pottegård, Anton; Støvring, Henrik
2017-12-01
In register-based pharmacoepidemiological studies, each day of follow-up is usually categorized either as exposed or unexposed. However, there is an underlying continuous probability of exposure, and by insisting on a dichotomy, researchers unwillingly force a nondifferential misclassification into their analyses. We have recently developed a model whereby the probability of exposure can be modeled, and we tested this on an empirical case of nonsteroidal anti-inflammatory drug (NSAID)-induced upper gastrointestinal bleeding (UGIB). We used a case-control data set, consisting of 3568 cases of severe UGIB and 35 552 matched controls. Exposure to NSAID was based on 3 different conventional dichotomous measures. In addition, we tested 3 probabilistic exposure measures, a simple univariate backward-recurrence model, a "full" multivariable model, and a "reduced" multivariable model. Odds ratios (ORs) and 95% confidence intervals for the association between NSAID use and UGIB were calculated by conditional logistic regression, while adjusting for preselected confounders. Compared to the conventional dichotomous exposure measures, the probabilistic exposure measures generated adjusted ORs in the upper range (4.37-4.75) while at the same time having the narrowest confidence intervals (ratio between upper and lower confidence limit, 1.46-1.50). Some ORs generated by conventional measures were higher than the probabilistic ORs, but only when the assumed period of intake was unrealistically short. The pattern of high ORs and narrow confidence intervals in probabilistic exposure measures is compatible with less nondifferential misclassification of exposure than in a dichotomous exposure model. Probabilistic exposure measures appear to be an attractive alternative to conventional exposure measures. Copyright © 2017 John Wiley & Sons, Ltd.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
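The classic instance of a 1/e probability is the matching (derangement) problem: the chance that a uniformly random permutation of n items leaves no item in its original place converges to 1/e extremely fast. A short sketch of the inclusion-exclusion computation (the specific problems the article discusses are not stated in this record, so this is just the standard example):

```python
import math

def derangement_probability(n):
    """Probability that a uniformly random permutation of n items has
    no fixed point; by inclusion-exclusion this is sum of (-1)^k / k!
    for k = 0..n, which tends to 1/e as n grows."""
    return sum((-1) ** k / math.factorial(k) for k in range(n + 1))

p10 = derangement_probability(10)
# already within ~1e-8 of 1/e = 0.36787944... at n = 10
```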
Domingo, J.L.; Schuhmacher, M; Piamba, C; Osorio, C; Ocampo-Duque, W
2013-01-01
10.1016/j.envint.2012.11.007 The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based ...
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid.
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Maraninchi, Marie; Padilla, Nadège; Béliard, Sophie; Berthet, Bruno; Nogueira, Juan-Patricio; Dupont-Roussel, Jeanine; Mancini, Julien; Bégu-Le Corroller, Audrey; Dubois, Noémie; Grangeot, Rachel; Mattei, Catherine; Monclar, Marion; Calabrese, Anastasia; Guérin, Carole; Desmarchelier, Charles; Nicolay, Alain; Xiao, Changting; Borel, Patrick; Lewis, Gary F; Valéro, René
Elevated apolipoprotein C-III (apoC-III) has been postulated to contribute to the atherogenic dyslipidemia seen in obesity and insulin-resistant states, mainly by impairing plasma triglyceride-rich lipoprotein (TRL) metabolism. Bariatric surgery is associated with improvements of several obesity-associated metabolic abnormalities, including a reduction in plasma triglycerides (TGs) and an increase in plasma high-density lipoprotein cholesterol (HDL-C). We investigated the specific effect of bariatric surgery on apoC-III concentrations in plasma, non-HDL, and HDL fractions in relation to the evolution of lipid profile parameters. A total of 132 obese subjects undergoing bariatric surgery, gastric bypass (n = 61) or sleeve gastrectomy (n = 71), were studied 1 month before surgery and 6 and 12 months after surgery. Plasma apoC-III, non-HDL-apoC-III, and HDL-apoC-III concentrations were markedly reduced after surgery and strongly associated with the reduction in plasma TG. This decrease was accompanied by a redistribution of apoC-III from TRL to HDL fractions. In multivariate analysis, plasma apoC-III was the strongest predictor of TG reduction after surgery, and the increase of HDL-C was positively associated with plasma adiponectin and negatively with body mass index. Marked reduction of apoC-III and changes in its distribution between TRL and HDL, consistent with a better lipid profile, are achieved in obese patients after bariatric surgery. These beneficial apoC-III modifications may have implications in dyslipidemia improvement and contribute to cardiovascular risk reduction after surgery. Copyright © 2017 National Lipid Association. Published by Elsevier Inc. All rights reserved.
Experience matters: information acquisition optimizes probability gain.
Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J
2010-07-01
Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
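Probability gain, the theory the experiments favor, values a query by how much the expected probability of a correct guess improves after seeing the answer. A minimal sketch with invented numbers (the priors and likelihoods below are illustrative, not the experimental environments):

```python
def probability_gain(prior, likelihoods):
    """Expected improvement in the probability of a correct guess
    from asking a query. `prior[i]` = P(c_i);
    `likelihoods[j][i]` = P(answer q_j | c_i).
    Uses sum_j max_i P(c_i, q_j) = sum_j P(q_j) * max_i P(c_i | q_j)."""
    baseline = max(prior)  # best guess accuracy before asking
    gain = 0.0
    for lik in likelihoods:
        joint = [p * l for p, l in zip(prior, lik)]  # P(c_i, q_j)
        gain += max(joint)
    return gain - baseline

# two categories with a 70/30 prior; one moderately diagnostic binary feature
prior = [0.7, 0.3]
lik = [[0.9, 0.2],   # P(q = yes | c_i)
       [0.1, 0.8]]   # P(q = no  | c_i)
g = probability_gain(prior, lik)
# expected accuracy rises from 0.70 to 0.87, a probability gain of 0.17
```

Information gain and KL distance would score the same query differently, which is what lets the optimized experiments discriminate among the theories.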
Stationary algorithmic probability
National Research Council Canada - National Science Library
Müller, Markus
2010-01-01
..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Factual and cognitive probability
Chuaqui, Rolando
2012-01-01
This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...
Evaluating probability forecasts
Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo
2011-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
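Scoring rules, also the subject of the papers in the header of this collection, reward a forecaster for reporting probabilities after outcomes are observed; the key property is propriety, meaning the expected score is optimized by reporting the true probability. A short sketch using the quadratic (Brier) score (the grid search is ours, for illustration):

```python
def brier_score(forecast, outcome):
    """Quadratic (Brier) score for a probability forecast of a binary
    event; lower is better, 0 is a perfect forecast."""
    return (forecast - outcome) ** 2

def expected_brier(forecast, true_p):
    """Expected Brier score when the event occurs with probability true_p."""
    return (true_p * brier_score(forecast, 1)
            + (1 - true_p) * brier_score(forecast, 0))

# scan all forecasts on a 1% grid when the true event probability is 0.3
true_p = 0.3
scores = {i / 100: expected_brier(i / 100, true_p) for i in range(101)}
best = min(scores, key=scores.get)
# the expected score is minimized by reporting the truth: the Brier
# score is a strictly proper scoring rule
```

The calibration issue raised in the header papers is that when agents are risk averse, maximizing expected *utility* of the score payment no longer coincides with minimizing the expected score, so reports can deviate from true beliefs.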
Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.
1995-01-01
The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
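Under the Gaussian assumption, the probability that a zero-mean relative displacement stays within a symmetric criterion is a one-line error-function computation. A minimal sketch with hypothetical numbers (the actual VC-E/VC-D amplitudes and response statistics come from the article, not from this example):

```python
import math

def prob_within_criterion(sigma, criterion):
    """Probability that a zero-mean Gaussian relative displacement with
    standard deviation `sigma` stays within +/- `criterion`:
    P(|X| < c) = erf(c / (sigma * sqrt(2)))."""
    return math.erf(criterion / (sigma * math.sqrt(2.0)))

# hypothetical: criterion amplitude 3 um, RMS response 1 um (a 3-sigma margin)
p = prob_within_criterion(sigma=1.0, criterion=3.0)
exceed = 1.0 - p  # chance of exceeding the vibration criterion (~0.27%)
```

Tuning damping and natural frequency changes sigma (the RMS response to the input spectrum), which is how the exceedance probability is driven below a target such as 0.04.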
Bunder, J. E.; McKenzie, Ross H.
2001-01-01
We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian which is written solely in terms of a pair of coupled "spins" which are elements of u(1,1). This ground state is explicitly found for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams.
Gu, G.-F.; Chen, W.; Zhou, W.-X.
2007-05-01
The statistical properties of the bid-ask spread of a frequently traded Chinese stock listed on the Shenzhen Stock Exchange are investigated using limit-order book data. Three different definitions of the spread are considered, based on the time right before transactions, the times at which the highest buying price or the lowest selling price changes, and a fixed time interval. The results are qualitatively similar whether linear or logarithmic prices are used. The average spread exhibits evident intraday patterns consisting of a big L-shape in morning transactions and a small L-shape in the afternoon. The distributions of the spread under the different definitions decay as power laws. The tail exponents of spreads at the transaction level are well within the interval (2,3), and those of average spreads are well in line with the inverse cubic law for different time intervals. Based on detrended fluctuation analysis, we find evidence of long memory in the bid-ask spread time series for all three definitions, even after removal of the intraday pattern. Using the classical box-counting approach for multifractal analysis, we show that the time series of the bid-ask spread do not possess multifractal properties.
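Power-law tail exponents like those reported here are commonly estimated with the Hill estimator on the largest order statistics. A sketch on synthetic Pareto data with a tail exponent inside the (2,3) interval (the estimator choice and the synthetic data are ours; the paper's estimation method is not specified in this record):

```python
import math
import random

def hill_tail_exponent(data, k):
    """Hill estimator of the power-law tail exponent alpha from the k
    largest observations: alpha = k / sum_i log(x_(i) / x_(k+1)),
    where x_(1) >= x_(2) >= ... are the order statistics."""
    order = sorted(data, reverse=True)
    threshold = order[k]  # the (k+1)-th largest value
    s = sum(math.log(order[i] / threshold) for i in range(k))
    return k / s

# Pareto(alpha) samples via inverse transform: P(X > x) = x^(-alpha)
rng = random.Random(7)
alpha_true = 2.5  # inside the "inverse cubic law" range (2, 3)
sample = [rng.random() ** (-1.0 / alpha_true) for _ in range(20000)]
alpha_hat = hill_tail_exponent(sample, k=1000)
# the estimate recovers the generating exponent to within sampling error
```

In practice k is varied and the estimate read off a stable plateau of the Hill plot, since too small a k is noisy and too large a k leaves the tail region.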
Fracture probability along a fatigue crack path
Energy Technology Data Exchange (ETDEWEB)
Makris, P. [Technical Univ., Athens (Greece)
1995-03-01
Long experience has shown that the strength of materials under fatigue load has a stochastic behavior, which can be expressed through the fracture probability. This paper deals with a new analytically derived law for the distribution of the fracture probability along a fatigue crack path. The knowledge of the distribution of the fatigue fracture probability along the crack path helps the connection between stress conditions and the expected fatigue life of a structure under stochasticly varying loads. (orig.)
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Efficient probability sequence
Regnier, Eva
2014-01-01
A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...
Efficient probability sequences
Regnier, Eva
2014-01-01
DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Nuclear data uncertainties: I, Basic concepts of probability
Energy Technology Data Exchange (ETDEWEB)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
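Among the topics listed, Bayes' theorem is the one most directly used to update nuclear-data uncertainties with new evidence. A minimal sketch with invented screening-style numbers (purely illustrative, not from the report):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """Posterior probability of a hypothesis given a positive test,
    via Bayes' theorem: P(H|+) = P(+|H) P(H) / P(+), where
    P(+) = P(+|H) P(H) + P(+|not H) P(not H)."""
    p_pos = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_pos

# hypothetical numbers: 1% prior, 95% sensitivity, 5% false-positive rate
posterior = bayes_posterior(prior=0.01, sensitivity=0.95,
                            false_positive_rate=0.05)
# the posterior (~16%) is far below the sensitivity: with a rare
# hypothesis, most positive results are false positives
```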
Probability measures on metric spaces
Parthasarathy, K R
2005-01-01
In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom
National Research Council Canada - National Science Library
Anke, Audny; Damsgård, Elin; Røe, Cecilie
2013-01-01
.... Cross-sectional study. A total of 232 (42%) respondents answered self-report questionnaires regarding life satisfaction, self-efficacy, sense of coherence, pain distribution and pain intensity at rest and during activity...
Bacterial populations were examined in a simulated chloraminated drinking water distribution system. After six months of continuous operation, coupons were incubated in CDC reactors receiving water from the simulated system to study biofilm development. The study was organized ...
Oxygen boundary crossing probabilities.
Busch, N A; Silver, I A
1987-01-01
The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.
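For the strongly Markovian case the abstract mentions, the simplest concrete instance is a standard Brownian particle and a fixed boundary, where the crossing probability within time T has a closed form by the reflection principle. A sketch comparing that formula with a discretized Monte Carlo estimate (the Volterra/Abel machinery of the paper handles time-dependent boundaries, which this toy case does not):

```python
import math
import random

def crossing_prob_analytic(a, T):
    """P(max over 0 <= t <= T of W_t >= a) for standard Brownian motion
    and a fixed boundary a > 0, via the reflection principle:
    2 * (1 - Phi(a / sqrt(T)))."""
    phi = 0.5 * (1.0 + math.erf(a / math.sqrt(2.0 * T)))
    return 2.0 * (1.0 - phi)

def crossing_prob_mc(a, T, steps=100, paths=5000, seed=3):
    """Monte Carlo estimate of the same probability; discrete monitoring
    slightly undercounts crossings between time steps."""
    rng = random.Random(seed)
    dt = T / steps
    hits = 0
    for _ in range(paths):
        w = 0.0
        for _ in range(steps):
            w += rng.gauss(0.0, math.sqrt(dt))
            if w >= a:
                hits += 1
                break
    return hits / paths

p_exact = crossing_prob_analytic(1.0, 1.0)  # about 0.317
p_mc = crossing_prob_mc(1.0, 1.0)
```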
In All Probability, Probability is not All
Helman, Danny
2004-01-01
The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.
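The article's point, that expectation rather than raw probability is the right lens, is easy to make concrete: the expected net value of a ticket is the win probability times the prize, minus the price. A sketch with a simplified single-prize rule set (the rules and numbers are hypothetical):

```python
from math import comb

def lottery_expected_value(n_pool, n_picked, jackpot, ticket_price):
    """Expected net value of one ticket in a pick-`n_picked`-of-`n_pool`
    lottery with a single jackpot prize (simplified hypothetical rules,
    ignoring secondary prizes and shared jackpots)."""
    p_jackpot = 1.0 / comb(n_pool, n_picked)
    return p_jackpot * jackpot - ticket_price

# 6-of-49: the jackpot odds are 1 in C(49, 6) = 13,983,816, so even a
# large jackpot leaves the expectation negative at a 1-unit ticket price
ev = lottery_expected_value(49, 6, jackpot=5_000_000, ticket_price=1.0)
```

Probability alone cannot distinguish two draws with identical odds; expectation (and whether a jackpot is likely to be shared) is where any strategic consideration enters.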
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Directory of Open Access Journals (Sweden)
José Alves Junqueira Júnior
2007-06-01
Full Text Available Irrigation is today one of the main techniques serving agriculture. However, treating irrigation as the sole source of water for crops can lead to oversized systems, which raises installation costs. One alternative is to consider rainfall at a given probability level, i.e., probable rainfall, which makes supplementary irrigation possible and eases irrigation management. This study therefore aimed to characterize probable rainfall in the region of Madre de Deus, MG, comparing four frequency distribution models (Gamma, Normal, and Log-normal with 2 and 3 parameters). Daily rainfall depths were totaled over periods of 10, 15, and 30 days and evaluated at 13 probability levels, for historical series of 57 years of observation, from 1942 to 1999. The Kolmogorov-Smirnov test was applied to assess the goodness of fit of the distributions and to identify the most adequate model for each historical series. The probability models fit the rainy season best; the 3-parameter Log-normal distribution was the most adequate for the monthly series, and the Gamma distribution for the 15-day and 10-day periods.
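A model comparison of this kind can be sketched with SciPy on synthetic rainfall totals; the gamma parameters generating the data and the 75% probability level are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# synthetic 10-day rainfall totals (mm) for 57 years; shape/scale are illustrative
rain = rng.gamma(shape=2.0, scale=30.0, size=57)

# fit candidate distributions with location fixed at zero
gamma_params = stats.gamma.fit(rain, floc=0)
lognorm_params = stats.lognorm.fit(rain, floc=0)

# Kolmogorov-Smirnov goodness-of-fit for each candidate
ks_gamma = stats.kstest(rain, 'gamma', args=gamma_params)
ks_lognorm = stats.kstest(rain, 'lognorm', args=lognorm_params)
print(f"gamma:      D={ks_gamma.statistic:.3f}  p={ks_gamma.pvalue:.3f}")
print(f"log-normal: D={ks_lognorm.statistic:.3f}  p={ks_lognorm.pvalue:.3f}")

# "probable rainfall" at the 75% level: the depth exceeded with probability 0.75
p75 = stats.gamma.ppf(0.25, *gamma_params)
print(f"75% probable rainfall: {p75:.1f} mm")
```

Note that applying the KS test to data whose parameters were estimated from the same sample makes the test conservative; the study's comparison across candidate models follows the same pattern.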
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Probability output modeling for support vector machines
Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian
2007-11-01
In this paper we propose an approach to modeling the posterior probability output of multi-class SVMs. A sigmoid function is used to estimate the posterior probability output in binary classification. The posterior probability output of multi-class SVMs is then modeled by directly solving the equations that combine the probability outputs of the binary classifiers using Bayes's rule. The differences among the two-class SVM classifiers are taken into account by assigning them different weights, based on their posterior probabilities, when their probability outputs are combined. Comparative experiments show that our method achieves better classification precision and a better posterior probability distribution than the pairwise coupling method and Hastie's optimization method.
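The sigmoid fit used in the binary case is commonly known as Platt scaling; a minimal sketch follows, using plain gradient descent on synthetic decision values rather than Platt's regularized Newton procedure (all data and settings below are illustrative assumptions).

```python
import numpy as np

def platt_scale(f, y, lr=0.05, steps=20000):
    """Map SVM decision values f to probabilities via a fitted sigmoid
    P(y=1 | f) = 1 / (1 + exp(-(A*f + B))).
    (Platt's original formulation writes the exponent with the opposite sign;
    this sketch uses plain gradient descent rather than his Newton method.)"""
    A, B = 1.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(A * f + B)))
        g = p - y                  # gradient of the logistic log-loss
        A -= lr * np.mean(g * f)
        B -= lr * np.mean(g)
    return A, B

# toy decision values: positives cluster above zero, negatives below (illustrative)
rng = np.random.default_rng(1)
f = np.concatenate([rng.normal(1.0, 0.8, 200), rng.normal(-1.0, 0.8, 200)])
y = np.concatenate([np.ones(200), np.zeros(200)])
A, B = platt_scale(f, y)
prob_at_zero = 1.0 / (1.0 + np.exp(-(A * 0.0 + B)))
print(f"A={A:.2f}  B={B:.2f}  P(y=1 | f=0)={prob_at_zero:.2f}")
```

With roughly symmetric classes the fitted offset B is near zero, so a decision value of zero maps to a probability near one half, as expected of a calibrated output.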
The Probabilities of Unique Events
2012-08-30
compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...
Bacterial populations were examined in a simulated chloraminated drinking water distribution system (i.e. PVC pipe loop). After six months of continuous operation, coupons were incubated in CDC reactors receiving water from the simulated system to study biofilm development. The s...
Ghosh, Indranil
2011-01-01
Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and for Y given values of X, are available. We…
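When the two conditional families are compatible, the joint distribution can be recovered from them; a minimal sketch, using an assumed 2×3 joint only to generate the conditionals:

```python
import numpy as np

# assumed joint distribution, used only to generate the two conditional families
joint = np.array([[0.10, 0.25, 0.15],
                  [0.20, 0.05, 0.25]])
x_given_y = joint / joint.sum(axis=0)                  # P(X=x_i | Y=y_j), columns sum to 1
y_given_x = joint / joint.sum(axis=1, keepdims=True)   # P(Y=y_j | X=x_i), rows sum to 1

# key identity: P(x_i|y_j) / P(y_j|x_i) = P(x_i) / P(y_j),
# so fixing j = 0 gives the X-marginal up to a normalizing constant
ratio = x_given_y[:, 0] / y_given_x[:, 0]
p_x = ratio / ratio.sum()
reconstructed = y_given_x * p_x[:, None]               # P(x_i, y_j) = P(y_j|x_i) P(x_i)

print(np.allclose(reconstructed, joint))
```

The identity in the comment is just Bayes' rule rearranged; the sketch assumes every conditional probability used in the ratio is nonzero.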
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Pukajło, Katarzyna; Łaczmański, Łukasz; Kolackov, Katarzyna; Kuliczkowska-Płaksej, Justyna; Bolanowski, Marek; Milewicz, Andrzej; Daroszewski, Jacek
2015-01-01
Irisin (Ir), a recently identified adipo-myokine, cleaved and secreted from the protein FNDC5 in response to physical activity, has been postulated to induce the differentiation of a subset of white adipocytes into brown fat and to mediate the beneficial effects on metabolic homeostasis. Metabolic syndrome (MS), a cluster of factors leading to impaired energy homeostasis, affects a significant proportion of subjects suffering from polycystic ovary syndrome (PCOS). The aim of our study was to investigate the relationship between Ir plasma concentrations and metabolic disturbances. The study group consisted of 179 PCOS patients and a population of 122 healthy controls (both groups aged 25-35 years). A subset of 90 subjects with MS was isolated. A positive association between Ir plasma level and MS in the whole group and in controls was found. In subjects with high adipose body content (>40%), Ir was higher than in lean persons (<30%). Our results showed a significant positive association between Ir concentration and android type of adipose tissue in the whole study group and in the control group. Understanding the role of Ir in increased energy expenditure may lead to the development of new therapeutics for obesity and obesity-related diseases.
Energy Technology Data Exchange (ETDEWEB)
Dawson, G.; Kruski, A.W.; Scanu, A.M.
1976-01-01
Five glycosphingolipids (GSL), glucosylceramide, lactosylceramide, trihexosylceramide, globoside, and hematoside (GM3), were studied in serum from normal human subjects and patients with dyslipoproteinemia, and were found to be exclusively associated with the various classes of serum lipoproteins. Based on a unit weight of lipoprotein protein, the total amount of GSL in serum from normal subjects was twice as high in very low density lipoprotein (VLDL) (d < 1.006 g/ml) and low density lipoprotein (LDL) (d 1.019-1.063 g/ml) as in the high density lipoproteins HDL2 (d 1.063-1.125 g/ml) or HDL3 (d 1.125-1.21 g/ml). In abetalipoproteinemia the levels of serum GSL were slightly reduced compared to normal serum and were all found in the only existing lipoprotein, HDL; this contained 2-3 moles of GSL per mole of lipoprotein, compared to 0.5 in normal HDL. In hypobetalipoproteinemia and Tangier disease, the serum glycosphingolipids were reduced in concentration by 10 to 30%, compared to the 75% reduction in other lipids, and were again found to be associated only with the serum lipoproteins. The relative proportions of GSL did not vary substantially in the normo- and hypolipidemic subjects studied. Although these results establish that glycosphingolipids are intimately associated with serum lipoproteins, the mode of association and the structural and functional significance of such an association remain undetermined.
Directory of Open Access Journals (Sweden)
Patrik Felipe Nazario
2010-01-01
Full Text Available The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch, measured by the height of the navicular bone in a static situation, and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg, and 175 ± 4 cm) participated in the study. The Novel Emed-AT system was used to acquire plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load) at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982) was used to assess the height of the navicular bone and classify the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the midfoot region for all variables studied, with higher mean values in subjects with flat feet, and the differences in contact area, relative load, peak pressure, and mean pressure between groups were significant. The present study demonstrates the importance of paying attention to subjects with flat feet, because changes in plantar pressure distribution are associated with discomfort and injuries.
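The group comparison used in the study can be sketched with SciPy; the peak-pressure values below are invented illustrative numbers, not the study's measurements.

```python
import numpy as np
from scipy import stats

# hypothetical midfoot peak pressures (kPa) for flat-arch vs normal-arch subjects
flat = np.array([310.0, 285.0, 342.0, 298.0, 330.0, 315.0])
normal = np.array([210.0, 195.0, 240.0, 225.0, 205.0])

res = stats.mannwhitneyu(flat, normal, alternative='two-sided')
print(f"U = {res.statistic}, p = {res.pvalue:.4f}")
print("significant at p <= 0.05" if res.pvalue <= 0.05 else "not significant")
```

The Mann-Whitney U test is the natural choice here because the samples are small and no normality assumption is needed; with every flat-arch value exceeding every normal-arch value, the statistic reaches its maximum of n1·n2 = 30.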
Lourens, Aris; van Geer, F.C.
2016-01-01
In many fields of study, and certainly in hydrogeology, uncertainty propagation is a recurring subject. Usually, parametrized probability density functions (PDFs) are used to represent data uncertainty, which limits their use to particular distributions. Often, this problem is solved by Monte Carlo
Characterizations of the power distribution by record values
Indian Academy of Sciences (India)
…absolutely continuous distribution function F(x) with probability density function f(x) and F(0) = 0. Assume that X_n belongs to the class C ... Keywords: identically distributed; hazard rate; lower record values; theory of functional equations. 2010 Mathematics Subject Classification ... identically distributed, then X_k, k ≥ 1, has the exponential distribution. Also, one can find more…
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
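The maximum entropy assignment can be sketched in its simplest setting: a finite set of energy levels with a fixed mean energy as the only constraint, which yields a Boltzmann-type distribution. The energy levels and target mean below are arbitrary illustrative choices.

```python
import numpy as np

# illustrative energy levels and mean-energy constraint
E = np.array([0.0, 1.0, 2.0, 3.0])
E_mean = 1.2

def boltzmann(beta):
    """Maximum-entropy distribution with fixed <E> has the form p_i ∝ exp(-beta*E_i)."""
    w = np.exp(-beta * E)
    return w / w.sum()

# solve <E>(beta) = E_mean by bisection; <E> decreases monotonically in beta
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if boltzmann(mid) @ E > E_mean:
        lo = mid     # mean too high: need larger beta
    else:
        hi = mid
p = boltzmann(0.5 * (lo + hi))
print("p =", np.round(p, 4), " <E> =", round(float(p @ E), 4))
```

The exponential form drops out of maximizing entropy subject to normalization and the mean-energy constraint (the Lagrange multiplier of the latter is beta); only the numerical root-finding for beta is nontrivial.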
Difficulties related to Probabilities
Rosinger, Elemer Elad
2010-01-01
Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.
Indian Academy of Sciences (India)
casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...
Dynamic update with probabilities
Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld
2009-01-01
Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant
Elements of quantum probability
Kummerer, B.; Maassen, H.
1996-01-01
This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Directory of Open Access Journals (Sweden)
Mehmet Emin Çetin
2012-06-01
Full Text Available In this study, a three-dimensional model of the human hip joint was investigated using the finite element method. Finite element models were prepared for three different prosthesis types, namely Charnley, Muller, and Hipokrat, and for two different activities, walking and stair climbing. The commercial program ANSYS Workbench was used for the finite element analysis, applying a distributed load condition. The von Mises stresses and strains occurring on the cortical and trabecular layers of the bone, on the prosthesis, and on the bone cement used to fix the prosthesis in the bone's intramedullary canal were determined at the end of the finite element analysis and compared to each other.
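The von Mises equivalent stress reported by such analyses reduces, for principal stresses, to a closed-form expression; a minimal sketch (the stress values are illustrative, not results from the study):

```python
import math

def von_mises(s1, s2, s3):
    """Von Mises equivalent stress from the three principal stresses
    (same units in, same units out)."""
    return math.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))

# illustrative principal stresses in MPa
print(f"{von_mises(120.0, 40.0, -10.0):.1f} MPa")
```

A purely hydrostatic state (all three principal stresses equal) gives zero von Mises stress, which is why the measure is used as a yield criterion for ductile materials.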
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...
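Such integral-equation approximations can be cross-checked by brute-force Monte Carlo; the sketch below estimates the first-passage probability of a linear single-degree-of-freedom oscillator under white noise via Euler-Maruyama simulation (all parameter values are illustrative assumptions, not taken from the paper).

```python
import numpy as np

rng = np.random.default_rng(7)
omega, zeta, S0 = 2 * np.pi, 0.05, 1.0   # natural frequency, damping ratio, noise intensity
barrier, T, dt = 0.8, 10.0, 1e-3         # barrier level, horizon (s), time step (s)
n_paths, n_steps = 2000, int(T / dt)

x = np.zeros(n_paths)                     # displacement
v = np.zeros(n_paths)                     # velocity
crossed = np.zeros(n_paths, dtype=bool)

for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    # Euler-Maruyama for x'' + 2*zeta*omega*x' + omega^2*x = white noise
    x, v = (x + v * dt,
            v + (-2 * zeta * omega * v - omega**2 * x) * dt + np.sqrt(2 * np.pi * S0) * dW)
    crossed |= x > barrier

print(f"estimated first-passage probability in [0, {T}] s: {crossed.mean():.3f}")
```

The Monte Carlo estimate includes the start-up transient from rest, so it is not directly a stationary-state result; it simply bounds-checks any analytical first-passage approximation for the same system.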
Probability Analysis of a Quantum Computer
Einarsson, Göran
2003-01-01
The quantum computer algorithm by Peter Shor for factorization of integers is studied. The quantum nature of a quantum computer makes its outcome random. The output probability distribution is investigated, and the chances of a successful operation are determined.
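Part of that randomness is the classical choice of the base a; for a small modulus the fraction of "good" bases can be counted exactly. A sketch for N = 15, using the standard success condition of order-finding (the choice of N is illustrative):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a modulo n (a must be coprime to n)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_base_succeeds(a, n):
    """A base a yields a nontrivial factor when its order r is even
    and a^(r/2) is not congruent to -1 mod n."""
    r = order(a, n)
    return r % 2 == 0 and pow(a, r // 2, n) != n - 1

N = 15
bases = [a for a in range(2, N) if gcd(a, N) == 1]
good = [a for a in bases if shor_base_succeeds(a, N)]
print(f"good bases for N={N}: {good}  ({len(good)}/{len(bases)})")
```

For N = 15 only a = 14 fails (its order is 2 and 14 ≡ -1 mod 15), so a uniformly random coprime base already succeeds with high probability; the quantum part of the algorithm contributes the additional randomness of the order-finding measurement itself.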