Helles, Glennie; Fonseca, Rasmus
2009-01-01
residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first-order statistics. An accuracy comparable to that of secondary… local context-dependent dihedral angle propensities in coil regions. This predicted distribution can potentially improve tertiary structure prediction methods that are based on sampling the backbone dihedral angles of individual amino acids. The predicted distribution may also help predict local…
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
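The optimal guessing strategy described above can be sketched for a toy first-order model; the three-letter alphabet and its probabilities below are illustrative assumptions, not data from the paper:

```python
from itertools import product

# Hypothetical letter probabilities (first-order approximation: letters i.i.d.).
letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}
word_len = 3

# Probability of each word is the product of its letter probabilities.
words = []
for w in product(letter_probs, repeat=word_len):
    p = 1.0
    for ch in w:
        p *= letter_probs[ch]
    words.append(("".join(w), p))

# Optimal guessing: try words in decreasing order of probability.
words.sort(key=lambda wp: wp[1], reverse=True)

# Average number of guesses = sum over ranks of rank * P(word at that rank).
avg_guesses = sum(rank * p for rank, (_, p) in enumerate(words, start=1))
print(round(avg_guesses, 4))  # averages 9.248 guesses over 27 candidate words
```

Brute-force enumeration like this is only feasible for tiny alphabets and word lengths, which is exactly why the paper develops approximations for realistic sizes.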
Probability distributions of landslide volumes
M. T. Brunetti
2009-03-01
We examine 19 datasets with measurements of landslide volume, V_{L}, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^{−4} m^{3}≤V_{L}≤10^{13} m^{3}. We determine the probability density of landslide volumes, p(V_{L}), using kernel density estimation. Each landslide dataset exhibits heavy-tailed (self-similar) behaviour in its frequency-size distribution, p(V_{L}) as a function of V_{L}, for failures exceeding different threshold volumes, V_{L}*, for each dataset. These non-cumulative heavy-tailed distributions for each dataset are negative power-laws, with exponents 1.0≤β≤1.9, averaging β≈1.3. The scaling behaviour of V_{L} for the ensemble of the 19 datasets extends over 17 orders of magnitude, and is independent of lithological characteristics, morphological settings, triggering mechanisms, length of period and extent of the area covered by the datasets, presence or lack of water in the failed materials, and magnitude of gravitational fields. We argue that the statistics of landslide volume is conditioned primarily on the geometrical properties of the slope or rock mass where failures occur. Differences in the values of the scaling exponents reflect the primary landslide types, with rock falls exhibiting a smaller scaling exponent (1.1≤β≤1.4) than slides and soil slides (1.5≤β≤1.9). We argue that the difference is a consequence of the disparity in the mechanics of rock falls and slides.
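The power-law tail behaviour described above can be illustrated with a maximum-likelihood (Hill-type) estimate of the exponent β; the synthetic Pareto-tailed sample below stands in for a landslide catalogue and is an assumption, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic volumes with a power-law density p(V) ~ V^(-beta) above v_min,
# sampled by inverse-transform: V = v_min * (1 - u)^(-1/(beta - 1)).
beta = 1.3
v_min = 1.0
volumes = v_min * (1.0 - rng.uniform(size=5000)) ** (-1.0 / (beta - 1.0))

# Maximum-likelihood (Hill) estimate of the exponent for the tail above v_min.
beta_hat = 1.0 + len(volumes) / np.log(volumes / v_min).sum()
print(round(beta_hat, 2))  # close to the true beta = 1.3
```

With real catalogues the threshold V_{L}* must first be chosen so that only the self-similar tail enters the estimate; here the whole sample is power-law by construction.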
Probability distributions of landslide volumes
Brunetti, M. T.; Guzzetti, F.; Rossi, M.
2009-01-01
We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10−4 m3≤VL≤1013 m3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...
Pre-Aggregation with Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the… motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.
ASYMPTOTIC QUANTIZATION OF PROBABILITY DISTRIBUTIONS
Klaus Pötzelberger
2003-01-01
We give a brief introduction to results on the asymptotics of quantization errors. The topics discussed include the quantization dimension, asymptotic distributions of sets of prototypes, asymptotically optimal quantizations, approximations, and random quantizations.
The Multivariate Gaussian Probability Distribution
Ahrendt, Peter
2005-01-01
This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical…
Non-Gaussian Photon Probability Distribution
This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best-fit, if not exact, distribution is a modified Gamma (mΓ) distribution (whose parameters are α = r and β = r/√u) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact Pi, the probabilistic function, and the ability to interact Ai, the electromagnetic function. Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon Pi of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as a multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub-wavelength confinement, thereby providing a strong case that
Probability Distributions for a Surjective Unimodal Map
Hongyan SUN; Long WANG
1996-01-01
In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ-function, asymmetric, and symmetric, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.
Lagrangian Probability Distributions of Turbulent Flows
Friedrich, R.
2002-01-01
We outline a statistical theory of turbulence based on the Lagrangian formulation of fluid motion. We derive a hierarchy of evolution equations for Lagrangian N-point probability distributions as well as a functional equation for a suitably defined probability functional which is the analog of Hopf's functional equation. Furthermore, we address the derivation of a generalized Fokker-Planck equation for the joint velocity-position probability density of N fluid particles.
Asymmetry of the work probability distribution
Saha, Arnab; Bhattacharjee, J. K.
2006-01-01
We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.
The Pauli equation for probability distributions
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
The Pauli equation for probability distributions
Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man'ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man'ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it
2001-04-27
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
Learning a Probability Distribution Efficiently and Reliably
Laird, Philip; Gamble, Evan
1988-01-01
A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
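A minimal sketch of learning a distribution over a finite set to accuracy ε with confidence 1−δ via empirical frequencies; the DKW-style sample bound and the toy distribution below are standard stand-ins, not the paper's CDF-Inversion Algorithm itself:

```python
import math
import random

random.seed(1)

# Target accuracy and confidence for the learned distribution.
eps, delta = 0.05, 0.01

# Sample size from the Dvoretzky-Kiefer-Wolfowitz bound (an assumption
# about the kind of accuracy/confidence guarantee the paper provides).
m = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

# Hypothetical true distribution over a finite set.
true_dist = {"x": 0.5, "y": 0.3, "z": 0.2}
outcomes = list(true_dist)
weights = [true_dist[o] for o in outcomes]

# Draw m samples and learn the distribution as empirical frequencies.
sample = random.choices(outcomes, weights=weights, k=m)
learned = {o: sample.count(o) / m for o in outcomes}
print(m, learned)
```

The empirical estimate is itself a valid probability vector, and with m samples its per-outcome error is within ε with probability at least 1−δ.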
The Pauli Equation for Probability Distributions
Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.
2000-01-01
The "marginal" distributions for a measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for the spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.
The Pauli Equation for Probability Distributions
Mancini, S.; Man'ko, V. I.; Tombesi, P.
2001-01-01
The "marginal" distributions for a measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for the spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.
Five-Parameter Bivariate Probability Distribution
Tubbs, J.; Brewer, D.; Smith, O. W.
1986-01-01
This NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. They provide acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.
Qualitative criterion for atom sputtering angle distributions
A model is introduced to explain the shape of atom polar emission angle distributions for monocomponent targets sputtered by normally incident keV-energy ions. Analytical expressions obtained from the model make it possible to identify three known kinds of angle distributions - subcosinus, isotropic, and supracosinus - for given ion energies and target-ion pairs. Furthermore, a fourth, hybrid false-isotropic distribution is found, which is a superposition of supracosinus and subcosinus distributions. The theoretical predictions of the angle distribution shapes agree with numerical modeling of the sputtering of carbon and platinum by 0.1-10 keV Ar+ ions.
Proposal for Modified Damage Probability Distribution Functions
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on…
Probability distributions for offshore wind speeds
In planning offshore wind farms, short-term wind speeds play a central role in estimating various engineering parameters, such as power output, extreme wind load, and fatigue load. Lacking wind speed time series of sufficient length, the probability distribution of wind speed serves as the primary substitute for data when estimating design parameters. It is common practice to model short-term wind speeds with the Weibull distribution. Using 10-min wind speed time series at 178 ocean buoy stations ranging from 1 month to 20 years in duration, we show that the widely-accepted Weibull distribution provides a poor fit to the distribution of wind speeds when compared with more complicated models. We compare distributions in terms of three different metrics: probability plot R2, estimates of average turbine power output, and estimates of extreme wind speed. While the Weibull model generally gives larger R2 than any other 2-parameter distribution, the bimodal Weibull, Kappa, and Wakeby models all show R2 values significantly closer to 1 than the other distributions considered (including the Weibull), with the bimodal Weibull giving the best fits. The Kappa and Wakeby distributions fit the upper tail (higher wind speeds) of a sample better than the bimodal Weibull, but may drastically over-estimate the frequency of lower wind speeds. Because the average turbine power is controlled by high wind speeds, the Kappa and Wakeby estimate average turbine power output very well, with the Kappa giving the least bias and mean square error out of all the distributions. The 2-parameter Lognormal distribution performs best for estimating extreme wind speeds, but still gives estimates with significant error. The fact that different distributions excel under different applications motivates further research on model selection based upon the engineering parameter of interest.
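The common-practice 2-parameter Weibull fit that the study evaluates can be sketched with SciPy; the shape and scale values and the synthetic wind-speed sample below are illustrative assumptions, not buoy data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic 10-min wind speeds (m/s) from a Weibull with shape k=2, scale c=8.
speeds = stats.weibull_min.rvs(c=2.0, scale=8.0, size=20000, random_state=rng)

# Fit a 2-parameter Weibull (location fixed at 0), the baseline model
# the paper compares against bimodal Weibull, Kappa, and Wakeby fits.
k_hat, loc, c_hat = stats.weibull_min.fit(speeds, floc=0.0)

# Probability-plot R^2: how well ordered data follow the fitted quantiles.
(osm, osr), (slope, intercept, r) = stats.probplot(
    speeds, dist=stats.weibull_min, sparams=(k_hat, 0.0, c_hat))
print(round(k_hat, 2), round(c_hat, 2), round(r ** 2, 3))
```

On real buoy records the same R² metric is what exposes the Weibull's shortcomings relative to the more flexible 3- and 4-parameter families.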
Pre-aggregation for Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions and the… multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)… computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain…
Evolution of the jet opening angle distribution in holographic plasma
Rajagopal, Krishna; van der Schee, Wilke
2016-01-01
We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates; (ii) the opening angle distribution for jets emerging from the plasma within any specified range of energies has been pushed toward smaller angles, compared to pp jets with the same energies. The second effect arises because small-angle jets suffer less energy loss and because jets with a higher initial energy are less probable in the ensemble. We illustrate both effects in a simple two-parameter model, and find that their consequence in sum is that the opening angle distribution for jets in any range of energies contains fewer narrow and wide jets. Either effect can dominate in the mean opening angle, for not unreasonable values o...
Confidence intervals for the lognormal probability distribution
The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a constructed numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication.
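The anomaly discussed above can be made concrete: for a lognormal with large σ, the mean-minus-standard-deviation convention yields a negative lower bound for a strictly positive quantity, while a symmetric interval in log space stays positive. The μ and σ values below are illustrative assumptions:

```python
import math

# Parameters (mu, sigma) of the underlying normal; sigma is deliberately
# large to trigger the anomaly described in the abstract.
mu, sigma = 0.0, 1.5

median = math.exp(mu)                   # exp(mu)
mode = math.exp(mu - sigma ** 2)        # exp(mu - sigma^2)
mean = math.exp(mu + sigma ** 2 / 2.0)  # exp(mu + sigma^2/2)
std = mean * math.sqrt(math.exp(sigma ** 2) - 1.0)

# Symmetric 95% interval in log space maps to an asymmetric, strictly
# positive interval for the lognormal variable itself.
z = 1.959963984540054
lo, hi = math.exp(mu - z * sigma), math.exp(mu + z * sigma)

# "Best value +/- error" via mean and std gives a negative lower bound.
print(mean - std < 0, round(lo, 3), round(hi, 3))
```

Note also how far apart the mode, median, and mean sit for large σ, which is why each alternative "best value" has its own limitations.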
Joint probability distributions for projection probabilities of random orthonormal states
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)
Joint probability distributions for projection probabilities of random orthonormal states
Alonso, L.; Gorin, T.
2016-04-01
The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
Generating pseudo-random discrete probability distributions
Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica
2015-08-15
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
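Two standard constructions of a pseudo-random probability vector can be sketched as follows; the article's exact iid, normalization, and trigonometric methods are not reproduced here, so treat these as generic stand-ins:

```python
import math
import random

random.seed(7)
d = 4

# "Normalization" construction: draw d i.i.d. uniforms and divide by the
# sum. Simple, but it does NOT sample uniformly from the simplex.
u = [1.0 - random.random() for _ in range(d)]
p_norm = [x / sum(u) for x in u]

# i.i.d.-exponential construction: normalizing i.i.d. Exp(1) draws does
# give a uniform (flat Dirichlet) distribution on the probability simplex.
e = [-math.log(1.0 - random.random()) for _ in range(d)]
p_flat = [x / sum(e) for x in e]

for p in (p_norm, p_flat):
    # Both are valid probability vectors: nonnegative and summing to 1.
    assert all(x >= 0.0 for x in p) and abs(sum(p) - 1.0) < 1e-12
print(len(p_norm), round(sum(p_flat), 6))
```

The distinction matters for unbiasedness: only the second construction spreads probability vectors uniformly over the simplex, which is the property needed for unbiased random sampling of quantum states.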
Constraints on probability distributions of grammatical forms
Kostić Aleksandar
2007-01-01
In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm can be extended to other criteria as well; hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, as paradigms. We took the relative entropy as a measure of homogeneity of the probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian, and for each paradigm the relative entropy was calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75-0.9. The nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
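The relative-entropy measure can be sketched as Shannon entropy normalized by its maximum over a paradigm's form frequencies; the counts below are hypothetical, not Serbian corpus data:

```python
import math

# Hypothetical corpus counts for the six forms of one paradigm.
form_counts = [120, 80, 40, 30, 20, 10]
total = sum(form_counts)
probs = [c / total for c in form_counts]

# Relative entropy here = Shannon entropy / log2(N), so a value of 1.0
# means a perfectly homogeneous distribution over the N forms.
h = -sum(p * math.log2(p) for p in probs)
h_max = math.log2(len(probs))
relative_entropy = h / h_max
print(round(relative_entropy, 3))
```

Values near the 0.75-0.9 band reported above indicate fairly homogeneous, but not uniform, usage of the paradigm's forms.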
Supra-Bayesian Combination of Probability Distributions
Sečkárová, Vladimíra
Veszprém : University of Pannonia, 2010, s. 112-117. ISBN 978-615-5044-00-7. [11th International PhD Workshop on Systems and Control. Veszprém (HU), 01.09.2010-03.09.2010] R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Supra-Bayesian approach * sharing of probabilistic information * Bayesian decision making Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2010/AS/seckarova-supra-bayesian combination of probability distributions.pdf
Parametric probability distributions for anomalous change detection
Theiler, James P [Los Alamos National Laboratory; Foy, Bernard R [Los Alamos National Laboratory; Wohlberg, Brendt E [Los Alamos National Laboratory; Scovel, James C [Los Alamos National Laboratory
2010-01-01
The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.
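One member of the Gaussian class of detectors that the paper generalizes can be sketched as a Mahalanobis score over co-registered pixel pairs; the simulated "imagery" below is an assumption standing in for real image data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Pervasive differences: each pixel pair (x, y) is jointly Gaussian,
# with y mostly explained by x (illumination-like correlation).
n = 5000
x = rng.normal(size=n)
y = 0.9 * x + 0.3 * rng.normal(size=n)
data = np.stack([x, y], axis=1)

# Fit the joint Gaussian model of normal (pervasive) differences.
mean = data.mean(axis=0)
cov = np.cov(data, rowvar=False)
inv_cov = np.linalg.inv(cov)

# Score every pixel pair by its squared Mahalanobis distance.
centered = data - mean
scores = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)

# A genuinely changed pixel (y inconsistent with x) scores far above
# the bulk of pervasive differences.
anomaly = np.array([[0.0, 3.0]]) - mean
a_score = float(np.einsum("ij,jk,ik->i", anomaly, inv_cov, anomaly)[0])
print(a_score > np.percentile(scores, 99))
```

Replacing the Gaussian density with a heavier-tailed parametric family changes only the scoring function, which is the generalization the paper pursues.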
Probability Distribution for Flowing Interval Spacing
The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be
Audio feature extraction using probability distribution function
Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.
2015-05-01
Voice recognition has been one of the popular applications in the robotics field. It is also known to be used recently for biometric and multimedia information retrieval systems. This technology is attained from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes in complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses only the PDF as a feature extraction method itself for speech analysis purposes. Certain pre-processing techniques are performed prior to the proposed feature extraction method. Subsequently, the PDF result values for each frame of sampled voice signals obtained from a certain number of individuals are plotted. From the experimental results obtained, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
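The PDF-as-feature idea can be sketched with a per-frame normalized histogram; the white-noise signal, frame length, and bin choices below are stand-ins for real framed audio, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a sampled voice signal; real use would load and
# pre-process framed audio instead.
signal = rng.normal(size=4800)
frame_len = 480
frames = signal.reshape(-1, frame_len)

# Estimate a PDF per frame via a normalized amplitude histogram and use
# the per-bin density values directly as the feature vector.
bins = np.linspace(-4.0, 4.0, 33)  # 32 amplitude bins
features = np.stack([
    np.histogram(f, bins=bins, density=True)[0] for f in frames
])
print(features.shape)  # one 32-dimensional PDF feature per frame
```

Comparing these per-frame PDF shapes across speakers is the visual comparison the abstract describes.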
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
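The spinner calculation can be sketched by making the landing probability proportional to each side's arc width, a common modeling assumption (the article's own construction may differ):

```python
# A biased spinner modeled as arcs of different angular widths (degrees);
# the landing probability of a side is its share of the full 360 degrees.
arcs = {"A": 180.0, "B": 90.0, "C": 60.0, "D": 30.0}
probs = {side: angle / 360.0 for side, angle in arcs.items()}

# The widths must tile the circle, so the probabilities sum to 1.
assert abs(sum(probs.values()) - 1.0) < 1e-12
print(probs["A"], probs["D"])
```

The connection to other areas of mathematics comes out of exactly this step: computing arc lengths or angles turns a probability question into a geometry one.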
Evolution of the Jet Opening Angle Distribution in Holographic Plasma.
Rajagopal, Krishna; Sadofyev, Andrey V; van der Schee, Wilke
2016-05-27
We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates and (ii) because wide-angle jets lose more energy, energy loss combined with the steeply falling perturbative spectrum serves to filter wide jets out of the ensemble at any given energy. Even though every jet widens, jets with a given energy can have a smaller mean opening angle after passage through the plasma than jets with that energy would have had in vacuum, as experimental data may indicate. PMID:27284647
Dichotomous choice contingent valuation probability distributions
Kerr, Geoffrey N.
2000-01-01
Parametric distributions applied to dichotomous choice contingent valuation data invoke assumptions about the distribution of willingness to pay that may contravene economic theory. This article develops and applies distributions that allow the shape of bid distributions to vary. Alternative distributions provide little, if any, improvement in statistical fit from commonly used distributions. While median willingness to pay is largely invariant to distribution, estimates of mean consumer surp...
How to Read Probability Distributions as Statements about Process
Frank, Steven A.
2014-01-01
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...
Eliciting Subjective Probability Distributions with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2015-01-01
We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment.
Incorporating Skew into RMS Surface Roughness Probability Distribution
Stahl, Mark T.; Stahl, H. Philip.
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
Semi-stable distributions in free probability theory
Anonymous
2006-01-01
Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through free cumulant transform and develop the free semi-stability and domain of semi-stable attraction in free probability theory.
Probability distributions in risk management operations
Artikis, Constantinos
2015-01-01
This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...
Distribution of angles in hyperbolic lattices
Risager, Morten Skarsholm; Truelsen, Jimi Lee
2010-01-01
We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result from the study by Boca.
Distribution of Angles in Hyperbolic Lattices
Risager, Morten S.; Truelsen, Jimi L.
2008-01-01
We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result due to F. P. Boca.
Negative Binomial and Multinomial States: probability distributions and coherent states
Fu, Hong-Chen; Sasaki, Ryu
1996-01-01
Following the relationship between probability distributions and coherent states, for example the well-known one between the Poisson distribution and the ordinary coherent states, and the relatively less known one between the binomial distribution and the $su(2)$ coherent states, we propose an interpretation of $su(1,1)$ and $su(r,1)$ coherent states in terms of probability theory. They will be called the negative binomial (multinomial) states, which correspond to the negative binomial (multinomial)...
Some explicit expressions for the probability distribution of force magnitude
Saralees Nadarajah
2008-08-01
Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the first time, I provide explicit expressions for the probability distribution of the force magnitude. Both two-dimensional and three-dimensional cases are considered.
Using the analogy between Brownian motion and quantum mechanics, we study the winding angle θ of planar Brownian curves around a given point, say the origin O. In particular, we compute the characteristic function for the probability distribution of θ and recover Spitzer's law in the limit of infinitely large times. Finally, we study the (large) change in the winding angle distribution when we add a repulsive potential at the origin.
Net baryon number probability distribution near the chiral phase transition
Morita, Kenji; Skokov, Vladimir; Friman, Bengt; Redlich, Krzysztof
2014-01-01
We discuss the properties of the net baryon number probability distribution near the chiral phase transition to explore the effect of critical fluctuations. Our studies are performed within Landau theory, where the coefficients of the polynomial potential are parametrized, so as to reproduce the mean-field (MF), the Z(2) , and the O(4) scaling behaviors of the cumulants of the net baryon number. We show that in the critical region the structure of the probability distribution changes, dependi...
Information-theoretic methods for estimating complicated probability distributions
Zong, Zhi
2006-01-01
Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati
2014-06-01
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented along with two solution strategies: fuzzy transformation via a ranking function, and stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis. PMID:24885680
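The exact fitness distribution the paper derives via Krawtchouk polynomials can be cross-checked, for the Onemax case, by direct enumeration over the numbers of 1→0 and 0→1 flips. This is a minimal sketch of that check, not the authors' polynomial construction; the parameter values are illustrative:

```python
from math import comb

def binom_pmf(m, q, j):
    """P[Binomial(m, q) = j]."""
    return comb(m, j) * q**j * (1 - q) ** (m - j)

def onemax_offspring_dist(n, k, p):
    """Exact distribution of Onemax fitness after uniform bit-flip
    mutation (each bit flips independently with probability p),
    for a parent string with k ones out of n bits.
    Offspring fitness = k - (ones flipped off) + (zeros flipped on)."""
    dist = [0.0] * (n + 1)
    for x in range(k + 1):            # ones flipped to zeros: Binomial(k, p)
        for y in range(n - k + 1):    # zeros flipped to ones: Binomial(n-k, p)
            dist[k - x + y] += binom_pmf(k, p, x) * binom_pmf(n - k, p, y)
    return dist

dist = onemax_offspring_dist(n=10, k=7, p=0.1)
mean = sum(f * pr for f, pr in enumerate(dist))
# expected offspring fitness is k + p*(n - 2k) = 7 + 0.1*(10 - 14) = 6.6
```

Each entry of `dist` is, as the abstract states, a polynomial in the per-bit flip probability p.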
Most probable degree distribution at fixed structural entropy
Ginestra Bianconi
2008-06-01
The structural entropy is the entropy of the ensemble of uncorrelated networks with a given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (half-edges) randomly through the nodes of the network while keeping the structural entropy fixed. This degree distribution is found to decay as a Poisson distribution when the entropy is maximized and to have a power-law tail with an exponent approaching 2 when the entropy is minimized.
Amir, El-Ad David; Kalisman, Nir; Keasar, Chen
2008-07-01
Rotatable torsion angles are the major degrees of freedom in proteins. Adjacent angles are highly correlated and energy terms that rely on these correlations are intensively used in molecular modeling. However, the utility of torsion based terms is not yet fully exploited. Many of these terms do not capture the full scale of the correlations. Other terms, which rely on lookup tables, cannot be used in the context of force-driven algorithms because they are not fully differentiable. This study aims to extend the usability of torsion terms by presenting a set of high-dimensional and fully-differentiable energy terms that are derived from high-resolution structures. The set includes terms that describe backbone conformational probabilities and propensities, side-chain rotamer probabilities, and an elaborate term that couples all the torsion angles within the same residue. The terms are constructed by cubic spline interpolation with periodic boundary conditions that enable full differentiability and high computational efficiency. We show that the spline implementation does not compromise the accuracy of the original database statistics. We further show that the side-chain relevant terms are compatible with established rotamer probabilities. Despite their very local characteristics, the new terms are often able to identify native and native-like structures within decoy sets. Finally, force-based minimization of NMR structures with the new terms improves their torsion angle statistics with minor structural distortion (0.5 Å RMSD on average). The new terms are freely available in the MESHI molecular modeling package. The spline coefficients are also available as a documented MATLAB file. PMID:18186478
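The periodic cubic-spline construction described above can be sketched with SciPy. This is a generic illustration, not the MESHI implementation; the knot values here are a hypothetical stand-in for tabulated torsion statistics:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# hypothetical torsion-angle energy samples on a 30-degree grid
theta = np.linspace(0.0, 2.0 * np.pi, 13)   # knots over one full period
energy = np.sin(theta)                      # stand-in for database statistics
energy[-1] = energy[0]                      # periodic BC requires equal endpoints

# periodic boundary conditions give a term that is smooth (C2) across the
# 0/2*pi seam, so analytic gradients exist everywhere for force-based methods
spline = CubicSpline(theta, energy, bc_type="periodic")

value = float(spline(np.pi / 2))    # interpolated energy at an arbitrary angle
grad = float(spline(np.pi / 2, 1))  # analytic first derivative, usable as a force
```

The key design point mirrored here is differentiability: unlike a raw lookup table, the spline supplies exact first derivatives, which is what makes the terms usable in force-driven minimization.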
ProbOnto: ontology and knowledge base of probability distributions
Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala
2016-01-01
Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists nor any database allowing programmatic access. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
NORMALLY DISTRIBUTED PROBABILITY MEASURE ON THE METRIC SPACE OF NORMS
Á.G. HORVÁTH
2013-01-01
In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, first, we introduce the notion of thinness of a body. Then we show the existence of a measure with the property that its pushforward by the thinness function is a truncated normal probability measure. Finally, we improve this method to find a measure satisfying some important properties in geometric measure theory.
Uniform distribution of initial states: The physical basis of probability
Zhang Kechen
1990-02-01
For repetitive experiments performed on a deterministic system with initial states restricted to a certain region in phase space, the relative frequency of an event has a definite value insensitive to the preparation of the experiments only if the initial states leading to that event are distributed uniformly in the prescribed region. Mechanical models of coin tossing and roulette spinning and equal a priori probability hypothesis in statistical mechanics are considered in the light of this principle. Probabilities that have arisen from uniform distributions of initial states do not necessarily submit to Kolmogorov's axioms of probability. In the finite-dimensional case, a uniform distribution in phase space either in the coarse-grained sense or in the limit sense can be formulated in a unified way.
Gao, Haixia; Li, Ting; Xiao, Changming
2016-05-01
When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state, passing through a series of nonequilibrium states along the way. With the assistance of Bayesian statistics and the hyperensemble, a probability distribution over these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The equilibrium state has the largest probability, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability; the same conclusion also holds in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can persist, then the velocity at which a nonequilibrium state returns to equilibrium can be determined through the reciprocal of the derivative of this probability. It tells us that the farther the state is from equilibrium, the faster the returning velocity; as the system nears its equilibrium state, the velocity becomes smaller and smaller, finally tending to 0 when the equilibrium state is reached.
Assigning probability distributions to input parameters of performance assessment models
Mishra, Srikanta
2002-02-01
This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available
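One of the fitting techniques surveyed above, the method of moments, can be sketched for a gamma distribution by matching the sample mean and variance to kθ and kθ². This is a minimal illustration on synthetic data, not the Yucca Mountain data used in the report:

```python
import random
from statistics import mean, variance

def fit_gamma_moments(data):
    """Method-of-moments fit of a gamma(shape k, scale theta) distribution:
    solve  k*theta = sample mean,  k*theta**2 = sample variance."""
    m, v = mean(data), variance(data)
    shape = m * m / v
    scale = v / m
    return shape, scale

# synthetic data drawn from a known gamma(k=2, theta=3) for verification
random.seed(42)
data = [random.gammavariate(2.0, 3.0) for _ in range(20000)]
shape, scale = fit_gamma_moments(data)   # should recover roughly (2, 3)
```

Moment matching is the simplest of the fitting methods the report covers; maximum likelihood estimation would typically give tighter estimates at the cost of a numerical solve.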
Augmenting momentum resolution with well tuned probability distributions
Landi, Gregorio
2016-01-01
The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimations, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard (least-squares) fits using two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase in magnetic field and signal-to-noise ratio required for the standard fit reconstructions to overlap with those of our best distributions. The data and the simulations are tuned on the tracker of a running experiment and its double-sided microstrip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor 1.5 for the least squares based on the eta-a...
Probability distribution of extreme share returns in Malaysia
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data which will provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO) and Generalized Pareto (GPA), the Lognormal (GNO) and the Pearson (PE3) distributions are evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitted distribution to represent the weekly and monthly maximum share returns in Malaysia stock market during the studied period, respectively.
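The L-moment estimation step used above can be sketched via probability-weighted moments. This minimal sketch computes only the first two sample L-moments (location and scale); the paper goes on to fit six candidate distributions and compare them with goodness-of-fit tests:

```python
def sample_lmoments(data):
    """First two sample L-moments via unbiased probability-weighted moments.
    b0 is the sample mean; b1 weights the i-th order statistic by (i-1)/(n-1).
    Then l1 = b0 and l2 = 2*b1 - b0."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return b0, 2.0 * b1 - b0

# tiny worked example: for data {1, 2, 3, 4}, l1 = 2.5 and l2 = 5/6
l1, l2 = sample_lmoments([4.0, 2.0, 1.0, 3.0])
```

l2 is half the Gini mean difference, which is why L-moment methods are less sensitive to extreme observations than ordinary moments when fitting heavy-tailed return distributions.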
Modeling highway travel time distribution with conditional probability models
Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)
2014-01-01
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains a truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research focuses on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link conditional probability to estimate the expected distributions of route travel time. The major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
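The convolution step of the route-level method can be sketched for two links with discrete travel-time pmfs. This minimal sketch assumes independent links; the paper's actual contribution is to replace that independence assumption with link-to-link conditional probabilities that capture speed correlation. The pmf values below are hypothetical:

```python
def convolve_pmf(p, q):
    """Distribution of the sum of two independent discrete travel times,
    where index = travel time (e.g., minutes) and value = probability."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

# two hypothetical link travel-time pmfs over 0, 1, 2 minutes
link_a = [0.2, 0.5, 0.3]
link_b = [0.1, 0.6, 0.3]
route = convolve_pmf(link_a, link_b)   # route travel time over 0..4 minutes
```

The route mean equals the sum of the link means (1.1 + 1.2 = 2.3 minutes here), but the full distribution, not just the mean, is what reliability measures are read from.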
Probability Measure of Navigation Pattern Prediction Using Poisson Distribution Analysis
Dr. V. Valli Mayil
2012-06-01
The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web usage mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and its clients. The log data collected by web servers contain information about every click a user makes on the web documents of the site. This log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test web application performance. The analysis of large volumes of clickstream data demands the employment of data mining methods. Conducting data mining on web server logs involves determining frequently occurring access sequences. A Poisson distribution gives the frequency probability of specific events when the average probability of a single occurrence is known. The Poisson distribution is a discrete function which is used in this paper to find the probability that a particular page is visited by the user.
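The Poisson computation the paper relies on can be sketched directly from the pmf. The visit rate below is hypothetical, not taken from the paper's log data:

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Probability of exactly k visits to a page when the
    average visit rate over the observation window is lam."""
    return exp(-lam) * lam ** k / factorial(k)

# hypothetical: a page averages 3 visits per hour of log data
p_exactly_5 = poisson_pmf(3.0, 5)            # chance of exactly 5 visits
p_at_least_1 = 1.0 - poisson_pmf(3.0, 0)     # chance the page is visited at all
```

Ranking pages or session sequences by such probabilities is what lets the method single out the high-frequency access patterns used for performance testing.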
Probability distribution function for a solid with vacancies
Metlov, Leonid S.
2011-01-01
An expression for the probability distribution is obtained that takes into account the presence and removal of degeneracy of the microstates. Its application makes it possible to describe the melting of solids as a discontinuous (saltatory) phase transition of the first kind without invoking the concept of an order parameter.
Contact pressure distribution and support angle optimization of kiln tyre
Anonymous
2006-01-01
According to the shearing force characteristics and the deformation coordination condition of the shell at the supports, mathematical models were set up to calculate the contact angle and the contact pressure distribution between tyre and shell, and formulae for the bending moment and bending stress of the tyre were obtained. Taking the maximum tyre fatigue life as the objective, an optimization model for the tyre support angle was built. The computational results show that with a tyre support angle of 30°, tyre life is far shorter than with the optimal angle of 35.6°, so the traditional design stipulation of a 30° support angle is unsuitable. The larger the load, the smaller the increment in the nominal stress amplitude of the tyre, and the more favorable the tyre fatigue life, when the support angle is optimal.
Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)
1996-03-01
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
Lower and upper probabilities in the distributive lattice of subsystems
The set of subsystems Σ(m) of a finite quantum system Σ(n) (with variables in Z(n)), together with logical connectives, is a distributive lattice. With regard to this lattice, the quantity ℓ(m|ρ_n) = Tr[P(m) ρ_n] (where P(m) is the projector onto Σ(m)) obeys a supermodularity inequality, and it is interpreted as a lower probability in the sense of the Dempster–Shafer theory, not as a Kolmogorov probability. It is shown that the basic concepts of the Dempster–Shafer theory (lower and upper probabilities and the Dempster multivaluedness) are pertinent to the quantum formalism of finite systems.
Unitary equilibrations: probability distribution of the Loschmidt echo
Venuti, Lorenzo Campos
2009-01-01
Closed quantum systems evolve unitarily and therefore cannot converge in a strong sense to an equilibrium state starting out from a generic pure state. Nevertheless for large system size one observes temporal typicality. Namely, for the overwhelming majority of the time instants, the statistics of observables is practically indistinguishable from an effective equilibrium one. In this paper we consider the Loschmidt echo (LE) to study this sort of unitary equilibration after a quench. We draw several conclusions on general grounds and on the basis of an exactly-solvable example of a quasi-free system. In particular we focus on the whole probability distribution of observing a given value of the LE after waiting a long time. Depending on the interplay between the initial state and the quench Hamiltonian, we find different regimes reflecting different equilibration dynamics. When the perturbation is small and the system is away from criticality the probability distribution is Gaussian. However close to criticali...
Measurement of probability distributions for internal stresses in dislocated crystals
Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)
2014-11-03
Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results are illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
Probability Distribution Function Evolution for Binary Alloy Solidification
Steinzig, M.L.; Harlow, F.H.
1999-02-26
The thermally controlled solidification of a binary alloy, nucleated at isolated sites, is described by the evolution of a probability distribution function, whose variables include grain size and distance to nearest neighbor, together with descriptors of shape, orientation, and such material properties as orientation of nonisotropic elastic modulus and coefficient of thermal expansion. The relevant Liouville equation is described and coupled with global equations for energy and solute transport. Applications are discussed for problems concerning nucleation and impingement and the consequences for final size and size distribution. The goal of this analysis is to characterize the grain structure of the solidified casting and to enable the description of its probable response to thermal treatment, machining, and the imposition of mechanical insults.
Outage probability of distributed beamforming with co-channel interference
Yang, Liang
2012-03-01
In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.
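For a single Rayleigh-fading link without interference (a much simpler case than the DBF systems analysed in the letter), the received SNR is exponentially distributed and the outage probability has the textbook closed form P_out = 1 − exp(−γ_th/γ̄). A minimal sketch checking that closed form by Monte Carlo; all names and values are illustrative, not from the letter:

```python
import math
import random

random.seed(1)

avg_snr = 10.0   # mean received SNR (linear scale), illustrative
snr_th = 4.0     # outage threshold, illustrative

# Closed form: SNR ~ Exp(mean=avg_snr), so P(SNR < snr_th) = 1 - exp(-snr_th/avg_snr)
p_out_exact = 1.0 - math.exp(-snr_th / avg_snr)

# Monte-Carlo estimate: draw exponential SNRs and count outage events
n = 200_000
outages = sum(1 for _ in range(n) if random.expovariate(1.0 / avg_snr) < snr_th)
p_out_mc = outages / n
```

The letter's closed-form expressions for the full DBF systems generalize this single-link calculation to relay protocols with co-channel interference.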
Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions
Lancic, Alen; Antulov-Fantulin, Nino; Sikic, Mile; Stefancic, Hrvoje
2009-01-01
The disease spreading on complex networks is studied in SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on m-ary tree is investigated both analytically a...
Testing for the maximum cell probabilities in multinomial distributions
Xiong, Shifeng
2005-01-01
This paper investigates one-sided hypothesis testing for p[1], the largest cell probability of a multinomial distribution. A small-sample test of Ethier (1982) is extended to general cases. Based on an estimator of p[1], a class of large-sample tests is proposed. The asymptotic power of these tests under local alternatives is derived. An example is presented at the end of the paper.
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Varshney, Lav R.; GOYAL, VIVEK K.
2011-01-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bay...
Analytical theory of the probability distribution function of structure formation
Anderson, Johan; Kim, Eun-Jin
2009-01-01
The probability distribution function (PDF) tails of zonal flow structure formation and the PDF tails of momentum flux, incorporating the effect of a shear flow in ion-temperature-gradient (ITG) turbulence, are computed in the present paper. The bipolar vortex soliton (modon) is assumed to be the coherent structure responsible for bursty and intermittent events driving the PDF tails. It is found that stronger zonal flows are generated in ITG turbulence than in Hasegawa-Mima (HM) turbulence as w...
Estimating probable flaw distributions in PWR steam generator tubes
Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)
1997-02-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
Wide angle near-field diffraction and Wigner distribution
Almeida, J B
2003-01-01
Free-space propagation can be described as a shearing of the Wigner distribution function in the spatial coordinate; this shearing is linear in paraxial approximation but assumes a more complex shape for wide-angle propagation. Integration in the frequency domain allows the determination of near-field diffraction, leading to the well known Fresnel diffraction when small angles are considered and allowing exact prediction of wide-angle diffraction. The authors use this technique to demonstrate evanescent wave formation and diffraction elimination for very small apertures.
The latitude dependence and probability distribution of polar mesospheric turbulence
M. Rapp
2006-11-01
We consider in-situ observations and results from a global circulation model to study the latitude dependence and probability distribution of polar mesospheric turbulence. A comparison of summer observations at 69° N and 79° N shows that mesospheric turbulence weakens towards the summer pole. Furthermore, these data suggest that at both latitudes about 70% of all samples contain non-turbulent altitude bins in the considered altitude range between 70 and 95 km. The remaining 30% with detectable turbulence show an approximately log-normal distribution of dissipation rates. A low-resolution model version with a gravity wave (GW) parameterization explains the observed latitude dependence as a consequence of a downshift of the breaking levels towards the summer pole and an accompanying decay of turbulent heating per unit mass. When we do not use a GW parameterization but instead employ a high spatial resolution to simulate GW effects explicitly, the model predicts a similar latitudinal dependence with weakening turbulence towards the summer pole. In addition, the model produces a log-normal distribution of dissipation rates. The simulated probability distribution is narrower than in the observations, since the model resolves at most mid-frequency GWs, whereas real turbulence is also excited by smaller-scale disturbances. The GW-resolving simulation suggests a weaker tropospheric GW source at polar latitudes as the dominating mechanism for the latitudinal dependence.
The Probability Distribution Model of Wind Speed over East Malaysia
Nurulkamal Masseran
2013-07-01
Many studies have found that wind speed is the most significant parameter of wind power. An accurate determination of the probability distribution of wind speed is therefore an important step before estimating the wind energy potential over a particular region: utilizing an accurate distribution minimizes the uncertainty in wind resource estimates and improves the site assessment phase of planning. In general, different regions have different wind regimes, so different wind distributions can be expected for different regions of a particular country. Accordingly, nine different statistical distributions were fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia for the period from 2000 to 2009. The Kolmogorov-Smirnov statistic, Akaike's Information Criterion, the Bayesian Information Criterion, and the R² correlation coefficient were compared across the fitted distributions to determine the best fit for describing the observed data. The Gamma and Burr distributions provided a good fit for most of the stations in East Malaysia, though no clear pattern was observed across all regions. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.
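As a sketch of the fitting step such a study involves, the two-parameter Weibull (a common wind-speed model, though the paper favoured Gamma and Burr) can be fitted by maximum likelihood with the standard fixed-point iteration for the shape parameter, then ranked by AIC. The synthetic data and all names below are illustrative assumptions, not the paper's pipeline:

```python
import math
import random

random.seed(2)

# Synthetic stand-in for hourly wind speeds: Weibull(shape=2, scale=8) draws.
true_shape, true_scale = 2.0, 8.0
speeds = [random.weibullvariate(true_scale, true_shape) for _ in range(5000)]

def fit_weibull_mle(x, iters=100):
    """Shape via the standard fixed-point iteration, then scale in closed form."""
    n = len(x)
    mean_log = sum(math.log(v) for v in x) / n
    k = 1.0
    for _ in range(iters):
        sk = sum(v ** k for v in x)
        skl = sum(v ** k * math.log(v) for v in x)
        k = 1.0 / (skl / sk - mean_log)
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, lam

def weibull_aic(x, k, lam):
    """AIC = 2*params - 2*loglik, for ranking candidate distributions."""
    n = len(x)
    loglik = (n * (math.log(k) - k * math.log(lam))
              + (k - 1) * sum(math.log(v) for v in x)
              - sum((v / lam) ** k for v in x))
    return 2 * 2 - 2 * loglik

shape_hat, scale_hat = fit_weibull_mle(speeds)
aic = weibull_aic(speeds, shape_hat, scale_hat)
```

In a full comparison, the same AIC computation would be repeated for each candidate family and the smallest value preferred.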
Winding angle distributions for two-dimensional collapsing polymers
Narros, Arturo; Owczarek, Aleksander L.; Prellberg, Thomas
2016-01-01
We provide numerical support for a long-standing prediction of universal scaling of winding angle distributions. Simulations of interacting self-avoiding walks show that the winding angle distribution for N-step walks is compatible with the theoretical prediction of a Gaussian with a variance growing asymptotically as C log N, with C = 2 in the swollen phase (previously verified), and C = 24/7 at the θ-point. At low temperatures weaker evidence demonstrates compatibility with the same scaling and a value of C = 4 in the collapsed phase, also as theoretically predicted.
Mathematical simulation of gamma-radiation angle distribution measurements
We developed a mathematical model of the facility for gamma-radiation angle distribution measurement and calculated response functions for gamma-radiation intensities. We developed special software for experimental data processing, unfolding of the 'Shelter' object radiation spectra, and estimation of the angular resolution of the Sphere detector (ShD). A neural-network method for detecting radiation directions is presented. We developed software based on this neural-network algorithm that yields reliable distributions of the gamma sources affecting the facility detectors at the measurement point. 10 refs.; 15 figs.; 4 tab
Effect of Air Outlet Angle on Air Distribution Performance Index
Isbeyeh W. Maid
2013-05-01
In this paper a numerical study of velocity and temperature distribution in an air-conditioned space is presented. The computational model consists of non-isothermal 3-D turbulent flow with the (k-ε) model. The numerical study conducts air distribution in an air-conditioned room with real interior dimensions (6×4×3 m) to analyze the effect of changing the angle of the grille vanes on the flow pattern, velocity, and temperature distribution in the room under a set of different conditions, with a supply air temperature of 16°C, to examine the final result on the air distribution performance index (ADPI). The results show a significant effect of changing the supply air angle: the maximum ADPI is 52% when the air change per hour (ACH) is 10 at 16°C inlet temperature with the vanes angled 15° down, and the minimum ADPI is 20% when ACH is 15 at 16°C inlet temperature with a different vane angle.
Monsoonal differences and probability distribution of PM(10) concentration.
Md Yusof, Noor Faizah Fitri; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Sansuddin, Nurulilyana; Ghazali, Nurul Adyani; Al Madhoun, Wesam
2010-04-01
There are many factors that influence PM(10) concentration in the atmosphere. This paper examines PM(10) concentration in relation to the wet season (northeast monsoon) and dry season (southwest monsoon) in Seberang Perai, Malaysia from 2000 to 2004. PM(10) was expected to peak during the southwest monsoon, as the weather during this season is dry, and this study confirms that the highest PM(10) concentrations from 2000 to 2004 were recorded in this monsoon. Two probability distributions, Weibull and lognormal, were used to model the PM(10) concentration, and the best model for prediction was selected based on performance indicators. The lognormal distribution represents the data better than the Weibull distribution for 2000, 2001, and 2002; however, for 2003 and 2004, the Weibull distribution fits better than the lognormal. The proposed distributions were successfully used for estimating exceedances and predicting return periods for subsequent years. PMID:19365611
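The exceedance and return-period step can be sketched for the lognormal case: fit by taking moments of the log-data, then convert a threshold concentration into an exceedance probability and a mean return period. The synthetic data, the threshold value, and all names below are illustrative assumptions, not the paper's:

```python
import math
import random

random.seed(3)

# Synthetic stand-in for daily PM10 readings (micrograms per cubic metre).
data = [random.lognormvariate(4.0, 0.4) for _ in range(2000)]

# Lognormal MLE: mean and standard deviation of the log-data.
logs = [math.log(v) for v in data]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))

def lognorm_exceedance(c, mu, sigma):
    """P(X > c) for X ~ Lognormal(mu, sigma), via the normal tail (erfc)."""
    z = (math.log(c) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

threshold = 150.0                    # illustrative guideline concentration
p_exceed = lognorm_exceedance(threshold, mu, sigma)
return_period_days = 1.0 / p_exceed  # mean gap between exceedance days
```

The same two quantities computed under a fitted Weibull would complete the model comparison described in the abstract.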
The probability distribution associated with the dichotomic Markov process
We have calculated the probability distribution function for the dichotomic Markov process using the work of Pomraning [Linear kinetic theory and particle transport in stochastic mixtures. Singapore: World Scientific; 1991] and have studied the rate of convergence to this exact form as the number of terms in a series approximation is increased. Each term in the series involves an additional stochastic moment in the hierarchy of moments. It is observed that convergence is fast near the source but, as the distance from the source increases, more and more moments are required to obtain a specified accuracy
Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions
Lancic, Alen; Sikic, Mile; Stefancic, Hrvoje
2009-01-01
The disease spreading on complex networks is studied in SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on m-ary tree is investigated both analytically and in simulations. It is shown that the model reproduces qualitative features of phase diagrams of disease spreading observed in empirical complex networks. The role of tree-like structure of complex networks in disease spreading is discussed.
Brownian Motion on a Sphere: Distribution of Solid Angles
Krishna, M. M. G.; Samuel, Joseph; Sinha, Supurna
2000-01-01
We study the diffusion of Brownian particles on the surface of a sphere and compute the distribution of solid angles enclosed by the diffusing particles. This function describes the distribution of geometric phases in two state quantum systems (or polarised light) undergoing random evolution. Our results are also relevant to recent experiments which observe the Brownian motion of molecules on curved surfaces like micelles and biological membranes. Our theoretical analysis agrees well with the...
Bond-angle distribution functions in metallic glasses
Hafner, J.
1985-01-01
Bond-angle distribution functions have been calculated for realistic models of metallic glasses. They suggest a defected icosahedral short-range bond-orientational order and a close analogy of the short-range topological order in the amorphous and in the crystalline states.
Probability Distribution Function of the Upper Equatorial Pacific Current Speeds
Chu, P. C.
2008-12-01
The probability distribution function (PDF) of the upper (0-50 m) tropical Pacific current speeds (w), constructed from hourly ADCP data (1990-2007) at six stations of the TOGA-TAO project, satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations governing the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution for constant eddy viscosity K, namely the Rayleigh distribution (the simplest form of the Weibull distribution). Knowledge of the PDF of w during El Nino and La Nina events will improve ensemble horizontal flux calculations, which contribute to climate studies.
Maximum-entropy probability distributions under Lp-norm constraints
Dolinar, S.
1991-01-01
Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
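For the familiar p = 2 case the maximizer is the Gaussian: among all densities with a fixed second moment, the Gaussian attains the largest differential entropy. A quick closed-form check against two same-variance competitors (the competitor distributions are my choice for illustration, not taken from the report):

```python
import math

sigma = 1.0  # common standard deviation (fixed second moment / L2 norm)

# Closed-form differential entropies (in nats) of three unit-variance densities.
h_gaussian = 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)

b = sigma / math.sqrt(2.0)           # Laplace scale giving variance sigma^2
h_laplace = 1.0 + math.log(2.0 * b)

w = sigma * math.sqrt(12.0)          # uniform width giving variance sigma^2
h_uniform = math.log(w)
# Expected ordering: h_gaussian > h_laplace > h_uniform
```

The report's tabulations generalize this comparison to arbitrary p and to constrained value sets.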
Log-concave Probability Distributions: Theory and Statistical Testing
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
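One of the two implications being tested, an increasing hazard rate, can be illustrated numerically for a standard log-concave family. The example below uses a Weibull with shape ≥ 1 (my choice of family, not the paper's test statistic) and checks that its hazard rate f/S is non-decreasing on a grid:

```python
import math

# Weibull(shape k, scale lam) has a log-concave density for k >= 1,
# so its hazard rate f(x)/S(x) should be non-decreasing.
k, lam = 2.0, 1.5

def hazard(x):
    """Hazard rate pdf/survival; for Weibull it reduces to (k/lam)*(x/lam)**(k-1)."""
    pdf = (k / lam) * (x / lam) ** (k - 1) * math.exp(-((x / lam) ** k))
    sf = math.exp(-((x / lam) ** k))  # survival function
    return pdf / sf

grid = [0.1 * i for i in range(1, 40)]
rates = [hazard(x) for x in grid]
is_increasing = all(a <= b for a, b in zip(rates, rates[1:]))
```

The paper's tests work in the opposite direction: given only a sample, they probe whether such monotonicity is plausible via normalized spacings of the order statistics.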
Probability Distribution Function of Passive Scalars in Shell Models
Zhang, Xiao-Qiang; Wang, Guang-Rui; Chen, Shi-Gang
2008-04-01
A shell-model version of the passive scalar problem is introduced, inspired by the model of K. Ohkitani and M. Yakhot [K. Ohkitani and M. Yakhot, Phys. Rev. Lett. 60 (1988) 983; K. Ohkitani and M. Yakhot, Prog. Theor. Phys. 81 (1988) 329]. As in the original problem, the prescribed random velocity field is Gaussian and δ-correlated in time. The deterministic differential equations are regarded as a nonlinear Langevin equation. The Fokker-Planck equations for the PDF of passive scalars are then obtained and solved numerically. In the energy-input range (n < 5), the probability distribution function (PDF) of passive scalars is near the Gaussian distribution. In the inertial range (5 ≤ n ≤ 17), the PDF of passive scalars shows obvious intermittency, and the scaling exponents of the passive scalar are anomalous. The results of numerical simulations are compared with experimental measurements.
Daniel Ting
2010-04-01
Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
Diachronic changes in word probability distributions in daily press
Stanković Jelena
2006-01-01
Changes in probability distributions of individual words and word types were investigated in two samples of daily press spanning fifty years. One sample derived from the Corpus of Serbian Language (CSL) [Kostić, Đ., 2001], which covers the period between 1945 and 1957, and the other derived from the Ebart Media Documentation (EBR), compiled from seven daily newspapers and five weekly magazines from 2002 and 2003. Each sample consisted of about 1 million words. The obtained results indicate that nouns and adjectives were more frequent in the CSL sample, while verbs and prepositions are more frequent in the EBR sample, suggesting a decrease in sentence length over the last five decades. Conspicuous changes in the probability distributions of individual words were observed for nouns and adjectives, while minimal or no changes were observed for verbs and prepositions. This outcome suggests that nouns and adjectives are most susceptible to diachronic change, while verbs and prepositions appear resistant to such change.
Baer, P.; Mastrandrea, M.
2006-12-01
Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly
Non-Gaussian probability distributions of solar wind fluctuations
E. Marsch
The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation; they are enhanced for large, reduced for intermediate, and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scales τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.
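The signature described here, near-Gaussian differences at long lags and spiky, heavy-tailed differences at short lags, can be reproduced with a toy stochastic-volatility random walk. This is only my stand-in for a turbulent time series (all parameters and names are arbitrary, not fitted to Helios data); the flatness (kurtosis) of the increments is 3 for a Gaussian and larger for intermittent signals:

```python
import math
import random

random.seed(6)

# Toy intermittent signal: a random walk whose step volatility wanders slowly.
n = 60_000
log_vol, x, series = 0.0, 0.0, []
for _ in range(n):
    log_vol = 0.99 * log_vol + 0.07 * random.gauss(0.0, 1.0)  # slow AR(1) volatility
    x += math.exp(log_vol) * random.gauss(0.0, 1.0)
    series.append(x)

def increment_kurtosis(s, lag):
    """Flatness of the increments dx = s[t+lag] - s[t]; equals 3 for a Gaussian."""
    d = [s[i + lag] - s[i] for i in range(len(s) - lag)]
    m = sum(d) / len(d)
    m2 = sum((v - m) ** 2 for v in d) / len(d)
    m4 = sum((v - m) ** 4 for v in d) / len(d)
    return m4 / m2 ** 2

kurt_small = increment_kurtosis(series, 1)     # small scale: spiky, heavy tails
kurt_large = increment_kurtosis(series, 2000)  # large scale: nearly Gaussian
```

As in the Helios observations, summing many small-scale steps drives the long-lag increments toward Gaussianity while the short-lag increments retain the intermittent, heavy-tailed shape.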
Probability distribution function for reorientations in Maier-Saupe potential
Sitnitsky, A. E.
2016-06-01
Exact analytic solution for the probability distribution function of the non-inertial rotational diffusion equation, i.e., of the Smoluchowski one, in a symmetric Maier-Saupe uniaxial potential of mean torque is obtained via the confluent Heun's function. Both the ordinary Maier-Saupe potential and the double-well one with variable barrier width are considered. Thus, the present article substantially extends the scope of the potentials amenable to the treatment by reducing Smoluchowski equation to the confluent Heun's one. The solution is uniformly valid for any barrier height. We use it for the calculation of the mean first passage time. Also the higher eigenvalues for the relaxation decay modes in the case of ordinary Maier-Saupe potential are calculated. The results obtained are in full agreement with those of the approach developed by Coffey, Kalmykov, Déjardin and their coauthors in the whole range of barrier heights.
Gesture Recognition Based on the Probability Distribution of Arm Trajectories
Wan, Khairunizam; Sawada, Hideyuki
The use of human motions for interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures are used as a non-verbal medium for humans to communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories produced by a signer, motion features are selected and classified using a fuzzy technique. Experimental results show that features extracted from arm trajectories work effectively for the recognition of dynamic human gestures and give good performance in classifying various gesture patterns.
The Probability Distribution Function of Column Density in Molecular Clouds
Vázquez-Semadeni, Enrique; Garcia, Nieves
2001-01-01
We discuss the probability distribution function (PDF) of column density resulting from density fields with lognormal PDFs, applicable to molecular clouds. For magnetic and non-magnetic numerical simulations of compressible, isothermal turbulence, we show that the density autocorrelation function (ACF) decays over short distances compared to the simulation size. The density "events" along a line of sight can be assumed to be independent over distances larger than this, and the Central Limit Theorem should be applicable. However, using random realizations of lognormal fields, we show that the convergence to a Gaussian shape is extremely slow in the high-density tail, and thus the column density PDF is not expected to exhibit a unique functional shape, but to transit instead from a lognormal to a Gaussian form as the column length increases, with decreasing variance. For intermediate path lengths, the column density PDF assumes a nearly exponential decay. For cases with density contrasts of $10^4$, comparable t...
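The slow convergence the authors describe can be seen directly by summing independent lognormal density "events" along a line of sight, a minimal sketch of the paper's setup that ignores spatial correlations below the ACF decay length (all names and parameters are illustrative):

```python
import random

random.seed(7)

def skewness(xs):
    """Standardized third moment; 0 for a Gaussian."""
    m = sum(xs) / len(xs)
    m2 = sum((v - m) ** 2 for v in xs) / len(xs)
    m3 = sum((v - m) ** 3 for v in xs) / len(xs)
    return m3 / m2 ** 1.5

def column_density_sample(cells, draws=20_000):
    """Column density as a sum of `cells` independent lognormal density events."""
    return [sum(random.lognormvariate(0.0, 1.0) for _ in range(cells))
            for _ in range(draws)]

skew_short = skewness(column_density_sample(1))   # single cell: strongly skewed
skew_long = skewness(column_density_sample(64))   # CLT pushes toward Gaussian
```

The residual positive skewness at 64 cells reflects the paper's point: the high-density tail converges to Gaussian form far more slowly than the core of the distribution.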
Probability distributions for one component equations with multiplicative noise
Deutsch, J M
1993-01-01
Systems described by equations involving both multiplicative and additive noise are common in nature. Examples include convection of a passive scalar field, polymers in turbulent flow, and noise in dye lasers. In this paper the one-component version of this problem is studied. The steady-state probability distribution is classified into two different types of behavior: one class has power-law tails, and the other has the form of an exponential of a power law. The value of the power-law exponent is determined analytically for models having colored Gaussian noise, and is found to depend only on the power spectrum of the noise at zero frequency. When non-Gaussian noise is considered, it is shown that stretched-exponential tails are possible. An intuitive understanding of the results is developed, making use of the Lyapunov exponents for these systems.
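The one-component problem has a well-known discrete-time caricature, the Kesten recursion x ← ax + b, which develops power-law tails whenever E[ln a] < 0 but the multiplicative factor occasionally exceeds one. A minimal sketch with white (rather than colored) Gaussian noise, which is a simplification of the paper's setting:

```python
import math
import random

random.seed(8)

# Kesten-type recursion: contracting on average, occasionally expanding.
def kesten_samples(n, warmup=200):
    out = []
    x = 0.0
    for i in range(n + warmup):
        a = math.exp(random.gauss(-0.1, 0.5))  # multiplicative noise, E[ln a] < 0
        b = random.gauss(0.0, 1.0)             # additive noise
        x = a * x + b
        if i >= warmup:
            out.append(abs(x))
    return out

samples = sorted(kesten_samples(100_000))

# Heavy tails: extreme quantiles dwarf the median far more than for a Gaussian,
# whose q(0.999)/q(0.5) ratio for |x| is only about 4.9.
q50 = samples[len(samples) // 2]
q999 = samples[int(0.999 * len(samples))]
tail_ratio = q999 / q50
```

The tail exponent of the stationary distribution is fixed by the statistics of the multiplicative factor alone, mirroring the paper's result that only the zero-frequency noise spectrum matters.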
Seismic pulse propagation with constant Q and stable probability distributions
M. Tomirotti
1997-06-01
The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates between the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type) in the similarity variable, and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with an index of stability determined by the order of the fractional time derivative in the evolution equation.
Characterizing the Lyman-$\alpha$ forest flux probability distribution function using Legendre polynomials
Cieplak, Agnieszka M
2016-01-01
The Lyman-$\\alpha$ forest is a highly non-linear field with a lot of information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polyonomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, $n$-th coefficient can be expressed as a linear combination of the first $n$ moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities.
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Varshney, Lav R.; Goyal, Vivek K.
2012-09-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Goyal, Vivek K
2011-01-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.
Spin-Orbit angle distribution and the origin of (mis)aligned hot Jupiters
Crida, Aurélien
2014-01-01
For 61 transiting hot Jupiters, the projection of the angle between the orbital plane and the stellar equator (called the spin-orbit angle) has been measured. For about half of them, a significant misalignment is detected, and retrograde planets have been observed. This challenges scenarios for the formation of hot Jupiters. In order to better constrain formation models, we relate the distribution of the real spin-orbit angle $\Psi$ to the projected one $\beta$, making a direct comparison with the observations possible. We analyse the geometry of the problem to link analytically the projected angle $\beta$ to the real spin-orbit angle $\Psi$. The distribution of $\Psi$ expected in various models is taken from the literature, or derived with a simplified model and Monte-Carlo simulations in the case of the disk-torquing mechanism. A simple formula to compute the probability density function (PDF) of $\beta$ knowing the PDF of $\Psi$ is provided. All models tested here appear compatible with the observed distribution be...
Insights from probability distribution functions of intensity maps
Breysse, Patrick C; Behroozi, Peter S; Dai, Liang; Kamionkowski, Marc
2016-01-01
In the next few years, intensity-mapping surveys that target lines such as CO, Ly$\\alpha$, and CII stand to provide powerful probes of high-redshift astrophysics. However, these line emissions are highly non-Gaussian, and so the typical power-spectrum methods used to study these maps will leave out a significant amount of information. We propose a new statistic, the probability distribution of voxel intensities, which can access this extra information. Using a model of a CO intensity map at $z\\sim3$ as an example, we demonstrate that this voxel intensity distribution (VID) provides substantial constraining power beyond what is obtainable from the power spectrum alone. We find that a future survey similar to the planned COMAP Full experiment could constrain the CO luminosity function to order $\\sim10\\%$. We also explore the effects of contamination from continuum emission, interloper lines, and gravitational lensing on our constraints and find that the VID statistic retains significant constraining power even ...
Characterization of cast metals with probability distribution functions
Characterization of microstructure using a probability distribution function (PDF) provides a means for extracting useful information about material properties. In the extension of classical PDF methods developed in this research, material characteristics are evolved by propagating an initial PDF through time, using growth laws derived from consideration of heat flow and species diffusion, constrained by the Gibbs-Thomson law. A model is described here that allows for nucleation, followed by growth of nominally spherical grains according to a stable or unstable growth law. Results are presented for the final average grain size as a function of cooling rate for various nucleation parameters. In particular, the authors show that the model describes linear variation of final grain size with the inverse cube root of cooling rate. Within a subset of casting parameters, the stable-to-unstable transition manifests itself as a bimodal distribution of final grain size. Calculations with the model are described for the liquid to epsilon phase transition in a plutonium-1 weight percent gallium alloy
The Probability Distribution for Non-Gaussianity Estimators
Smith, Tristan L; Wandelt, Benjamin D
2011-01-01
One of the principal efforts in cosmic microwave background (CMB) research is measurement of the parameter fnl that quantifies the departure from Gaussianity in a large class of non-minimal inflationary (and other) models. Estimators for fnl are composed of a sum of products of the temperatures in three different pixels in the CMB map. Since the number ~Npix^2 of terms in this sum exceeds the number Npix of measurements, these ~Npix^2 terms cannot be statistically independent. Therefore, the central-limit theorem does not necessarily apply, and the probability distribution function (PDF) for the fnl estimator does not necessarily approach a Gaussian distribution for Npix >> 1. Although the variance of the estimators is known, the significance of a measurement of fnl depends on knowledge of the full shape of its PDF. Here we use Monte Carlo realizations of CMB maps to determine the PDF for two minimum-variance estimators: the standard estimator, constructed under the null hypothesis (fnl=0), and an improved e...
COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS
Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.; Næss, S. K.; Seljebotn, D. S. [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, N-0315 Oslo (Norway); Górski, K. M.; Huey, G.; Jewell, J. B.; Rocha, G.; Wehus, I. K., E-mail: eirik.gjerlow@astro.uio.no [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States)
2013-11-10
We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region on cosmological parameters is negligible for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
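The core identity above, that a chain of variables in which each depends only on its immediate predecessor has a joint distribution expressible through uni- and bivariate marginals, can be checked directly on a toy discrete Markov chain. The transition matrix and initial distribution below are invented for illustration:

```python
import numpy as np

# A three-state Markov chain: joint distribution of (x1, x2, x3)
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])    # transition matrix P[i, j] = P(next=j | cur=i)
pi0 = np.array([0.5, 0.3, 0.2])    # distribution of x1

# joint[i, j, k] = pi0[i] * P[i, j] * P[j, k]
joint = np.einsum('i,ij,jk->ijk', pi0, P, P)

# Bivariate and univariate marginals
p12 = joint.sum(axis=2)
p23 = joint.sum(axis=0)
p2 = joint.sum(axis=(0, 2))

# Chain identity: p(x1, x2, x3) = p(x1, x2) * p(x2, x3) / p(x2)
recon = np.einsum('ij,jk->ijk', p12, p23) / p2[None, :, None]
```

The reconstruction from marginals reproduces the full joint exactly, which is the finite-dimensional analogue of the factorization the paper exploits for banded likelihoods.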
Prediction of optimum section pitch angle distribution along wind turbine blades
Highlights: ► Prediction of optimum pitch angle along wind turbine blades. ► Maximum electrical power extraction at the installation site. ► Solving BEM equations with the probability distribution function of wind speed at an installation site. - Abstract: In this paper, the boost in electrical energy production of horizontal-axis wind turbines with fixed rotor speed is studied. To achieve this, a new innovative algorithm is proposed and justified to predict the distribution of section pitch angle along wind turbine blades that corresponds to the maximum power extraction at the installation site. A code is developed based on the blade element momentum theory which incorporates different corrections such as the tip loss correction. This aerodynamic code is capable of accurately predicting the aerodynamics of horizontal-axis wind turbines
Performance Probability Distributions for Sediment Control Best Management Practices
Ferrell, L.; Beighley, R.; Walsh, K.
2007-12-01
Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly-used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site, performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from greater than 85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no-treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with
Tools for Bramwell-Holdsworth-Pinton Probability Distribution
Mirela Danubianu; Tiberiu Socaciu; Ioan Maxim
2009-01-01
This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.
Ioannou, Ioanna; Loyka, Sergey
2011-01-01
Outage probability and capacity of a class of block-fading MIMO channels are considered with partial channel distribution information. Specifically, neither the channel nor its distribution is known, but the latter is known to belong to a class of distributions where each member is within a certain distance (uncertainty) from a nominal distribution. Relative entropy is used as a measure of distance between distributions. Compound outage probability, defined as min (over the transmit signal distribution) max (over the channel distribution class) outage probability, is introduced and investigated. This generalizes the standard outage probability to the case of partial channel distribution information. Compound outage probability characterization (via one-dimensional convex optimization), its properties and approximations are given. It is shown to have two-regime behavior: when the nominal outage probability decreases (e.g. by increasing the SNR), the compound outage first decreases linearly down to a certain threshol...
Tools for Bramwell-Holdsworth-Pinton Probability Distribution
Mirela Danubianu
2009-01-01
Full Text Available This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.
Investigation of Probability Distributions Using Dice Rolling Simulation
Lukac, Stanislav; Engel, Radovan
2010-01-01
Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…
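Such a simulation is straightforward in any environment with a random number generator. The sketch below is not from the article (which does not specify an implementation); it estimates the probability distribution of the sum of two fair dice:

```python
import random
from collections import Counter

def simulate_two_dice(n_rolls, seed=0):
    """Estimate the probability distribution of the sum of two fair dice
    by Monte Carlo simulation."""
    rng = random.Random(seed)
    counts = Counter(rng.randint(1, 6) + rng.randint(1, 6)
                     for _ in range(n_rolls))
    return {s: counts[s] / n_rolls for s in range(2, 13)}

dist = simulate_two_dice(100_000)
```

With enough rolls, the empirical frequencies approach the triangular theoretical distribution (P(7) = 6/36, P(2) = P(12) = 1/36), which is exactly the comparison such classroom simulations are meant to motivate.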
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio
2015-12-01
Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein only from its amino acid sequence remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. The results achieved suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space. PMID:26495908
Using projections and correlations to approximate probability distributions
Karlen, D A
1998-01-01
A method to approximate continuous multi-dimensional probability density functions (PDFs) using their projections and correlations is described. The method is particularly useful for event classification when estimates of systematic uncertainties are required and for the application of an unbinned maximum likelihood analysis when an analytic model is not available. A simple goodness of fit test of the approximation can be used, and simulated event samples that follow the approximate PDFs can be efficiently generated. The source code for a FORTRAN-77 implementation of this method is available.
Probability distributions for measures of placental shape and morphology
Birthweight at delivery is a standard cumulative measure of placental growth, but is a crude summary of other placental characteristics, such as the chorionic plate size, and the shape and position of the umbilical cord insertion. Distributions of such measures across a cohort reveal information about the developmental history of the chorionic plate which is unavailable from an analysis based solely on the mean and standard deviation. Various measures were determined from digitized images of chorionic plates obtained from the Pregnancy, Infection, and Nutrition study, a prospective cohort study of preterm birth in central North Carolina between 2002 and 2004. Centroids (geometric centers) and umbilical cord insertions were taken directly from the images. Chorionic plate outlines were obtained from an interpolation based on a Fourier series, while eccentricity (of the best-fit ellipse), skewness, and kurtosis were determined from the method of moments. Histograms of each variable were compared against the normal, lognormal, and Lévy distributions. Only a single measure (eccentricity) followed a normal distribution. All others followed lognormal or ‘heavy-tailed’ distributions for moderate to extreme deviations from the mean, where the relative likelihood far exceeded those of a normal distribution. (paper)
Beta-hypergeometric probability distribution on symmetric matrices
Hassairi, Abdelhamid; Masmoudi, Mouna
2012-01-01
Some remarkable properties of the beta distribution are based on relations involving independence between beta random variables such that a parameter of one among them is the sum of the parameters of another (see (1.1) and (1.2) below). Asci, Letac and Piccioni \cite{6} have used the real beta-hypergeometric distribution on $\mathbb{R}$ to give a general version of these properties without the condition on the parameters. In the present paper, we extend the properties of the real beta to the beta...
Scaling Properties of the Probability Distribution of Lattice Gribov Copies
Lokhov, A Y; Roiesnel, C
2005-01-01
We study the problem of the Landau gauge fixing in the case of the SU(2) lattice gauge theory. We show that the probability to find a lattice Gribov copy increases considerably when the physical size of the lattice exceeds some critical value $\\approx2.75/\\sqrt{\\sigma}$, almost independent of the lattice spacing. The impact of the choice of the copy on Green functions is presented. We confirm that the ghost propagator depends on the choice of the copy whereas the gluon propagator is insensitive to it (within present statistical errors). The gluonic three-point functions are also insensitive to it. Finally we show that gauge copies which have the same value of the minimisation functional ($\\int d^4x (A^a_\\mu)^2$) are equivalent, up to a global gauge transformation, and yield the same Green functions.
Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals
K R Parthasarathy
2007-11-01
By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.
Variation in the probability distribution function of passive contaminant concentration
Jurčáková, Klára; Riveron, F.
Praha : Ústav termomechaniky AV ČR v. v. i., 2009 - (Jonáš, P.; Uruba, V.), s. 19-20 ISBN 978-80-87012-21-5. [Colloquium Fluid Dynamics 2009. Praha (CZ), 21.10.2009-23.10.2009] Institutional research plan: CEZ:AV0Z20760514 Keywords : temporal variation of concentrations * atmospheric boundary layer * Weibull distribution Subject RIV: DG - Athmosphere Sciences, Meteorology
Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf
2016-04-01
In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
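The fit-then-test workflow described above can be sketched with SciPy in place of the Easyfit/Matlab tools the authors mention, and with synthetic data in place of the earthquake catalogue (both substitutions are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic positive-valued data standing in for catalogue-derived quantities
data = rng.weibull(1.5, size=500) * 10.0   # true shape 1.5, true scale 10

# Fit a two-parameter Weibull (location fixed at 0) by maximum likelihood
shape, loc, scale = stats.weibull_min.fit(data, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution
ks_stat, p_value = stats.kstest(data, 'weibull_min', args=(shape, loc, scale))
```

The same pattern applies to the Frechet (`invweibull`) and three-parameter Weibull (location free) candidates; comparing the resulting K-S statistics is the model-selection step the abstract describes. Note that testing against parameters estimated from the same data biases the K-S p-value upward, so it is best used comparatively.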
Disoriented Chiral Condensates, Pion Probability Distributions and Parallels with Disordered System
Mekjian, A. Z.
1999-01-01
A general expression is discussed for pion probability distributions coming from relativistic heavy ion collisions. The general expression contains as limits: 1) The disoriented chiral condensate (DCC), 2) the negative binomial distribution and Pearson type III distribution, 3) a binomial or Gaussian result, 4) and a Poisson distribution. This general expression approximates other distributions such as a signal to noise laser distribution. Similarities and differences of the DCC distribution ...
Calculation of ruin probabilities for a dense class of heavy tailed distributions
Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady
2015-01-01
any such distribution. We prove that formulas from renewal theory, and with a particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest such as the...... renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions and to this end we provide a calibration procedure which works for the approximation of...... distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite...
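For orientation, the classical Cramér-Lundberg model with exponential claims (a simple phase-type case, not the paper's Pareto example) admits a closed-form ruin probability; the sketch below uses that textbook identity, with illustrative parameter names:

```python
import math

def ruin_prob_exponential(u, theta, mu):
    """Ruin probability in the Cramer-Lundberg model with Exp(mean mu)
    claim sizes and relative safety loading theta > 0:
        psi(u) = 1/(1 + theta) * exp(-theta * u / ((1 + theta) * mu))
    """
    return math.exp(-theta * u / ((1.0 + theta) * mu)) / (1.0 + theta)

psi0 = ruin_prob_exponential(0.0, 0.25, 1.0)   # at u = 0: 1/(1 + theta)
```

A phase-type (or infinite-dimensional phase-type) approximation to a heavy-tailed claim distribution can be validated against exact benchmarks of this kind before being applied to cases like the Pareto claims studied in the paper.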
The probability distribution of the predicted CFM-induced ozone depletion. [Chlorofluoromethane
Ehhalt, D. H.; Chang, J. S.; Bulter, D. M.
1979-01-01
It is argued from the central limit theorem that the uncertainty in model-predicted changes of the ozone column density is best represented by a normal probability density distribution. This conclusion is validated by comparison with a probability distribution generated by a Monte Carlo technique. In the case of the CFM-induced ozone depletion, and based on the estimated uncertainties in the reaction rate coefficients alone, the relative mean standard deviation of this normal distribution is estimated to be 0.29.
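The Monte Carlo cross-check of the central-limit argument can be sketched as follows. The number of rate coefficients, their relative uncertainties, and the additive response model are all invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 1-sigma relative uncertainties of five rate coefficients
sigmas = [0.10, 0.15, 0.08, 0.12, 0.20]
n = 100_000

# Sample each (non-Gaussian, here uniform) contribution and sum them;
# the central limit theorem drives the total toward a normal distribution
samples = sum(rng.uniform(-np.sqrt(3.0) * s, np.sqrt(3.0) * s, n)
              for s in sigmas)   # uniform(-sqrt(3)s, sqrt(3)s) has std s

expected_std = float(np.sqrt(sum(s * s for s in sigmas)))  # ~0.305
mc_std = float(samples.std())
```

The Monte Carlo standard deviation matches the quadrature sum of the individual uncertainties, and a histogram of `samples` is close to the corresponding normal density, which is the comparison the abstract refers to.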
Zhang, Wen; Rui, Pinshu; Zhang, Ziyun; Liao, Yanlin
2016-02-01
We investigate the probabilistic quantum cloning (PQC) of two states with arbitrary probability distribution. The optimal success probabilities are worked out for 1→ 2 PQC of the two states. The results show that the upper bound on the success probabilities of PQC in Qiu (J Phys A 35:6931-6937, 2002) cannot be reached in general. With the optimal success probabilities, we design simple forms of 1→ 2 PQC and work out the unitary transformation needed in the PQC processes. The optimal success probabilities for 1→ 2 PQC are also generalized to the M→ N PQC case.
Average Consensus Analysis of Distributed Inference with Uncertain Markovian Transition Probability
Won Il Kim; Rong Xiong; Qiuguo Zhu; Jun Wu
2013-01-01
The average consensus problem of distributed inference in a wireless sensor network under Markovian communication topology of uncertain transition probability is studied. A sufficient condition for average consensus of linear distributed inference algorithm is presented. Based on linear matrix inequalities and numerical optimization, a design method of fast distributed inference is provided.
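Under a fixed (non-Markovian) topology, the linear consensus iteration that such algorithms build on reduces to repeated multiplication by a doubly stochastic weight matrix; the four-node path graph and weights below are an illustrative special case, not the paper's uncertain-transition setting:

```python
import numpy as np

# Doubly stochastic, symmetric weight matrix for a 4-node path graph
# (self-loops at the end nodes keep the chain aperiodic)
W = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
x = np.array([1.0, 3.0, 5.0, 7.0])   # initial local measurements
target = x.mean()                     # consensus value: the average

for _ in range(200):
    x = W @ x                         # each node averages with its neighbours
```

Because W is doubly stochastic, the average of x is preserved at every step, and because its second-largest eigenvalue magnitude is below 1, every node converges to that average. Randomly switching W according to a Markov chain, with uncertainty in the transition probabilities, is the generalization the paper analyzes.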
A measure of mutual divergence among a number of probability distributions
J. N. Kapur
1987-01-01
major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained for the mutual divergence among two or more probability distributions.
Noise figure and photon probability distribution in Coherent Anti-Stokes Raman Scattering (CARS)
Dimitropoulos, D.; Solli, D. R.; Claps, R.; Jalali, B.
2006-01-01
The noise figure and photon probability distribution are calculated for coherent anti-Stokes Raman scattering (CARS), where an anti-Stokes signal is converted to Stokes. We find that the minimum noise figure is ~3 dB.
Nakatsuka, Takao [Okayama Shoka University, Laboratory of Information Science, Okayama (Japan); Okei, Kazuhide [Kawasaki Medical School, Dept. of Information Sciences, Kurashiki (Japan); Iyono, Atsushi [Okayama university of Science, Dept. of Fundamental Science, Faculty of Science, Okayama (Japan); Bielajew, Alex F. [Univ. of Michigan, Dept. Nuclear Engineering and Radiological Sciences, Ann Arbor, MI (United States)
2015-12-15
Simultaneous distribution between the deflection angle and the lateral displacement of fast charged particles traversing through matter is derived by applying numerical inverse Fourier transforms on the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show the simultaneous Gaussian distribution at the region of both small deflection angle and lateral displacement, though they show the characteristic contour patterns of probability density specific to the single and the double scatterings at the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and the efficiency of experimental analyses and simulation studies relating to charged particle transports. (orig.)
The wind-blown sand saltating movement is mainly categorized into two mechanical processes: the interaction between the moving sand particles and the wind in the saltation layer, and the collisions of incident particles with the sand bed, the latter producing the lift-off velocity of a sand particle moving into saltation. In this Letter a methodology of phenomenological analysis is presented to obtain the probability density (distribution) function (pdf) of the lift-off velocity of sand particles from the sand bed based on the stochastic particle-bed collision. After the sand particles are modeled as uniform circular disks and a 2D collision between an incident particle and the granular bed is employed, we obtain analytical formulas for the lift-off velocity of ejected and rebound particles in saltation, which are functions of random parameters such as the angle and magnitude of the incident velocity of the impacting particles, the impact and contact angles between the colliding particles, and the creeping velocity of sand particles. By introducing the probability density functions (pdf's) of these parameters in communion with all possible patterns of the sand bed and all possible particle-bed collisions, and using the essential arithmetic of the pdf's of multi-dimensional random variables, the pdf's of lift-off velocities are deduced and expressed by the pdf's of the random parameters in the collisions. The numerical results for the distributions of lift-off velocities agree well with experimental ones
In this paper, we investigate an off-line strategy to incorporate inter-fraction organ motion in IMRT treatment planning. It was suggested that inverse planning could be based on a probability distribution of patient geometries instead of a single snapshot. However, this concept is connected to two intrinsic problems: first, this probability distribution has to be estimated from only a few images; and second, the distribution is only sparsely sampled over the treatment course due to the finite number of fractions. In the current work, we develop new concepts of inverse planning which account for these two problems
Mahanti, P.; Robinson, M. S.; Boyd, A. K.
2013-12-01
Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominate the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of slope probability distribution was not discernible at baselines smaller than a kilometer[2,3,5]. Accordingly, historical modeling of lunar surface slopes probability distributions for applications such as in scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution[1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon[7,8], slopes in lunar craters can now be obtained at baselines as low as 6-meters allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines[9]. In this work, we extend the analysis from a probability distribution modeling point-of-view with NAC DEMs to characterize the slope statistics for the floors and walls for the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was
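Slope statistics of the kind described start from a gradient computed on the gridded DEM. A minimal sketch, using a synthetic tilted plane in place of NAC data (the grid size and spacing are illustrative):

```python
import numpy as np

def slope_degrees(dem, spacing):
    """Slope map in degrees from a gridded DEM via central-difference
    gradients; spacing is the grid cell size in the same units as elevation."""
    dz_dy, dz_dx = np.gradient(dem, spacing)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Synthetic 2-m DEM: a plane rising 1 m of elevation per metre in x,
# i.e. a uniform 45-degree slope
spacing = 2.0
xx = np.arange(0.0, 40.0, spacing)    # 20 columns
dem = np.tile(xx, (20, 1))            # 20 x 20 grid
slopes = slope_degrees(dem, spacing)
```

On real NAC DEMs one would histogram `slopes` over crater-wall and crater-floor masks, repeating at resampled baselines, to build the empirical multi-scale slope probability distributions the abstract discusses.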
Solar proton pitch angle distribution for the January 24, 1969 event
Pitch angle distributions during the highly anisotropic phase of the event are fitted by a polynomial in the cosine of the pitch angle, μ, and the results are compared with the predictions of a Fokker-Planck equation in μ space for quasi-steady injection. Implications for the theory of the diffusion coefficient D(μ) are discussed. (orig.)
Dutta, S.; Chan, A. H.; Oh, C. H.
2012-08-01
This paper studies the multiplicity distribution of hadrons produced in p-p collisions at 0.9 and 2.36 TeV using ALICE as a detector. The multiplicity distribution exhibits enhanced void probability. It is also found to satisfy the void probability scaling. The scaling of χ with \bar{n}\bar{k}_2 is studied using the generalized hypergeometric model. The variation of the parameter "a" of the hypergeometric model with energy and type of events is also studied. The parameter "a" distinguishes between various theoretical models, e.g. Lorentz/Catalan, negative binomial, geometric distribution, etc. Finally a comparison is made with p-\bar{p} collisions at 200, 546 and 900 GeV. It is observed for both the p-p and p-\bar{p} data that the value of "a" decreases with increasing collision energy and approaches the upper bound, or the NB model, of the void probability scaling.
Xian-min Geng; Shu-chen Wan
2011-01-01
The compound negative binomial model, introduced in this paper, is a discrete time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of T, U(T - 1), |U(T)| and inf_{0≤n<T_1} U(n) (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin and the maximal deficit from ruin to recovery), and the distributions of some actuarial random vectors.
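The ruin quantities described above can be illustrated by simulation. The sketch below is not the paper's model: the claim distribution, premium and horizon are hypothetical stand-ins chosen only to show how a finite-horizon ruin probability of a discrete-time surplus process is estimated.

```python
import random

def ruin_probability(u, premium, claim, horizon, n_paths, seed=0):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    discrete-time surplus process U(n) = u + premium * n - S(n)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(horizon):
            surplus += premium - claim(rng)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

# Integer-valued claims with mean below the premium (positive safety loading)
claim = lambda rng: int(rng.expovariate(1.2))
p0 = ruin_probability(0, 1, claim, horizon=200, n_paths=4000)
p10 = ruin_probability(10, 1, claim, horizon=200, n_paths=4000)
```

As expected, a larger initial reserve u lowers the estimated ruin probability.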
Probability distributions for directed polymers in random media with correlated noise.
Chu, Sherry; Kardar, Mehran
2016-07-01
The probability distribution for the free energy of directed polymers in random media (DPRM) with uncorrelated noise in d=1+1 dimensions satisfies the Tracy-Widom distribution. We inquire if and how this universal distribution is modified in the presence of spatially correlated noise. The width of the distribution scales as the DPRM length to an exponent β, in good (but not full) agreement with previous renormalization group and numerical results. The scaled probability is well described by the Tracy-Widom form for uncorrelated noise, but becomes symmetric with increasing correlation exponent. We thus find a class of distributions that continuously interpolates between Tracy-Widom and Gaussian forms. PMID:27575059
Fitting the distribution of dry and wet spells with alternative probability models
Deni, Sayang Mohd; Jemain, Abdul Aziz
2009-06-01
The development of the rainfall occurrence model is greatly important not only for data-generation purposes, but also in providing informative resources for future advancements in water-related sectors, such as water resource management and the hydrological and agricultural sectors. Various kinds of probability models had been introduced to a sequence of dry (wet) days by previous researchers in the field. Based on the probability models developed previously, the present study is aimed to propose three types of mixture distributions, namely, the mixture of two log series distributions (LSD), the mixture of the log series and Poisson distributions (MLPD), and the mixture of the log series and geometric distributions (MLGD), as alternative probability models to describe the distribution of dry (wet) spells in daily rainfall events. In order to test the performance of the proposed new models against the other nine existing probability models, 54 data sets which had been published by several authors were reanalyzed in this study. Also, new data sets of daily observations from the six selected rainfall stations in Peninsular Malaysia for the period 1975-2004 were used. In determining the best fitting distribution to describe the observed distribution of dry (wet) spells, a Chi-square goodness-of-fit test was considered. The results revealed that the newly proposed MLGD and MLPD showed a better fit, as more than half of the data sets were successfully fitted for the distribution of dry and wet spells. However, the existing models, such as the truncated negative binomial and the modified LSD, were also among the successful probability models to represent the sequence of dry (wet) days in daily rainfall occurrence.
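The chi-square goodness-of-fit step above can be sketched for a single (non-mixture) geometric model of spell lengths; the synthetic data, bin count and parameter values below are illustrative and not taken from the study.

```python
import math
import random

def chi_square_geometric(spells, k_max=8):
    """Chi-square goodness-of-fit statistic for a geometric model of
    spell lengths on {1, 2, ...}; bins 1..k_max-1, tail pooled at k_max."""
    n = len(spells)
    p = n / sum(spells)                      # MLE of the success probability
    obs = [0] * (k_max + 1)
    for s in spells:
        obs[min(s, k_max)] += 1
    stat = 0.0
    for k in range(1, k_max + 1):
        pk = p * (1 - p) ** (k - 1) if k < k_max else (1 - p) ** (k_max - 1)
        expected = n * pk
        stat += (obs[k] - expected) ** 2 / expected
    return stat

rng = random.Random(1)
# Synthetic "dry spell" lengths from a geometric distribution with p = 0.4
spells = [1 + int(math.log(1.0 - rng.random()) / math.log(0.6))
          for _ in range(2000)]
stat = chi_square_geometric(spells)
# Compare against the 5% chi-square critical value with 8 - 2 = 6 df (12.59)
```

A mixture model would replace `pk` by a weighted sum of component probabilities, with the weights estimated (e.g. by EM) before the test.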
Diogo de Carvalho Bezerra
2015-12-01
Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.
Probability distribution of the order parameter in the directed percolation universality class.
Martins, P H L
2012-04-01
The probability distributions of the order parameter for two models in the directed percolation universality class were evaluated. Monte Carlo simulations have been performed for the one-dimensional generalized contact process and the Domany-Kinzel cellular automaton. In both cases, the density of active sites was chosen as the order parameter. The criticality of those models was obtained by solely using the corresponding probability distribution function. It has been shown that the present method, which has been successfully employed in treating equilibrium systems, is indeed also useful in the study of nonequilibrium phase transitions. PMID:22680423
Rank-Ordered Multifractal Analysis of Probability Distributions in Fluid Turbulence
Wu, Cheng-Chin; Chang, Tien
2015-11-01
Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a refined method of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.
Runov, A.; Angelopoulos, V.; Gabrielse, C.; Zhou, X.-Z.; Turner, D.; Plaschke, F.
2013-02-01
Taking advantage of multipoint observations from a Cluster-like Time History of Events and Macroscale Interactions during Substorms (THEMIS) probe configuration repeated in three events, we study pitch-angle distributions (PAD) of lower energy (0.2-keV) electrons and omnidirectional energy-time spectrograms of higher energy (30-500 keV) electrons observed at and near dipolarization fronts in the plasma sheet. Recent observations have shown that dipolarization fronts in the plasma sheet provide an impulsive electric field suggested to cause electron energization and dispersionless injections. Increase and decrease in energetic electron flux are equally probable at the fronts, however. Our case studies demonstrate increased energetic electron flux in the front's central region but decreased flux on its dusk side, where diverted plasma flow forms a vortex. An electric field associated with this vortex causes the electron flux decrease. We also find that shorter-term energetic flux decreases, often observed before injections, coincide with a dip in the northward magnetic field ahead of the front. We attribute these decreases to particle energy loss via the inverse betatron effect. Our case studies reveal that pancake-type (maximum at 90° pitch angle) and cigar-type (maxima at 0 and 180°) PADs coexist at the same front. Our data analysis suggests that energetic electron PADs are mainly pancake type near the neutral sheet (|Bx| < 10 nT) and cigar type at |Bx| > 10 nT. These results, to be confirmed in statistical studies, provide important constraints for further modeling of electron energization and transport toward the inner magnetosphere.
Xu, Y.; Tan, L.; Cao, S. L.; Wang, Y. C.; Meng, G.; Qu, W. S.
2015-01-01
The influence of blade angle distribution along leading edge on cavitation performance of centrifugal pumps is analysed in the present paper. Three sets of blade angle distribution along leading edge for three blade inlet angles are chosen to design nine centrifugal pump impellers. The RNG k-epsilon turbulence model and the Zwart-Gerber-Belamri cavitation model are employed to simulate the cavitation flows in centrifugal pumps with different impellers and the same volute. The numerical results are compared with the experimental data, and the comparison proves that the numerical simulation can accurately predict the cavitation performance of centrifugal pumps. On the basis of the numerical simulations, the pump head variations with pump inlet pressure, and the flow details in centrifugal pump are revealed to demonstrate the influence of blade angle distribution along leading edge on cavitation performances of centrifugal pumps.
Importance measures for imprecise probability distributions and their sparse grid solutions
WANG; Pan; LU; ZhenZhou; CHENG; Lei
2013-01-01
For the imprecise probability distribution of a structural system, the variance based importance measures (IMs) of the inputs are investigated, and three IMs are defined for the cases of random distribution parameters, interval distribution parameters and a mixture of those two types of distribution parameters. The defined IMs can reflect the influence of the inputs on the output of the structural system with imprecise distribution parameters. Due to the large computational cost of the variance based IMs, the sparse grid method is employed in this work to compute the variance based IMs at each reference point of the distribution parameters. For the three imprecise distribution parameter cases, the sparse grid method and the combination of the sparse grid method with a genetic algorithm are used to compute the defined IMs. Numerical and engineering examples are employed to demonstrate the rationality of the defined IMs and the efficiency of the applied methods.
Analysis of low probability of intercept (LPI) radar signals using the Wigner Distribution
Gau, Jen-Yu
2002-01-01
Approved for public release, distribution is unlimited. The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code,...
The probability distribution function for the sum of squares of independent random variables
Fateev, Yury; Dmitriev, Dmitry; Tyapkin, Valery; Kremez, Nikolai; Shaidurov, Vladimir
2016-08-01
In the present paper, the probability distribution function is derived for the sum of squares of random variables for nonzero expectations. This distribution function enables one to develop an efficient one-step algorithm for phase ambiguity resolution when determining the spatial orientation from signals of satellite radio-navigation systems. Threshold values for rejecting false solutions and statistical properties of the algorithm are obtained.
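The distribution in question is the noncentral chi-square, whose first two moments give a quick Monte Carlo sanity check; the expectations used below are purely illustrative.

```python
import random

def sum_of_squares_samples(mus, n, seed=0):
    """Draw n samples of sum_i X_i^2 with X_i ~ N(mu_i, 1) independent;
    this sum follows a noncentral chi-square distribution."""
    rng = random.Random(seed)
    return [sum(rng.gauss(m, 1.0) ** 2 for m in mus) for _ in range(n)]

mus = [1.0, 2.0, 0.5]                   # illustrative nonzero expectations
k, lam = len(mus), sum(m * m for m in mus)
samples = sum_of_squares_samples(mus, 20_000)
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Noncentral chi-square theory: E = k + lam, Var = 2 * (k + 2 * lam)
```

In the phase-ambiguity application, a threshold on this statistic separates correct from false candidate solutions; the tail quantiles of the same distribution supply the rejection threshold.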
Various Models for Pion Probability Distributions from Heavy-Ion Collisions
Mekjian, A. Z.; Schlei, B. R.; Strottman, D.
1998-01-01
Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparison of the pion laser model and Bos...
The exact probability distribution of the rank product statistics for replicated experiments
Eisinga, R.N.; Breitling, R.; Heskes, T.M.
2013-01-01
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product ...
Schürmann, Thomas
2015-01-01
We consider the nonparametric estimation problem of continuous probability distribution functions. For the integrated mean square error we provide the statistic corresponding to the best invariant estimator proposed by Aggarwal (1955) and Ferguson (1967). The table of critical values is computed and a numerical power comparison of the statistic with the traditional Cramér-von Mises statistic is done for several representative distributions.
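The Cramér-von Mises statistic for a fully specified continuous CDF can be computed directly from its order-statistic form; this sketch (standard normal null hypothesis, illustrative sample sizes) is the traditional statistic, not the invariant-estimator version studied in the paper.

```python
import math
import random

def cramer_von_mises(sample, cdf):
    """Cramer-von Mises statistic W^2 for a fully specified continuous CDF."""
    xs = sorted(sample)
    n = len(xs)
    return 1.0 / (12 * n) + sum(
        (cdf(x) - (2 * i - 1) / (2 * n)) ** 2 for i, x in enumerate(xs, 1))

def std_normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

rng = random.Random(7)
w2_null = cramer_von_mises([rng.gauss(0, 1) for _ in range(500)], std_normal_cdf)
w2_shift = cramer_von_mises([rng.gauss(3, 1) for _ in range(500)], std_normal_cdf)
# Under the null W^2 has mean 1/6; the 5% critical value is about 0.461
```

A sample drawn from the hypothesized distribution yields a small statistic, while a shifted sample is cleanly rejected.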
Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.
2009-01-01
Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t
Measurements of gas hydrate formation probability distributions on a quasi-free water droplet
Maeda, Nobuo
2014-06-01
A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA, gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. Ideally, one would measure the gas hydrate formation probability distributions of a single water droplet or mist freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, and wall-wetting hydrophobic liquid to avoid contact of the water droplet with the solid walls. Here we report the development of a second generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet sitting on a perfluorocarbon oil in a container coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.
Calculation of the Multivariate Probability Distribution Function Values and their Gradient Vectors
Szantai, T.
1987-01-01
The described collection of subroutines, developed for the calculation of values of multivariate normal, Dirichlet and gamma distribution functions and their gradient vectors, is a unique tool that can be used, e.g., to compute the Loss-of-Load Probability of electric networks and to solve optimization problems with a reliability constraint.
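For low dimensions, such multivariate distribution function values can be cross-checked by plain Monte Carlo; the sketch below estimates a bivariate normal orthant probability, for which a closed form exists, rather than using the paper's subroutines.

```python
import math
import random

def bvn_cdf_at_origin(rho, n, seed=0):
    """Monte Carlo estimate of P(X <= 0, Y <= 0) for a standard bivariate
    normal with correlation rho, built via a Cholesky construction."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x = z1
        y = rho * z1 + math.sqrt(1 - rho * rho) * z2
        if x <= 0 and y <= 0:
            hits += 1
    return hits / n

rho = 0.5
exact = 0.25 + math.asin(rho) / (2 * math.pi)  # closed-form orthant probability
est = bvn_cdf_at_origin(rho, 100_000)
```

Dedicated quadrature routines such as those described above are far more accurate per unit cost, but the Monte Carlo estimate is a convenient independent check.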
LU Wei-ji; CUI Wei
2001-01-01
In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.
Criticality of the net-baryon number probability distribution at finite density
Kenji Morita; Bengt Friman; Krzysztof Redlich
2014-01-01
We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\\mu/T
Criticality of the net-baryon number probability distribution at finite density
Morita, Kenji; Friman, Bengt; Redlich, Krzysztof
2015-01-01
We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ , at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T
Huemmrich, Karl F.
2013-01-01
The leaf inclination angle distribution (LAD) is an important characteristic of vegetation canopy structure affecting light interception within the canopy. However, LADs are difficult and time consuming to measure. To examine possible global patterns of LAD and their implications in remote sensing, a model was developed to predict leaf angles within canopies. Canopies were simulated using the SAIL radiative transfer model combined with a simple photosynthesis model. This model calculated leaf inclination angles for horizontal layers of leaves within the canopy by choosing the leaf inclination angle that maximized production over a day in each layer. LADs were calculated for five latitude bands for spring and summer solar declinations. Three distinct LAD types emerged: tropical, boreal, and an intermediate temperate distribution. In the tropical LAD, the upper layers have a leaf angle around 35° with the lower layers having horizontal inclination angles, while the boreal LAD has vertical leaf inclination angles throughout the canopy. The latitude bands where each LAD type occurred changed with the seasons. The different LADs affected the fraction of absorbed photosynthetically active radiation (fAPAR) and Normalized Difference Vegetation Index (NDVI), with similar relationships between fAPAR and leaf area index (LAI), but different relationships between NDVI and LAI for the different LAD types. These differences resulted in significantly different relationships between NDVI and fAPAR for each LAD type. Since leaf inclination angles affect light interception, variations in LAD also affect the estimation of leaf area based on transmittance of light or lidar returns.
A. B. Levina
2016-03-01
Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow detecting such errors. There are two classes of error detecting codes: classical codes and security-oriented codes. The classical codes detect a high percentage of errors; however, they have a high probability of missing an error under algebraic manipulation. In turn, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. In turn, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with less computational complexity and a low probability of masking are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and average value of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach how to measure the error masking
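The error masking probability can be computed by brute force on a toy example; the single-parity-check code below is illustrative (not from the paper) and exhibits the all-or-nothing masking behavior of classical linear codes that security-oriented codes are designed to avoid.

```python
from itertools import product

# Toy example: the [4,3] single-parity-check code (all even-weight words)
code = {w for w in product((0, 1), repeat=4) if sum(w) % 2 == 0}

def masking_prob(error):
    """Fraction of codewords for which the injected error is undetected,
    i.e. codeword XOR error is again a codeword."""
    masked = sum(1 for c in code
                 if tuple(a ^ b for a, b in zip(c, error)) in code)
    return masked / len(code)

# For a linear code, a fixed error is masked either for every codeword
# or for none, which is exactly the weakness discussed above.
q_zero = masking_prob((0, 0, 0, 0))   # no error: trivially "masked"
q_one = masking_prob((1, 0, 0, 0))    # odd-weight error: always detected
q_two = masking_prob((1, 1, 0, 0))    # even-weight error: never detected
```

Security-oriented (e.g. nonlinear robust) codes instead make `masking_prob` strictly less than one for every nonzero error, at the cost of a more complex encoding function.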
Yura, Harold; Hanson, Steen Grüner
2012-01-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
The pitch angle distributions of a mesh type electron gun for electron transport experiments in stellarators are estimated with the electron re-entry effect, and a measurement method for the pitch angle distribution by means of a local mirror field is proposed. It is found that the electron re-entry effect is significant for the design of electron guns for electron transport experiments in stellarators, because high pitch angle electrons which re-enter the gun are launched again with a low pitch angle. A compensation method for an error field on the quasi-helically symmetric stellarator HSX is also proposed. It is found that the additional toroidal mirror modes [n,m] = [3,0], [4,0] can eliminate a dangerous error field mode [-1, -1] like the Earth's field. Here, n and m are toroidal and poloidal modes. (author)
冉洪流
2004-01-01
In recent years, researchers have studied the paleoearthquakes along the Haiyuan fault and revealed many paleoearthquake events. All available information allows a more reliable analysis of earthquake recurrence intervals and earthquake rupture patterns along the Haiyuan fault. Based on this paleoseismological information, the recurrence probability and magnitude distribution for M≥6.7 earthquakes in the next 100 years along the Haiyuan fault can be obtained through weighted computation by using Poisson and Brownian passage time models and considering different rupture patterns. The result shows that the recurrence probability of MS≥6.7 earthquakes is about 0.035 in the next 100 years along the Haiyuan fault.
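The two recurrence models can be sketched side by side: the Brownian passage time distribution is the inverse Gaussian, whose CDF has a closed form, and the Poisson model is memoryless. All numerical values below (mean recurrence, aperiodicity, elapsed time) are hypothetical, not the Haiyuan fault estimates.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian passage time (inverse Gaussian) distribution
    with mean recurrence interval mu and aperiodicity alpha."""
    lam = mu / alpha ** 2
    a = math.sqrt(lam / t)
    return phi(a * (t / mu - 1)) + math.exp(2 * lam / mu) * phi(-a * (t / mu + 1))

def conditional_prob(elapsed, window, mu, alpha):
    """P(event in (elapsed, elapsed + window] | no event up to elapsed)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + window, mu, alpha)
    return (f1 - f0) / (1 - f0)

# Hypothetical values: mean recurrence 1000 yr, aperiodicity 0.5, 800 yr elapsed
p_bpt = conditional_prob(800.0, 100.0, 1000.0, 0.5)
p_poisson = 1 - math.exp(-100.0 / 1000.0)  # memoryless Poisson counterpart
```

Late in the cycle the BPT model gives a higher conditional probability than the time-independent Poisson model; a weighted combination of the two, as in the study, lies in between.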
Improving quality of sample entropy estimation for continuous distribution probability functions
Miśkiewicz, Janusz
2016-05-01
Entropy is one of the key parameters characterizing the state of a system in statistical physics. Although entropy is defined for systems described by discrete and continuous probability distribution functions (PDF), in numerous applications the sample entropy is estimated from a histogram, which in fact means that the continuous PDF is represented by a set of probabilities. Such a procedure may lead to ambiguities and even misinterpretation of the results. Within this paper, two possible general algorithms based on continuous PDF estimation are discussed in application to the Shannon and Tsallis entropies. It is shown that the proposed algorithms may improve entropy estimation, particularly in the case of small data sets.
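The histogram route criticized above can at least be made consistent for differential entropy by adding the log bin-width term; the sketch below (Gaussian test case, illustrative bin count, not the paper's algorithms) shows the basic plug-in estimator.

```python
import math
import random

def hist_entropy(sample, bins):
    """Differential-entropy estimate from a histogram: the discrete
    Shannon entropy of the bin probabilities plus log(bin width)."""
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in sample:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(sample)
    h = -sum((c / n) * math.log(c / n) for c in counts if c)
    return h + math.log(width)

rng = random.Random(3)
sample = [rng.gauss(0.0, 1.0) for _ in range(50_000)]
est = hist_entropy(sample, 60)
true_h = 0.5 * math.log(2 * math.pi * math.e)  # differential entropy of N(0, 1)
```

For large samples the estimate is close to the true value, but for small data sets the result depends strongly on the bin count, which is precisely the ambiguity a continuous PDF estimate (e.g. a kernel density) avoids.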
A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure
Probability distribution and the boundary value problem in noncommutative quantum mechanics
Non-commutative quantum mechanics (NCQM) still has some important open questions, such as, for example, the correct definition of the probability density and the consistent formulation of the boundary value problem. The main difficulty relies on the fact that in a non-commutative space the classical notion of point has no operational meaning. Besides that, it is well known that in NCQM the ordinary definition of probability density does not satisfy the continuity equation, thus being physically inadequate in this context. As a consequence, the formulation of the boundary value problem in NCQM is ill-defined, since the confining conditions for a particle trapped in a closed region are often formulated in terms of the properties of the probability density at the boundaries of such a region. In this work we solve both problems in a unified way. We consider a two-dimensional configuration space generated by two non-commutative coordinates satisfying a canonical commutation relation. This non-commutative space is formally equal to the phase space of a quantum particle moving in a line, which suggests an approach based on the Wigner formulation of quantum mechanics. We introduce a quasi-probability distribution function, constructed by means of the Moyal product of functions. By making use of the operation of partial trace we construct a normalizable, positive-definite function. We demonstrate that this function satisfies the continuity equation, so that it can be interpreted as a probability density function, thus providing a physically consistent probabilistic interpretation for NCQM. Even though the probability density contains all the available information about the physical system, it is useful to formulate the boundary value problem in terms of wave functions fulfilling some appropriate differential equation. By making use of harmonic analysis we introduce an auxiliary wave function, which is related to the physical probability density in the same way as
Li Wei; Hai-liang Yang
2004-01-01
In this paper we first consider a risk process in which claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve of u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line, and the integrand is not of an oscillating kind. Then we show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.
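The Erlang(2)/Pareto model lends itself to a direct Monte Carlo check; the paper's results are exact and for ultimate ruin, whereas the sketch below estimates a finite-time ruin probability with illustrative parameters.

```python
import random

def ruin_prob_erlang2(u, c, beta, a, xm, t_max, n_paths, seed=0):
    """Monte Carlo finite-time ruin probability: Erlang(2, beta) claim
    inter-arrival times, Pareto(a, xm) claim sizes, premium rate c."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, total_claims = 0.0, 0.0
        while True:
            t += rng.expovariate(beta) + rng.expovariate(beta)  # Erlang(2)
            if t > t_max:
                break
            # Inverse-CDF sampling of a Pareto(a, xm) claim size
            total_claims += xm * (1.0 - rng.random()) ** (-1.0 / a)
            if u + c * t - total_claims < 0:
                ruined += 1
                break
    return ruined / n_paths

# Illustrative parameters with positive safety loading: claim rate 0.5 per
# unit time, mean claim 2, so premium rate 1.5 exceeds the claim cost of 1.0
p0 = ruin_prob_erlang2(0.0, 1.5, 1.0, 2.0, 1.0, t_max=200.0, n_paths=2000)
p20 = ruin_prob_erlang2(20.0, 1.5, 1.0, 2.0, 1.0, t_max=200.0, n_paths=2000)
```

With the heavy-tailed Pareto claims the ruin probability decays only polynomially in the initial reserve, which is what makes the paper's exact expressions valuable.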
Comparison of Probability Distribution Function in Determining Minimum Annual and Monthly Streamflow
Nícolas Reinaldo Finkler
2015-11-01
In this study we aimed to provide the foundational studies of water availability in the Arroio Belo basin, in Caxias do Sul/RS. To this end, we analyzed the application of the Weibull, Normal, Log-Normal, Gumbel (minimum), Log-Pearson and Pearson theoretical probability functions to data of minimum streamflows for seven consecutive days in the basin. The analysis had two approaches: application to annual data, and then to monthly data, considering seasonality. To verify the adherence of the estimated probabilities to the observed frequencies, we applied three tests: Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared. The results show that the Log-Pearson III distribution represents the annual series with the greatest accuracy and achieves the best fit of the minimum streamflows. The monthly data analysis indicated the use of the Pearson III distribution, which showed higher suitability to the minimum streamflow data.
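One fitting-plus-goodness-of-fit step of this kind can be sketched with a Gumbel (minimum) fit and a Kolmogorov-Smirnov statistic; synthetic data and method-of-moments estimation are used here, not the study's data or estimators, and the Lilliefors-type correction for estimated parameters is omitted.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_min(xs):
    """Method-of-moments fit of the Gumbel (minimum) distribution,
    returning (mu, beta)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    beta = math.sqrt(6.0 * var) / math.pi
    return mean + EULER_GAMMA * beta, beta

def gumbel_min_cdf(x, mu, beta):
    return 1.0 - math.exp(-math.exp((x - mu) / beta))

def ks_statistic(xs, cdf):
    """Two-sided Kolmogorov-Smirnov distance between sample and model CDF."""
    xs = sorted(xs)
    n = len(xs)
    return max(max(i / n - cdf(x), cdf(x) - (i - 1) / n)
               for i, x in enumerate(xs, 1))

# Synthetic "7-day minimum flow" sample drawn from a Gumbel (minimum) law
rng = random.Random(11)
flows = [10.0 + 2.0 * math.log(-math.log(1.0 - rng.random()))
         for _ in range(365)]
mu, beta = fit_gumbel_min(flows)
d = ks_statistic(flows, lambda x: gumbel_min_cdf(x, mu, beta))
# Classical 5% KS critical value (parameters known): 1.36 / sqrt(n)
```

The same loop, repeated over several candidate families and all three test statistics, reproduces the model-selection procedure described above.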
Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence
C. C. Wu
2011-04-01
Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.
Learning algorithms and probability distributions in feed-forward and feed-back networks
Hopfield, J J
1987-01-01
Learning algorithms have been used both on feed-forward deterministic networks and on feed-back statistical networks to capture input-output relations and do pattern classification. These learning algorithms are examined for a class of problems characterized by noisy or statistical data, in which the networks learn the relation between input data and probability distributions of answers. In simple but nontrivial networks the two learning rules are closely related. Under some circumstances the...
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
Dai, Mi; Wang, Yun
2015-01-01
In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdf) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the Joint Lightcurve Analysis (JLA) data set of SNe Ia, we find that sampl...
Wigner Function and Phase Probability Distribution of q-Analogueof Squeezed One-Photon State
FANG Jian-Shu; MENG Xiang-Guo; ZHANG Xiang-Ping; WANG Ji-Suo; LIANG Bao-Long
2008-01-01
In this paper, in terms of the technique of integration within an ordered product (IWOP) of operators and the properties of the inverses of the q-deformed annihilation and creation operators, a normalizable q-analogue of the squeezed one-photon state, which is quite different from the one introduced by Song and Fan [Int. J. Theor. Phys. 41 (2002) 695], is constructed. Moreover, the Wigner function and phase probability distribution of the q-analogue of the squeezed one-photon state are examined.
Optimal design of unit hydrographs using probability distribution and genetic algorithms
Rajib Kumar Bhattacharjya
2004-10-01
A nonlinear optimization model is developed to transmute a unit hydrograph into a probability distribution function (PDF). The objective function is to minimize the sum of the square of the deviation between predicted and actual direct runoff hydrograph of a watershed. The predicted runoff hydrograph is estimated by using a PDF. In a unit hydrograph, the depth of rainfall excess must be unity and the ordinates must be positive. Incorporation of a PDF ensures that the depth of rainfall excess for the unit hydrograph is unity, and the ordinates are also positive. Unit hydrograph ordinates are in terms of intensity of rainfall excess on a discharge per unit catchment area basis, the unit area thus representing the unit rainfall excess. The proposed method does not have any constraint. The nonlinear optimization formulation is solved using binary-coded genetic algorithms. The number of variables to be estimated by optimization is the same as the number of probability distribution parameters; gamma and log-normal probability distributions are used. The existing nonlinear programming model for obtaining optimal unit hydrograph has also been solved using genetic algorithms, where the constrained nonlinear optimization problem is converted to an unconstrained problem using penalty parameter approach. The results obtained are compared with those obtained by the earlier LP model and are fairly similar.
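The core idea above, representing the unit hydrograph by a gamma PDF so that the rainfall-excess depth is automatically unity and the ordinates positive, can be sketched with a crude grid search standing in for the genetic algorithm; the target hydrograph and parameter grid are synthetic and purely illustrative.

```python
import math

def gamma_uh(t, n, k):
    """Gamma-PDF unit hydrograph ordinate (Nash model): the PDF integrates
    to one over t >= 0, so the unit rainfall-excess depth is built in."""
    return t ** (n - 1) * math.exp(-t / k) / (k ** n * math.gamma(n))

times = [0.5 * i for i in range(1, 49)]
observed = [gamma_uh(t, 3.0, 2.0) for t in times]  # synthetic target hydrograph

# Crude grid search standing in for the paper's genetic algorithm
sse, n_hat, k_hat = min(
    (sum((gamma_uh(t, n, k) - o) ** 2 for t, o in zip(times, observed)), n, k)
    for n in (2.0, 2.5, 3.0, 3.5) for k in (1.5, 2.0, 2.5))
depth = 0.5 * sum(observed)  # Riemann-sum check of the unit depth
```

Because the fit is unconstrained in (n, k) while unit depth and positivity hold by construction, the objective is exactly the sum-of-squares criterion described above, with no penalty terms needed.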
Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)
2009-09-15
The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
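A minimal sketch of the extreme-value step, assuming an illustrative lognormal single-pit depth model rather than the authors' predictive growth model: the deepest pit per pipeline segment is collected by Monte Carlo and fitted with a method-of-moments Gumbel estimate.

```python
import numpy as np

# Monte Carlo of maximum pit depth per segment; all parameters invented.
rng = np.random.default_rng(42)
n_segments, pits_per_segment = 2000, 500
depths = rng.lognormal(mean=0.0, sigma=0.4,
                       size=(n_segments, pits_per_segment))
max_depth = depths.max(axis=1)            # deepest pit on each segment

# Gumbel moment estimators: var = pi^2 beta^2 / 6, mean = mu + gamma_E beta
euler_gamma = 0.5772156649
beta = np.sqrt(6.0 * max_depth.var()) / np.pi
mu = max_depth.mean() - euler_gamma * beta
print(mu, beta)                           # Gumbel location and scale
```

In practice one would compare this fit against Weibull and Fréchet alternatives, as the abstract notes the best-fitting family changes with pipeline age.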
Probability distributions in a two-parameter scaling theory of localization
Heinrichs, J.
1988-06-01
Probability distributions for the resistance of two- and three-dimensional disordered conductors are studied using a Migdal-Kadanoff-type scaling transformation together with the author's previously derived distributions in one dimension. The present treatment differs from earlier work in two respects: On one hand, it includes the effect of an average potential barrier V experienced by an electron originating from the perfect leads which connect the conductor to a constant-voltage source; on the other hand, the input distribution for one-dimensional systems is based on an exact solution for the effect of the random potential on the complex reflection amplitude of an electron at a certain energy. The scaling equations for probability distributions and for their successive moments are parametrized in terms of the mean resistance, ρ¯, and of a fixed parameter γ related to V. Hence they correspond to a special form of two-parameter scaling. A mobility edge, ρ¯≡ρc, exists only for d>2 and, for d=3, detailed results for ρc, for the conductivity exponent ν, and for the fixed resistance distribution at ρc as a function of γ are presented. The asymptotic distribution of resistance away from the mobility edge for d=3, and in both small- and large-resistance regimes for d=2, are also studied. In the metallic regime for d>2, our treatment yields two distinct distributions, one of which is characterized by Ohm's law for the mean resistance and the other one by Ohm's law for the mean conductance. In the latter case the fluctuations of conductivity are independent of sample size for large samples. The calculated distributions are generally broad and in the localized regime, for d=3 and d=2, the rms values of resistance dominate the mean values in the infinite-sample limit.
Maadooliat, Mehdi
2012-08-27
Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
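The LagSVD idea can be illustrated on synthetic angles: form the joint histogram of angle pairs at a given lag and inspect its singular values. For an i.i.d. chain the joint distribution factorizes, so one singular value should dominate. The von Mises angle model, bin count and lag here are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Synthetic dihedral-angle chain; i.i.d., so lag-joint is rank one.
rng = np.random.default_rng(1)
phi = rng.vonmises(mu=-1.0, kappa=4.0, size=100000)

def lag_singular_values(angles, lag, bins=18):
    # 2-D histogram of (phi_i, phi_{i+lag}) as a nonparametric estimate
    # of the lag-joint density, then its singular-value spectrum.
    h, _, _ = np.histogram2d(angles[:-lag], angles[lag:], bins=bins,
                             range=[[-np.pi, np.pi], [-np.pi, np.pi]],
                             density=True)
    return np.linalg.svd(h, compute_uv=False)

s1 = lag_singular_values(phi, 1)
print(s1[0] / s1.sum())   # near-rank-1: one dominant singular value
```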
Photometric Redshift Probability Distributions for Galaxies in the SDSS DR8
Sheldon, Erin S; Mandelbaum, Rachel; Brinkmann, J; Weaver, Benjamin A
2011-01-01
We present redshift probability distributions for galaxies in the SDSS DR8 imaging data. We used the nearest-neighbor weighting algorithm presented in Lima et al. 2008 and Cunha et al. 2009 to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We then estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training set redshifts. We derived P(z)'s for individual objects using the same technique, but limiting to training set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy, rather than an ensemble N(z), can reduce the statistical error in measurements that depend on the redshifts of individual galaxies.
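A one-dimensional toy sketch of the nearest-neighbor weighting step (after Lima et al. 2008): reweight a training set so its color density matches the photometric sample, then use the weighted redshift histogram as N(z). The color model, redshift-color relation and neighbor count `k` are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_train, n_phot, k = 4000, 20000, 32

c_train = rng.normal(0.0, 1.0, n_train)       # training-set "colors"
c_phot = rng.normal(0.5, 1.0, n_phot)         # photometric-sample colors
z_train = 0.1 + 0.05 * c_train + rng.normal(0, 0.01, n_train)   # known z
z_phot_true = 0.1 + 0.05 * c_phot + rng.normal(0, 0.01, n_phot) # hidden

weights = np.empty(n_train)
for i, c in enumerate(c_train):
    # radius to the k-th nearest training neighbor, then count how many
    # photometric objects fall inside that ball: a local density ratio
    r_k = np.partition(np.abs(c_train - c), k)[k]
    weights[i] = np.count_nonzero(np.abs(c_phot - c) <= r_k) / k
weights /= weights.sum()

nz_mean = np.sum(weights * z_train)           # mean of the weighted N(z)
print(nz_mean, z_phot_true.mean())
```

The weighted training histogram tracks the photometric sample's redshift distribution even though the raw training set is biased toward lower z.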
Criticality of the net-baryon number probability distribution at finite density
Morita, Kenji; Redlich, Krzysztof
2014-01-01
We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\mu/T<1$, the model exhibits the chiral crossover transition which belongs to the universality class of the $O(4)$ spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, $P(N)$. By considering ratios of $P(N)$ to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to $O(4)$ criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine $O(4)$ criticality in the context of binomial and negative-binomial distributions for the net proton number.
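The Skellam reference distribution used in these ratios is just the difference of two independent Poisson variables. As a sketch, it can be built by direct convolution and checked against its known mean (a - b) and variance (a + b); the parameters a and b here are illustrative, not fitted to any data.

```python
import numpy as np
from math import exp, lgamma, log

a, b, cut = 12.0, 8.0, 60                 # illustrative Poisson means

def pois(lam, n):
    # Poisson pmf in log space for numerical stability
    return exp(-lam + n * log(lam) - lgamma(n + 1))

ns = np.arange(0, cut)
px = np.array([pois(a, n) for n in ns])
py = np.array([pois(b, n) for n in ns])

# Skellam: P(N = k) = sum_n P(X = n) P(Y = n - k), N = X - Y
N = np.arange(-cut + 1, cut)
skellam = np.array([sum(px[n] * py[n - k] for n in ns if 0 <= n - k < cut)
                    for k in N])
mean = np.sum(N * skellam)
var = np.sum((N - mean) ** 2 * skellam)
print(mean, var)                          # approx a - b and a + b
```

Dividing a measured net-charge distribution bin-by-bin by this function (with matched mean and variance) gives the ratio the abstract describes.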
Tong Yifei
2014-01-01
A crane is a mechanical device used widely to move materials in modern production. It is reported that the energy consumption of China is at least 5–8 times that of other developing countries. Thus, energy consumption becomes an unavoidable topic. Several factors influence the energy loss, and the camber of the girder is one not to be neglected. In this paper, the deflections induced by a moving payload in the girder of an overhead travelling crane are examined. The evaluation of a camber giving a counter-deflection of the girder is proposed in order to obtain minimum energy consumption for the trolley moving along a non-straight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads used in other studies. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The results provide a design reference for a reasonable camber that obtains the least energy consumption for climbing corresponding to different P0; thus an energy-saving design can be achieved.
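The probabilistic-load idea can be sketched as choosing the camber for the expected midspan deflection under a payload distribution, rather than for the rated load. The simply-supported-beam deflection formula and every numerical value below are illustrative assumptions, not the paper's crane model.

```python
import numpy as np

L, E, I = 30.0, 2.1e11, 0.02              # span (m), Young's modulus (Pa), I (m^4)

def deflection(P):
    # midspan deflection of a simply supported beam under point load P (N)
    return P * L ** 3 / (48 * E * I)

# Hypothetical payload distribution: most lifts are far below the rating
rng = np.random.default_rng(7)
loads = rng.gamma(shape=2.0, scale=8e4, size=100000)   # mean 160 kN

camber_expected = deflection(loads).mean()  # camber matched to E[deflection]
camber_rated = deflection(50e3 * 9.81)      # camber matched to the 50 t rating
print(camber_expected, camber_rated)
```

Sizing the camber to the expected deflection instead of the rated one leaves a much flatter average running surface for the trolley, which is the energy-saving mechanism the abstract describes.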
Criticality of the net-baryon number probability distribution at finite density
Kenji Morita
2015-02-01
We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T<1, the model exhibits the chiral crossover transition which belongs to the universality class of the O(4) spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, P(N). By considering ratios of P(N) to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to O(4) criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine O(4) criticality in the context of binomial and negative-binomial distributions for the net proton number.
Measurement of angle distribution in multiple scattering by track digitization method
The multiple scattering of β-rays by neon gas atoms is studied by use of the projection spark chamber. The tracks are digitized and analysed on-line to give the projected angle distribution. The present data are compared with the Molière theory. (author)
The general objective of our research is the modelling of physical and biological processes related to the development of adverse health effects following the inhalation of radioaerosols, especially the initiation of lung cancer in central human airways by the inspiration of radon progenies. There is experimental evidence that bronchogenic carcinomas originate mainly in the vicinity of the carinal ridge of the large bronchial airways, where primary hot spots of deposition have been found. In the case of uranium miners, more than ninety percent of the registered lung cancer formations have occurred in this region of the lung. However, current lung deposition models do not take into consideration the inhomogeneity of deposition within the airways. In the present study, cellular deposition patterns, alpha-track and DNA hit probability distributions of inhaled radon progenies in the upper and central human airway epithelial cells are computed with a computational fluid particle dynamics model. Our computer programme generates the three-dimensional, morphologically realistic geometry of the upper and central airways. The flow fields within these airways are simulated by the FLUENT CFD (computational fluid dynamics) code over a wide range of flow rates. A large number of attached and unattached radon progeny trajectories are simulated by our particle trajectory code to determine the deposition and activity patterns and alpha-track distributions on the surface of the airways. Three-dimensional distributions of secretory and basal cells are constructed. Finally, the number of DNA hits and the hit probability distributions are quantified. Computed deposition, activity and hit probability patterns are strongly inhomogeneous for all realistic parameter selections and are sensitive to the shape of the geometry. Hot spots of alpha hits are found at the carinal region and at the inner sides of the daughter airways during inhalation and, with lower intensity, at the top and bottom sides of the
Vinogradov, S
2011-01-01
Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger mode avalanche breakdown limited by strong negative feedback. An SSPM can detect and resolve single photons due to high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their importance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
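The branching-Poisson picture can be sketched directly: if each fired pixel triggers a Poisson-distributed number of crosstalk discharges, the total number of fired pixels per primary follows the Borel distribution, with mean 1/(1-μ). The crosstalk parameter μ below is an illustrative value.

```python
import numpy as np
from math import exp, lgamma, log

mu = 0.2                                  # mean crosstalk triggers per pixel

def borel_pmf(n, mu):
    # Borel pmf: P(n) = exp(-mu*n) * (mu*n)^(n-1) / n!, for n >= 1
    return exp(-mu * n + (n - 1) * log(mu * n) - lgamma(n + 1))

rng = np.random.default_rng(5)

def total_fired(mu):
    # Galton-Watson branching with Poisson(mu) offspring per discharge
    total, frontier = 1, 1                # start from the primary discharge
    while frontier:
        frontier = rng.poisson(mu * frontier)
        total += frontier
    return total

samples = np.array([total_fired(mu) for _ in range(20000)])
print(samples.mean(), 1 / (1 - mu))       # Borel mean = 1/(1 - mu)
print(np.mean(samples == 1), borel_pmf(1, mu))
```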
Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor
2015-01-01
SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of photon detection probability (PDP) of SPADnet-I was not known as a function of angle of incidence, wavelength and polarization of the incident light. Our targeted application area of this sensor is next generation PET detector modules, where they will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP in a wide range of angle of incidence (0° to 80°), concentrating onto a 60 nm broad wavelength interval around the characteristic emission peak (λ=420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, where it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less expressed, it begins only from 50°.
The probability distribution functions of emission line flux measurements and their ratios
Wesson, R; Scicluna, P
2016-01-01
Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on the measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking thi...
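The skewness of a ratio of normal variables is easy to reproduce by Monte Carlo. The sketch below mimics a strong line divided by a weak line (the flux values and uncertainties are invented): by Jensen's inequality the mean of A/B exceeds the ratio of the means, while the medians of A/B and B/A remain reciprocal-consistent.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 1_000_000
A = rng.normal(100.0, 5.0, n)     # "strong line" flux, 5% uncertainty
B = rng.normal(33.0, 4.95, n)     # "weak line" flux, 15% uncertainty

r = A / B
print(r.mean(), A.mean() / B.mean())        # E[A/B] > E[A]/E[B]
print(np.median(r) * np.median(B / A))      # medians stay reciprocal
```

With the larger uncertainty in the denominator, the ratio distribution develops the long upper tail the abstract describes; the mean and mode separate while the median remains well behaved.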
Impact of spike train autostructure on probability distribution of joint spike events.
Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl
2013-05-01
The discussion of whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability for false positives if a different process type is assumed than was actually present. We also determine how manipulations of such spike trains, here dithering, used for the generation of surrogate data change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the spike-train structure as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing. PMID:23470124
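The Monte Carlo estimation described above can be sketched for the gamma-renewal case: simulate pairs of independent gamma spike trains, count bins in which both trains spike, and compute the Fano factor of that coincidence count across trials. The rate, gamma order, bin width and duration are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(8)
rate, order, T, binw, trials = 20.0, 3.0, 10.0, 0.005, 400

def gamma_train(rate, order, T):
    # gamma renewal process: ISIs ~ Gamma(order, 1/(rate*order)),
    # which keeps the mean firing rate equal to `rate`
    isis = rng.gamma(order, 1.0 / (rate * order), size=int(rate * T * 3))
    t = np.cumsum(isis)
    return t[t < T]

edges = np.arange(0, T + binw, binw)
counts = []
for _ in range(trials):
    b1, _ = np.histogram(gamma_train(rate, order, T), edges)
    b2, _ = np.histogram(gamma_train(rate, order, T), edges)
    counts.append(np.sum((b1 > 0) & (b2 > 0)))    # joint-spike bins
counts = np.array(counts)
fano = counts.var() / counts.mean()
print(counts.mean(), fano)
```

Repeating this for different gamma orders (i.e. different CVs) traces out the nontrivial dependence of the coincidence Fano factor that the study reports.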
Kurugol, Sila; Freiman, Moti; Afacan, Onur; Perez-Rossello, Jeannette M; Callahan, Michael J; Warfield, Simon K
2016-08-01
Quantitative diffusion-weighted MR imaging (DW-MRI) of the body enables characterization of the tissue microenvironment by measuring variations in the mobility of water molecules. The diffusion signal decay model parameters are increasingly used to evaluate various diseases of abdominal organs such as the liver and spleen. However, previous signal decay models (i.e., mono-exponential, bi-exponential intra-voxel incoherent motion (IVIM) and stretched exponential models) only provide insight into the average of the distribution of the signal decay rather than explicitly describe the entire range of diffusion scales. In this work, we propose a probability distribution model of incoherent motion that uses a mixture of Gamma distributions to fully characterize the multi-scale nature of diffusion within a voxel. Further, we improve the robustness of the distribution parameter estimates by integrating spatial homogeneity prior into the probability distribution model of incoherent motion (SPIM) and by using the fusion bootstrap solver (FBM) to estimate the model parameters. We evaluated the improvement in quantitative DW-MRI analysis achieved with the SPIM model in terms of accuracy, precision and reproducibility of parameter estimation in both simulated data and in 68 abdominal in-vivo DW-MRIs. Our results show that the SPIM model not only substantially reduced parameter estimation errors by up to 26%; it also significantly improved the robustness of the parameter estimates (paired Student's t-test, p < 0.0001) by reducing the coefficient of variation (CV) of estimated parameters compared to those produced by previous models. In addition, the SPIM model improves the parameter estimates reproducibility for both intra- (up to 47%) and inter-session (up to 30%) estimates compared to those generated by previous models. Thus, the SPIM model has the potential to improve accuracy, precision and robustness of quantitative abdominal DW-MRI analysis for clinical applications.
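The distributional idea behind such a model can be sketched analytically: if the diffusivities D within a voxel follow a Gamma(k, θ) distribution, the signal decay is the Laplace transform of that density, S(b) = (1 + bθ)^(-k), and a Gamma mixture simply sums such terms with weights. The parameters below are illustrative, not fitted values from the paper.

```python
import numpy as np
from math import gamma as gamma_fn

k, theta = 2.0, 1.0e-3                    # shape, scale (mm^2/s), invented
b = np.linspace(0, 800, 9)                # b-values (s/mm^2)

# Closed form: Laplace transform of the Gamma density
closed_form = (1 + b * theta) ** (-k)

# Numerical check: integrate Gamma(D) * exp(-b D) over D (trapezoid rule)
D = np.linspace(1e-7, 0.02, 20000)
pdf = D ** (k - 1) * np.exp(-D / theta) / (gamma_fn(k) * theta ** k)
vals = pdf * np.exp(-np.outer(b, D))
numeric = ((vals[:, :-1] + vals[:, 1:]) / 2).sum(axis=1) * (D[1] - D[0])
print(np.max(np.abs(numeric - closed_form)))
```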
Generalized Delta Functions and Their Use in Quasi-Probability Distributions
Brewster, R. A.; Franson, J. D.
2016-01-01
Quasi-probability distributions are an essential tool in analyzing the properties of quantum systems, especially in quantum optics. The Glauber-Sudarshan P-function $P(\alpha)$ is especially useful for calculating the density matrix of a system, but it is often assumed that $P(\alpha)$ may not exist for highly quantum-mechanical systems due to its singular nature. Here we define a generalized delta function with a complex argument and derive its properties, which are very different from those...
Velocity-gradient probability distribution functions in a Lagrangian model of turbulence
The Recent Fluid Deformation Closure (RFDC) model of Lagrangian turbulence is recast in path-integral language within the framework of the Martin–Siggia–Rose functional formalism. In order to derive analytical expressions for the velocity-gradient probability distribution functions (vgPDFs), we carry out noise renormalization in the low-frequency regime and find approximate extrema for the Martin–Siggia–Rose effective action. We verify, with the help of Monte Carlo simulations, that the vgPDFs so obtained yield a close description of the single-point statistical features implied by the original RFDC stochastic differential equations. (paper)
Urriza, Paulo; Pawełczak, Przemysław; Čabrić, Danijela
2010-01-01
We present a novel modulation level classification (MLC) method based on probability distribution distance functions. The proposed method uses modified Kuiper and Kolmogorov-Smirnov (KS) distances to achieve low computational complexity and outperforms state-of-the-art methods based on cumulants and goodness-of-fit (GoF) tests. We derive the theoretical performance of the proposed MLC method and verify it via simulations. The best classification accuracy under AWGN with SNR mismatch and phase jitter is achieved with the proposed MLC method using Kuiper distances.
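The distance-based classification idea can be sketched in one dimension: compare the empirical CDF of the received samples against each candidate constellation's theoretical CDF and pick the smallest KS distance. The 2-PAM/4-PAM constellations, noise level and sample size are illustrative, and this plain KS distance stands in for the paper's modified distances.

```python
import numpy as np
from math import erf, sqrt

constellations = {
    "2-PAM": np.array([-1.0, 1.0]),
    "4-PAM": np.array([-1.0, -1/3, 1/3, 1.0]),
}
sigma = 0.3                                # AWGN standard deviation

def theory_cdf(x, symbols):
    # CDF of an equiprobable Gaussian mixture centered on the symbols
    return np.mean([0.5 * (1 + erf((x - s) / (sigma * sqrt(2))))
                    for s in symbols])

def classify(samples):
    xs = np.sort(samples)
    ecdf = np.arange(1, len(xs) + 1) / len(xs)
    best, best_d = None, np.inf
    for name, sym in constellations.items():
        d = max(abs(ecdf[i] - theory_cdf(xs[i], sym))
                for i in range(len(xs)))   # KS distance to this candidate
        if d < best_d:
            best, best_d = name, d
    return best

rng = np.random.default_rng(9)
tx = rng.choice(constellations["4-PAM"], 500) + rng.normal(0, sigma, 500)
print(classify(tx))
```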
Smail, Linda
2016-06-01
The basic task of any probabilistic inference system in Bayesian networks is computing the posterior probability distribution for a subset or subsets of random variables, given values or evidence for some other variables from the same Bayesian network. Many methods and algorithms have been developed for exact and approximate inference in Bayesian networks. This work compares two exact inference methods in Bayesian networks, Lauritzen-Spiegelhalter and the successive restrictions algorithm, from the perspective of computational efficiency. The two methods were applied to a Chest Clinic Bayesian network for comparison. Results indicate that the successive restrictions algorithm is more computationally efficient than the Lauritzen-Spiegelhalter algorithm.
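The posterior-computation task both algorithms solve can be sketched on a tiny chain network A → B → C: full enumeration and summing out B first (the elimination idea underlying both methods) must give the same P(A | C=1). The CPT values are invented for illustration.

```python
import numpy as np

pA = np.array([0.6, 0.4])                    # P(A)
pB_A = np.array([[0.7, 0.3], [0.2, 0.8]])    # P(B|A), rows indexed by A
pC_B = np.array([[0.9, 0.1], [0.4, 0.6]])    # P(C|B), rows indexed by B

# Route 1, enumeration: P(A=a, C=1) = sum_b P(a) P(b|a) P(C=1|b)
joint = np.array([sum(pA[a] * pB_A[a, b] * pC_B[b, 1] for b in range(2))
                  for a in range(2)])
post_enum = joint / joint.sum()

# Route 2, elimination: absorb the evidence into a message passed to A
msg_B = pC_B[:, 1]                           # lambda_C(b) = P(C=1|b)
msg_A = pB_A @ msg_B                         # sum_b P(b|a) lambda_C(b)
post_elim = pA * msg_A
post_elim /= post_elim.sum()
print(post_enum, post_elim)
```

The two routes agree; efficiency comparisons such as the one in this work come down to how cleverly the summation order and intermediate factors are organized on larger networks.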
Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)
Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.
2013-04-01
Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions are then used to fit the long-term statistical distribution at different locations along the raceways. The long-term distribution of the bearing raceway loads is estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.
Density probability distribution functions of diffuse gas in the Milky Way
Berkhuijsen, E M
2008-01-01
In a search for the signature of turbulence in gas density distributions of the diffuse interstellar medium, we determined the probability distribution functions (PDFs) of the average volume densities of the diffuse gas. The densities were derived from dispersion measures and HI column densities towards pulsars and stars at known distances. The PDFs of the average densities of the diffuse ionized gas (DIG) and the diffuse atomic gas are close to lognormal, especially when lines of sight below and above |b| = 5 degrees are considered separately. The PDF of the average density at high |b| is twice as wide as that at low |b|. The width of the PDF of the DIG is about 30 per cent smaller than that of the warm HI at the same latitudes. The results reported here provide strong support for the existence of a lognormal density PDF in the diffuse ISM, consistent with a turbulent origin of density structure in the diffuse gas.
EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS
WANG Qingyuan (王清远); N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI
2003-01-01
Corrosion and fatigue properties of aircraft materials are known to have considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e., pit nucleation and growth, pit-crack transition, and short- and long-crack propagation). The probabilistic model considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.
PHOTOMETRIC REDSHIFT PROBABILITY DISTRIBUTIONS FOR GALAXIES IN THE SDSS DR8
We present redshift probability distributions for galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 8 imaging data. We used the nearest-neighbor weighting algorithm to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8 and u < 29.0. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training-set redshifts. We derived P(z)'s for individual objects by using training-set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy can reduce the statistical error in measurements that depend on the redshifts of individual galaxies. The spectroscopic training sample is substantially larger than that used for the DR7 release. The newly added PRIMUS catalog is now the most important training set used in this analysis by a wide margin. We expect the primary sources of error in the N(z) reconstruction to be sample variance and spectroscopic failures: The training sets are drawn from relatively small volumes of space, and some samples have large incompleteness. Using simulations we estimated the uncertainty in N(z) due to sample variance at a given redshift to be ∼10%-15%. The uncertainty on calculations incorporating N(z) or P(z) depends on how they are used; we discuss the case of weak lensing measurements. The P(z) catalog is publicly available from the SDSS Web site.
SEP distribution function and probability of the maximum magnitudes of events
Nymmik, Rikho
Based on current knowledge, the magnitude of a specific anticipated SEP event is a random variable taken from a large array of expected values. This set of expected values can be described in terms of a distribution function. The form of the distribution function of SEP events is usually determined from continuous satellite measurements. Sometimes, though without much effect, indirect evidence such as isotopes in samples of lunar rocks or the density of radioactive isotopes in the annual rings of ancient trees is used to determine the SEP event distribution function. The most successful attempt describes the distribution function for solar cycles 21-23 by a power-law function with an exponential cutoff in the region of large events. A significant addition to the available information is the relatively new data (McCracken et al., JGR 106(A10), 21585-21598, 2001) on radioactive isotopes in the Greenland ice, which give additional information about extreme SEP events since 1561. However, the lack of information about the full set of events (mainly the small events) does not allow these data to be used directly to determine the distribution function. However, using the correlation between the number of sunspots and the corresponding mean number of SEP events, one can determine the distribution function since 1561 based on the Greenland data. Surprisingly, the parameter values of this function coincide with those calculated from satellite data. Analysis of the obtained parameters of the distribution function shows that the maximum fluence of protons with energies above 30 MeV does not exceed 10^11 cm^-2, with a probability of only about 10^-11 for larger events.
The probability distribution of returns in the exponential Ornstein–Uhlenbeck model
We analyze the problem of the analytical characterization of the probability distribution of financial returns in the exponential Ornstein–Uhlenbeck model with stochastic volatility. In this model the prices are driven by a geometric Brownian motion, whose diffusion coefficient is expressed through an exponential function of a hidden variable Y governed by a mean-reverting process. We derive closed-form expressions for the probability distribution and its characteristic function in two limit cases. In the first one the fluctuations of Y are larger than the normal level of volatility, while the second one corresponds to the assumption of a small stationary value for the variance of Y. Theoretical results are tested numerically by intensive use of Monte Carlo simulations. The effectiveness of the analytical predictions is checked via a careful analysis of the parameters involved in the numerical implementation of the Euler–Maruyama scheme and is tested on a data set of financial indexes. In particular, we discuss results for the German DAX30 and Dow Jones Euro Stoxx 50, finding a good agreement between the empirical data and the theoretical description.
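A minimal Euler–Maruyama simulation of such a model might look as follows. The parameter values are illustrative assumptions, the drift of the log-price is set to zero, and the hidden variable Y is started at its mean level.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: OU reversion rate, vol-of-vol, normal volatility level.
alpha, k, m = 1.0, 0.5, 0.2
dt, n_steps, n_paths = 1e-3, 1000, 20000   # T = 1 in model units

y = np.zeros(n_paths)   # hidden mean-reverting variable
x = np.zeros(n_paths)   # log-return ln(S_t / S_0)

for _ in range(n_steps):
    dw1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dw2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    sigma = m * np.exp(y)                      # exponential OU volatility
    x += -0.5 * sigma**2 * dt + sigma * dw2    # zero-drift log-price step
    y += -alpha * y * dt + k * dw1             # Euler-Maruyama OU step

print(x.mean(), x.std())  # returns roughly centered, width set by m and Y's variance
```

Histogramming `x` over many paths is the kind of Monte Carlo check against which the closed-form limit distributions are tested.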
Rani K
2014-08-01
Photocopied documents are very common in everyday life. People are permitted to carry and present photocopies to avoid damage to original documents, but this provision is misused for temporary benefit by fabricating fake photocopied documents. Fabrication of a fake photocopied document is possible only at the 2nd or higher recursive order of photocopying. Whenever a photocopied document is submitted, it may be necessary to check its originality. When the document is a 1st-order photocopy, the chance of fabrication may be ignored; when the photocopy order is 2nd or higher, fabrication may be suspected. Hence, when a photocopy is presented, the recursive order number of the photocopy must be estimated to ascertain originality. This requirement motivates methods for estimating the order number of a photocopy. In this work, a voting-based approach is proposed to detect the recursive order number of a photocopied document using three probability distributions: exponential, extreme value, and lognormal. Detailed experimentation on a generated data set shows that the method achieves an efficiency close to 89%.
A one-parameter family of transforms, linearizing convolution laws for probability distributions
Nica, Alexandru
1995-03-01
We study a family of transforms, depending on a parameter q∈[0,1], which interpolate (in an algebraic framework) between a relative (namely −iz(log ℱ(·))′(−iz)) of the logarithm of the Fourier transform for probability distributions, and its free analogue constructed by D. Voiculescu ([16, 17]). The classical case corresponds to q=1, and the free one to q=0. We describe these interpolated transforms: (a) in terms of partitions of finite sets and their crossings; (b) in terms of weighted shifts; (c) by a matrix equation related to the method of Stieltjes for expanding continued J-fractions as power series. The main result of the paper is that all these descriptions, which extend basic approaches used for q=0 and/or q=1, remain equivalent for arbitrary q∈[0,1]. We discuss a couple of basic properties of the convolution laws (for probability distributions) which are linearized by the considered family of transforms (these convolution laws interpolate between the usual convolution at q=1 and the free convolution introduced by Voiculescu at q=0). In particular, we note that description (c) above gives insight into why the central limit law for the interpolated convolution is related to the q-continuous Hermite orthogonal polynomials.
Sardeshmukh, Prashant D.; Penland, Cécile
2015-03-01
The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework.
Limin Wang
2015-06-01
As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions once proposed and widely used for Bayesian networks are not appropriate for a Bayesian classifier, in which the class variable C is treated as a distinguished variable. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of the joint probability distribution. By establishing the mapping relationship between conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically rather than rigidly. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.
Probability distribution of primordial angular momentum and formation of massive black holes
Susa, H; Tanaka, T; Hajime Susa; Misao Sasaki; Takahiro Tanaka
1994-01-01
We consider the joint probability distribution function for the mass contrast and angular momentum of over-density regions on the protogalactic scale and investigate the formation of massive black holes at redshift z ≳ 10. We estimate the growth rate of the angular momentum by linear perturbation theory and the decay rate by Compton drag, and apply the Press-Schechter theory to obtain the formation rate of massive black holes, assuming full reionization of the universe at z = z_ion ≫ 10. We find that the correlation between the mass contrast and angular momentum vanishes in the linear theory. However, application of the Press-Schechter theory introduces a correlation between the mass contrast and angular momentum of bound objects. Using the probability distribution thus obtained, we calculate the mass fraction of black holes with M ~ 10^6-10^8 M_⊙ in the universe. We find that it crucially depends on the reionization epoch z_ion. Specifically, for the standard CDM power spectrum with ...
Choi, Sung-Hwan; Kim, Seong-Jin; Lee, Kee-Joon; Sung, Sang-Jin; Chun, Youn-Sic
2016-01-01
Objective: The purpose of this study was to analyze stress distributions in the roots, periodontal ligaments (PDLs), and bones around cylindrical and tapered miniscrews inserted at different angles using a finite element analysis. Methods: We created a three-dimensional (3D) maxilla model of a dentition with extracted first premolars and used 2 types of miniscrews (tapered and cylindrical) with 1.45-mm diameters and 8-mm lengths. The miniscrews were inserted at 30°, 60°, and 90° angles with respect to the bone surface. A simulated horizontal orthodontic force of 2 N was applied to the miniscrew heads. Then, the stress distributions and magnitudes during miniscrew placement and force application were analyzed with a 3D finite element analysis. Results: Stresses were primarily absorbed by cortical bone. Moreover, very little stress was transmitted to the roots, PDLs, and cancellous bone. During cylindrical miniscrew insertion, the maximum von Mises stress increased as insertion angle decreased. Tapered miniscrews exhibited greater maximum von Mises stress than cylindrical miniscrews. During force application, maximum von Mises stresses increased in both groups as insertion angles decreased. Conclusions: For both cylindrical and tapered miniscrew designs, placement as perpendicular to the bone surface as possible is recommended to reduce stress in the surrounding bone. PMID:27478796
Probability distribution of the index in gauge theory on 2d non-commutative geometry
Aoki, Hajime; Nishimura, Jun; Susaki, Yoshiaki
2007-10-01
We investigate the effects of non-commutative geometry on the topological aspects of gauge theory using a non-perturbative formulation based on the twisted reduced model. The configuration space is decomposed into topological sectors labeled by the index ν of the overlap Dirac operator satisfying the Ginsparg-Wilson relation. We study the probability distribution of ν by Monte Carlo simulation of the U(1) gauge theory on 2d non-commutative space with periodic boundary conditions. In general the distribution is asymmetric under ν ↦ −ν, reflecting the parity violation due to non-commutative geometry. In the continuum and infinite-volume limits, however, the distribution turns out to be dominated by the topologically trivial sector. This conclusion is consistent with the instanton calculus in the continuum theory. However, it is in striking contrast to the known results in the commutative case obtained from lattice simulation, where the distribution is Gaussian in a finite volume, but the width diverges in the infinite-volume limit. We also calculate the average action in each topological sector, and provide a deeper understanding of the observed phenomenon.
Difficulties arising from the representation of the measurand by a probability distribution
This paper identifies difficulties associated with the concept of representing fixed unknown quantities by probability distributions. This concept, which we refer to as the distributed-measurand concept, is at the heart of the approach to the evaluation of measurement uncertainty described in Supplement 1 to the Guide to the Expression of Uncertainty in Measurement. The paper notes (i) the resulting lack of invariance of measurement results to nonlinear reparametrizations of the measurement problem, (ii) the potential undetected divergence of measurement estimates obtained by Monte Carlo evaluation, (iii) the potential failure of the methodology to give uncertainty intervals enclosing the values of the measurands with an acceptable frequency and (iv) the potential loss of measurement precision. The distributed-measurand concept is gaining popularity partly because of its association with analysis using the Monte Carlo principle. However, the Monte Carlo principle is also applicable without adopting the distributed-measurand concept. Accordingly, an alternative approach to the evaluation of measurement uncertainty is briefly described
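Point (i), the lack of invariance under nonlinear reparametrization, is easy to demonstrate with a toy Monte Carlo in the spirit of Supplement 1. The distribution N(10, 1) and the reparametrization Y = 1/X are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Distributed-measurand view: treat the fixed unknown X as if drawn from N(10, 1),
# and consider the nonlinearly reparametrized measurand Y = 1/X.
x = rng.normal(10.0, 1.0, 1_000_000)
y = 1.0 / x

# The MC estimate of Y (mean of 1/x) is not the reciprocal of the estimate of X:
print(y.mean())        # systematically larger than 1/E[X]
print(1.0 / x.mean())  # ~0.1 up to MC noise
```

The gap (here roughly E[1/X] ≈ (1/μ)(1 + σ²/μ²) for small relative uncertainty) shows that the result of the analysis depends on which parametrization of the same measurement problem one chooses, which is the invariance failure the paper discusses.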
Liquid-crystal variable-focus lenses with spatially distributed tilt angles.
Honma, Michinori; Nose, Toshiaki; Yanase, Satoshi; Yamaguchi, Rumiko; Sato, Susumu
2009-06-22
A method of controlling the pretilt angle via the density of rubbings made with a tiny stylus is proposed. Control of the surface pretilt angle is achieved by rubbing a side-chain-type polyimide film for homeotropic alignment. A smooth liquid crystal (LC) director distribution in the bulk layer is successfully obtained despite the rough surface orientation. This approach is applied to LC cylindrical and rectangular lenses with a variable-focusing function. The distribution profile of the rubbing pitch (the reciprocal of the rubbing density) for small aberration is determined to be quadratic. The variable-focusing function is successfully achieved in the LC rectangular lens, and an attempt is made to explain the voltage dependence of the focal length through the LC molecular reorientation behavior. PMID:19550499
Microwave field distribution in a magic angle spinning dynamic nuclear polarization NMR probe
Nanni, Emilio A.; Barnes, Alexander B.; Matsuki, Yoh; Woskov, Paul P.; Corzilius, Björn; Griffin, Robert G.; Temkin, Richard J.
2011-01-01
We present a calculation of the microwave field distribution in a magic angle spinning (MAS) probe utilized in dynamic nuclear polarization (DNP) experiments. The microwave magnetic field (B[subscript 1S]) profile was obtained from simulations performed with the High Frequency Structure Simulator (HFSS) software suite, using a model that includes the launching antenna, the outer Kel-F stator housing coated with Ag, the RF coil, and the 4 mm diameter sapphire rotor containing the sample. The p...
Simulating the effects of initial pitch-angle distributions on solar flares
In this work, we model both the thermal and non-thermal components of solar flares. The model we use, HYLOOP, combines a hydrodynamic equation solver with a non-thermal particle tracking code to simulate the thermal and non-thermal dynamics and emission of solar flares. In order to test the effects of pitch-angle distribution on flare dynamics and emission, a series of flares is simulated with non-thermal electron beams injected at the loop apex. The pitch-angle distribution of each beam is described by a single parameter and allowed to vary from flare to flare. We use the results of these simulations to generate synthetic hard and soft X-ray emissions (HXR and SXR). The light curves of the flares in Hinode's X-ray Telescope passbands show a distinct signal that is highly dependent on pitch-angle distribution. The simulated HXR emission in the 3-6 keV bandpass shows the formation and evolution of emission sources that correspond well to the observations of pre-impulsive flares. This ability to test theoretical models of thermal and non-thermal flare dynamics directly with observations allows for the investigation of a wide range of physical processes governing the evolution of solar flares. We find that the initial pitch-angle distribution of non-thermal particle populations has a profound effect on loop top HXR and SXR emission and that apparent motion of HXR is a natural consequence of non-thermal particle evolution in a magnetic trap.
The probability distribution functions of emission line flux measurements and their ratios
Wesson, R.; Stock, D. J.; Scicluna, P.
2016-04-01
Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking this effect into account, we derive an improved estimate of the intrinsic 5007/4959 ratio. We obtain a value of 3.012±0.008, which is slightly but statistically significantly higher than the theoretical value of 2.98. We further investigate the suggestion that fluxes measured from emission lines at low signal-to-noise are strongly biased upwards. We were unable to detect this effect in the SDSS line flux measurements, and we could not reproduce the results of Rola and Pelat who first described this bias. We suggest that the magnitude of this effect may depend strongly on the specific fitting algorithm used.
Pitch angle distributions of energetic ions in the lobes of the distant geomagnetic tail
Analysis of energetic (> 35 keV) ion data from the ISEE-3 spacecraft obtained during 1982-1983, when the spacecraft made a series of traversals of the distant geomagnetic tail (X_GSE > −230 R_E), indicates that the pitch angle distribution of energetic ions in the distant tail lobes is usually highly anisotropic, being peaked closely perpendicular to the magnetic field direction, but with a small net flow in the antisunward direction. In this paper we present a model, based on the motion of single particles into and within the tail lobes, which accounts for these observed distributions. This model assumes that the lobe ions originate in the magnetosheath, where the energetic ion population consists of two components: a spatially uniform ''solar'' population, and a population of ''terrestrial'' origin, which decreases in strength with downtail distance. The pitch angle distribution at any point within the lobe may be constructed, assuming that the value of the distribution function along the particle trajectory is conserved. In general, those ions with a large field-aligned component to their motion enter the lobes in the deep tail, where the ''terrestrial'' source is weak, whilst those moving closely perpendicular to the field enter the lobes at positions much closer to the Earth, where the source is strong. The fluxes of these latter ions are therefore much enhanced above the rest of the pitch angle distribution, and are shown to account for the form of the observed distributions. The model also accounts for the more isotropic ion population observed in the lobe during solar particle events, when the ''terrestrial'' component of the magnetosheath source may be considered negligible in comparison to the enhanced ''solar'' component. (author)
李军超; 杨芬芬; 周志强
2015-01-01
Although multi-stage incremental sheet forming has always been adopted instead of single-stage forming to form parts with a steep wall angle or to achieve high forming performance, it largely depends on empirical designs. To research multi-stage forming further, the effects of the number of forming stages (n) and of the angle interval between two adjacent stages (Δα) on thickness distribution were investigated. First, a finite element method (FEM) model of multi-stage incremental forming was established and experimentally verified. Then, based on the proposed simulation model, different strategies were adopted to form a frustum of a cone with a wall angle of 30° in order to study the thickness distribution of multi-pass forming. It is shown that the minimum thickness increases considerably and the variance of sheet thickness decreases significantly as n grows. Further, with the increase of Δα, the minimum thickness first increases and then decreases, and the optimal thickness distribution is achieved at a Δα of 10°. Additionally, a formula is deduced to estimate the sheet thickness after multi-stage forming and is shown to be effective. The simulation results fit well with the experimental results.
Detection of two power-law tails in the probability distribution functions of massive GMCs
Schneider, N; Girichidis, P; Rayner, T; Motte, F; Andre, P; Russeil, D; Abergel, A; Anderson, L; Arzoumanian, D; Benedettini, M; Csengeri, T; Didelon, P; Francesco, J D; Griffin, M; Hill, T; Klessen, R S; Ossenkopf, V; Pezzuto, S; Rivera-Ingraham, A; Spinoglio, L; Tremblin, P; Zavagno, A
2015-01-01
We report the novel detection of complex high-column-density tails in the probability distribution functions (PDFs) for three high-mass star-forming regions (CepOB3, MonR2, NGC6334), obtained from dust emission observed with Herschel. The low column density range can be fitted with a lognormal distribution. A first power-law tail starts above an extinction (Av) of ~6-14. It has a slope of α = 1.3-2 for a ρ ~ r^-α profile of the equivalent density distribution (spherical or cylindrical geometry), and is thus consistent with free-fall gravitational collapse. Above Av ~ 40, 60, and 140, we detect an excess that can be fitted by a flatter power-law tail with α > 2. It correlates with the central regions of the cloud (ridges/hubs) of size ~1 pc and densities above 10^4 cm^-3. This excess may be caused by physical processes that slow down collapse and reduce the flow of mass towards higher densities. Possible causes are: 1. rotation, which introduces an angular momentum barrier; 2. increasing optical depth and weaker...
Monahan, A. H.; Delsole, T.
2009-02-01
A basic task of exploratory data analysis is the characterisation of "structure" in multivariate datasets. For bivariate Gaussian distributions, natural measures of dependence (the predictive relationship between individual variables) and compactness (the degree of concentration of the probability density function (pdf) around a low-dimensional axis) are respectively provided by ordinary least-squares regression and Principal Component Analysis. This study considers general measures of structure for non-Gaussian distributions and demonstrates that these can be defined in terms of the information theoretic "distance" (as measured by relative entropy) between the given pdf and an appropriate "unstructured" pdf. The measure of dependence, mutual information, is well-known; it is shown that this is not a useful measure of compactness because it is not invariant under an orthogonal rotation of the variables. An appropriate rotationally invariant compactness measure is defined and shown to reduce to the equivalent PCA measure for bivariate Gaussian distributions. This compactness measure is shown to be naturally related to a standard information theoretic measure of non-Gaussianity. Finally, straightforward geometric interpretations of each of these measures in terms of "effective volume" of the pdf are presented.
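For a bivariate Gaussian, mutual information has the closed form I = −½ ln(1 − ρ²), and its failure as a compactness measure under orthogonal rotation can be shown directly. The covariance matrix below is an arbitrary example, not data from the study.

```python
import numpy as np

def gaussian_mi(rho):
    """Mutual information (in nats) of a bivariate Gaussian with correlation rho."""
    return -0.5 * np.log(1.0 - rho**2)

# A correlated bivariate Gaussian (illustrative covariance).
cov = np.array([[1.0, 0.8], [0.8, 1.0]])

# Apply a 45-degree rotation: an orthogonal change of variables.
t = np.pi / 4
rot = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
cov_rot = rot @ cov @ rot.T

rho = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
rho_rot = cov_rot[0, 1] / np.sqrt(cov_rot[0, 0] * cov_rot[1, 1])

print(gaussian_mi(rho))      # > 0: the original variables are dependent
print(gaussian_mi(rho_rot))  # ~0: this rotation diagonalizes the covariance
```

The pdf itself is unchanged up to a rotation, and its "compactness" around the principal axis is the same, yet the mutual information drops to zero: exactly the non-invariance that motivates the rotationally invariant compactness measure defined in the paper.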
Diffraction in time in terms of Wigner distributions and tomographic probabilities
Man'ko, V I; Sharma, A; Man'ko, Vladimir; Moshinsky, Marcos; Sharma, Anju
1999-01-01
A discussion appeared long ago in quantum mechanics of the problem of opening a completely absorbing shutter on which a stream of particles of definite velocity impinges. The solution of the problem was obtained in a form entirely analogous to the optical diffraction by a straight edge. The argument of the Fresnel integrals was, however, time dependent, hence the first part of the title of this article. In section 1 we briefly review the original formulation of the problem of diffraction in time. In sections 2 and 3 we reformulate this problem in terms of Wigner distributions and tomographic probabilities, respectively. In the former case the probability in phase space is very simple but, as it takes positive and negative values, its interpretation is ambiguous; it nevertheless gives a classical limit that agrees entirely with our intuition. In the latter case we can start with our initial conditions in a given reference frame but obtain our final solution in an arbitrary frame of reference.
Han Liwei
2014-07-01
Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data involve much uncertainty owing to limitations of the measurement information, material parameters, load, geometry, initial conditions, boundary conditions, and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to handle the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for the cloud probability distribution density based on a backward cloud generator was then proposed and used to effectively convert parcels of accurate data into concepts that can be described by appropriate qualitative linguistic values. Such a qualitative description is expressed by the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible; with it, the changing regularity of the piezometric tube's water level can be revealed and seepage damage in the dam body can be detected.
Kundu, Prasun K
2015-01-01
Rainfall exhibits extreme variability at many space and time scales and calls for a statistical description. Based on an analysis of radar measurements of precipitation over the tropical oceans, we introduce a new probability law for the area-averaged rain rate constructed from the class of log-infinitely divisible distributions that accurately describes the frequency of the most intense rain events. The dependence of its parameters on the spatial averaging length L allows one to relate spatial statistics at different scales. In particular, it enables us to explain the observed power law scaling of the moments of the data and successfully predicts the continuous spectrum of scaling exponents expressing multiscaling characteristics of the rain intensity field.
A unified optical damage criterion based on the probability density distribution of detector signals
Somoskoi, T.; Vass, Cs.; Mero, M.; Mingesz, R.; Bozoki, Z.; Osvay, K.
2013-11-01
Various methods and procedures have been developed to test laser-induced optical damage. The question naturally arises of what the respective sensitivities of these diverse methods are. To make a suitable comparison, the processing of the measured primary signal must be at least similar across the various methods, and a proper damage criterion must be established that is universally applicable to every method. We defined damage criteria based on the probability density distribution of the obtained detector signals, determined by the kernel density estimation procedure. We tested the entire evaluation procedure with four well-known detection techniques: direct observation of the sample by optical microscopy; monitoring of the change in the light-scattering power of the target surface; and detection of the generated photoacoustic waves, both in the bulk of the sample and in the surrounding air.
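The kernel-density-estimation step for a detector-signal distribution can be sketched in a few lines. The two-component signal sample and the Silverman bandwidth rule below are assumptions for illustration, not the paper's data or settings.

```python
import numpy as np

def gaussian_kde_pdf(samples, grid, bandwidth=None):
    """Kernel density estimate with a Gaussian kernel (Silverman's rule by default)."""
    samples = np.asarray(samples, dtype=float)
    if bandwidth is None:
        bandwidth = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    # One Gaussian bump per detector reading, summed and normalized.
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(4)
# Hypothetical scattered-light signals: undamaged shots cluster low, damaged shots high.
signals = np.concatenate([rng.normal(1.0, 0.1, 300), rng.normal(3.0, 0.4, 40)])

grid = np.linspace(0.0, 5.0, 501)
pdf = gaussian_kde_pdf(signals, grid)
print(grid[np.argmax(pdf)])  # main (undamaged) mode of the estimated density
```

A damage criterion in this spirit could, for example, place a threshold where the estimated density between the two modes falls below some fraction of the main peak.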
Irreversible models with Boltzmann–Gibbs probability distribution and entropy production
We analyze irreversible interacting spin models evolving according to a master equation with spin flip transition rates that do not obey detailed balance but obey global balance with a Boltzmann–Gibbs probability distribution. Spin flip transition rates with up–down symmetry are obtained for a linear chain, a square lattice, and a cubic lattice with a stationary state corresponding to the Ising model with nearest neighbor interactions. We show that these irreversible dynamics describes the contact of the system with particle reservoirs that cause a flux of particles through the system. Using a microscopic definition, we determine the entropy production rate of these irreversible models and show that it can be written as a macroscopic bilinear form in the forces and fluxes. Exact expressions for this property are obtained for the linear chain and the square lattice. In this last case the entropy production rate displays a singularity at the phase transition point of the same type as the entropy itself
Federrath, Christoph; Schmidt, Wolfram
2008-01-01
The probability density function (PDF) of the gas density in turbulent supersonic flows is investigated with high-resolution numerical simulations. In a systematic study, we compare the density statistics of compressible turbulence driven by the usually adopted solenoidal forcing (divergence-free) and by compressive forcing (curl-free). Our results are in agreement with studies using solenoidal forcing. However, compressive forcing yields a significantly broader density distribution with standard deviation ~3 times larger at the same rms Mach number. The standard deviation-Mach number relation used in analytical models of star formation is reviewed and a modification of the existing expression is proposed, which takes into account the ratio of solenoidal and compressive modes of the turbulence forcing.
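The standard deviation-Mach number relation discussed here is commonly written σ_s² = ln(1 + b²M²) for s = ln(ρ/ρ₀), with the forcing parameter b ≈ 1/3 for solenoidal and b ≈ 1 for compressive driving; the Mach number below is an illustrative choice.

```python
import numpy as np

def sigma_s(mach, b):
    """Std. dev. of s = ln(rho/rho0) from sigma_s^2 = ln(1 + b^2 * M^2)."""
    return np.sqrt(np.log1p((b * mach) ** 2))

mach = 5.0
print(sigma_s(mach, 1 / 3))  # solenoidal (divergence-free) forcing
print(sigma_s(mach, 1.0))    # compressive (curl-free) forcing: broader PDF

# The linear density-contrast width itself is sigma_rho/rho0 = b*M, so at the
# same Mach number compressive forcing gives ~3x the solenoidal value:
print((1.0 * mach) / ((1 / 3) * mach))
```

This is the relation whose modification (through the forcing-dependent b) the abstract proposes for analytical star-formation models.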
Generating function for particle-number probability distribution in directed percolation
We derive a generic expression for the generating function (GF) of the particle-number probability distribution (PNPD) for a simple reaction diffusion model that belongs to the directed percolation universality class. Starting with a single particle on a lattice, we show that the GF of the PNPD can be written as an infinite series of cumulants taken at zero momentum. This series can be summed up into a complete form at the level of a mean-field approximation. Using renormalization group techniques, we determine logarithmic corrections for the GF at the upper critical dimension. We also find the critical scaling form for the PNPD and check its universality numerically in one dimension. The critical scaling function is found to be universal up to two non-universal metric factors.
Ossenkopf, Volker; Schneider, Nicola; Federrath, Christoph; Klessen, Ralf S
2016-01-01
Probability distribution functions (PDFs) of column densities are an established tool to characterize the evolutionary state of interstellar clouds. Using simulations, we show to what degree their determination is affected by noise, line-of-sight contamination, field selection, and the incomplete sampling in interferometric measurements. We solve the integrals that describe the convolution of a cloud PDF with contaminating sources and study the impact of missing information on the measured column density PDF. The effect of observational noise can be easily estimated and corrected for if the root mean square (rms) of the noise is known. For $\sigma_{noise}$ values below 40% of the typical cloud column density, $N_{peak}$, this involves almost no degradation of the accuracy of the PDF parameters. For higher noise levels and narrow cloud PDFs the width of the PDF becomes increasingly uncertain. A contamination by turbulent foreground or background clouds can be removed as a constant shield if the PDF of the c...
Dai, Mi; Wang, Yun
2016-06-01
In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (PDFs) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the 'joint lightcurve analysis (JLA)' data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter PDFs leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fitting values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.
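The abstract does not specify the sampler internals, but the core idea of drawing samples from a parameter pdf with MCMC can be sketched with a minimal random-walk Metropolis routine. The toy standard-normal "posterior", the step size, and the burn-in length are all assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def metropolis(log_post, x0, n_steps=20000, step=0.5, seed=3):
    # Random-walk Metropolis: propose a Gaussian step, accept with
    # probability min(1, posterior ratio), otherwise stay put.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = np.empty((n_steps, x.size))
    for i in range(n_steps):
        proposal = x + rng.normal(0.0, step, x.size)
        lp_prop = log_post(proposal)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        chain[i] = x
    return chain

# Toy 1-D posterior (standard normal) standing in for a lightcurve-parameter pdf.
chain = metropolis(lambda v: -0.5 * float(np.sum(v**2)), x0=[2.0])
samples = chain[5000:, 0]   # discard burn-in
```

Propagating the whole chain (rather than only the best-fitting point) into the cosmological fit is what distinguishes the sampling approach the abstract advocates from the usual practice.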
EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS
王清远; N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI
2003-01-01
Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. pit nucleation and growth, pit-crack transition, and short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.
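A heavily simplified sketch of the Monte Carlo procedure described above: random initial pit size, pitting current, and a crack-growth constant feed a toy two-stage life model, and the empirical CDF of fatigue life is built from the sample. All distributions and constants are illustrative placeholders, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000

# Illustrative (uncalibrated) scatter in the random inputs:
a0 = rng.lognormal(np.log(5.0), 0.4, n)      # initial pit depth [um]
i_pit = rng.lognormal(np.log(2.0), 0.3, n)   # pitting current [uA]
c_par = rng.lognormal(np.log(1.0), 0.2, n)   # crack-growth constant (scaled)

a_crit = 100.0                               # pit depth at pit-to-crack transition [um]
pit_life = (a_crit**3 - a0**3) / i_pit       # Faraday-type volumetric pit growth
crack_life = 1.0e5 / c_par                   # crude short/long crack propagation term
life = pit_life + crack_life                 # total cycles to failure

# Empirical CDF of fatigue life from the Monte Carlo sample.
life_sorted = np.sort(life)
cdf = np.arange(1, n + 1) / n
```

The paired arrays `life_sorted` and `cdf` give exactly the kind of predicted cumulative distribution that the paper compares against experimental fatigue-life data.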
Wenger, S.J.; Freeman, Mary C.
2008-01-01
Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
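The combination of a zero-inflated abundance distribution with imperfect detection can be sketched as follows: latent abundance N follows a zero-inflated Poisson, and an observed count y arises from binomial detection of the N individuals. The parameter names (`psi`, `lam`, `p`) and the truncation limit are assumptions for illustration.

```python
import math

def zip_pmf(n, psi, lam):
    # Zero-inflated Poisson abundance: the site is unoccupied with probability
    # 1 - psi (count 0); otherwise N ~ Poisson(lam) individuals are present.
    pois = math.exp(-lam) * lam**n / math.factorial(n)
    return (1.0 - psi) * (n == 0) + psi * pois

def observed_count_pmf(y, psi, lam, p, n_max=60):
    # Marginalize the latent true abundance N: each of the N individuals is
    # detected independently with probability p, so y | N ~ Binomial(N, p).
    total = 0.0
    for n in range(y, n_max + 1):
        binom = math.comb(n, y) * p**y * (1.0 - p) ** (n - y)
        total += zip_pmf(n, psi, lam) * binom
    return total
```

For example, `observed_count_pmf(0, psi=0.6, lam=3.0, p=0.5)` mixes two ways of seeing nothing: true absence (the zero-inflation part) and non-detection of present individuals, which is exactly the confounding that replicated counts let the model separate.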
熊峻江; 武哲; 高镇同
2002-01-01
According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed. Then, on the basis of the optimization method of the correlation coefficient, the parameter estimation formulas were derived and the generalized fatigue constant life curve with the reliability level p was given. From the P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived. After that, three sets of tests of LY11 CZ corresponding to different average stresses were carried out in terms of the two-dimensional up-down method. Finally, the methods were used to analyze the test results, and it was found that analyzed results of high precision may be obtained.
Pisek, J.
2012-12-01
The directional distribution of leaves is a primary parameter for determining radiation transmission through the canopy. When inverting canopy transmittance measurements to estimate the leaf area index or foliage clumping, incorrect assumptions on leaf angles may lead to considerable errors. Often a spherical distribution of leaf normals is assumed, i.e., leaf normals are assumed to have no preferred direction in situations where no measurement data are available. The goal of this study is to examine whether a spherical leaf angle distribution and the resulting isotropic G-function (G≡0.5) is indeed a valid assumption for temperate and boreal tree and shrub species. Leaf angle distributions were measured for over 80 deciduous broadleaf species commonly found in temperate and boreal ecoclimatic regions. The leaf inclination angles were obtained by sampling the complete vertical extent of trees and shrubs using a recently introduced technique based on digital photography. It is found that a spherical leaf angle distribution is not a valid assumption for either tree or shrub species in temperate and boreal ecoclimatic regions. Given the influence of the leaf angle distribution on inverting clumping and LAI estimates from canopy transmittance measurements, it is recommended to use a planophile or plagiophile leaf angle distribution as more appropriate for modeling radiation transmission in temperate and boreal ecoclimatic regions when no actual leaf inclination angle measurements are available.
Application of probability distribution functions in the ASTM RBCA framework for use in California
Currently, Environmental Protection Agency (EPA, 1989b) and other conventional methodologies of risk assessment, such as the American Society for Testing and Materials risk-based corrective action (ASTM/RBCA) format, make use of deterministic, or point, numbers in making estimates of risk. The goal of risk assessment is to provide a systematic tool to evaluate hazards and exposures to assist in the management of society's activities. To do this properly, there must be an attempt by the regulator or the responsible party to use information as effectively as possible. The use of historical data and probability distribution functions is a suggested initial approach to dealing with LUFT sites in California, taking into account geophysical, societal, and health-based parameters particular to the State. These parameters may be based on results of the CalLUFT HCA, on California Census information, or on other sources, where appropriate. Because of the limitations involved with the use of point estimates in the ASTM/RBCA format, probability distribution functions can be used to give regulatory personnel and risk managers more understanding of the actual range of risks involved. Such information will allow the risk manager a higher comfort level in dealing with risks and, by detailing the residual risks involved, will allow the potential consequences of decisions to be better known. The above methodology effectively allows the risk manager to choose a level of health risk appropriate for the site, allows for general prioritization with regard to other sites, and removes some of the restrictions in applying remedial action necessitated by MCLs or deterministic risk estimates.
Lee, T. S.; Yoon, S.; Jeong, C.
2012-12-01
The primary purpose of frequency analysis in hydrology is to estimate the magnitude of an event with a given frequency of occurrence. The precision of frequency analysis depends on the selection of an appropriate probability distribution model (PDM) and parameter estimation techniques. A number of PDMs have been developed to describe the probability distribution of the hydrological variables. For each of the developed PDMs, estimated parameters are provided based on alternative estimation techniques, such as the method of moments (MOM), probability weighted moments (PWM), linear function of ranked observations (L-moments), and maximum likelihood (ML). Generally, the results using ML are more reliable than the other methods. However, the ML technique is more laborious than the other methods because an iterative numerical solution, such as the Newton-Raphson method, must be used for the parameter estimation of PDMs. In the meantime, meta-heuristic approaches have been developed to solve various engineering optimization problems (e.g., linear and stochastic, dynamic, nonlinear). These approaches include genetic algorithms, ant colony optimization, simulated annealing, tabu searches, and evolutionary computation methods. Meta-heuristic approaches use a stochastic random search instead of a gradient search so that intricate derivative information is unnecessary. Therefore, the meta-heuristic approaches have been shown to be a useful strategy to solve optimization problems in hydrology. A number of studies focus on using meta-heuristic approaches for estimation of hydrological variables with parameter estimation of PDMs. Applied meta-heuristic approaches offer reliable solutions but use more computation time than derivative-based methods. Therefore, the purpose of this study is to enhance the meta-heuristic approach for the parameter estimation of PDMs by using a recently developed algorithm known as a harmony search (HS). The performance of the HS is compared to the
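A minimal harmony search sketch for the derivative-free parameter estimation described above, here fitting a Gumbel distribution (a common PDM for annual maxima) by minimizing the negative log-likelihood. The memory size, HMCR, PAR, pitch width, bounds, and synthetic data are illustrative choices, not the study's settings.

```python
import numpy as np

def gumbel_nll(params, x):
    # Negative log-likelihood of a Gumbel (maximum) distribution.
    mu, beta = params
    if beta <= 0:
        return np.inf
    z = (x - mu) / beta
    return float(np.sum(np.log(beta) + z + np.exp(-z)))

def harmony_search(obj, bounds, hms=20, hmcr=0.9, par=0.3, iters=2000, seed=1):
    # Harmony search: keep a memory of candidate solutions, improvise new ones
    # from it, and replace the worst member whenever an improvement appears.
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    memory = rng.uniform(lo, hi, size=(hms, len(bounds)))
    costs = np.array([obj(m) for m in memory])
    for _ in range(iters):
        new = np.empty(len(bounds))
        for j in range(len(bounds)):
            if rng.random() < hmcr:                  # memory consideration
                new[j] = memory[rng.integers(hms), j]
                if rng.random() < par:               # pitch adjustment
                    new[j] += rng.normal(0.0, 0.01 * (hi[j] - lo[j]))
            else:                                    # random selection
                new[j] = rng.uniform(lo[j], hi[j])
        new = np.clip(new, lo, hi)
        cost = obj(new)
        worst = int(np.argmax(costs))
        if cost < costs[worst]:
            memory[worst], costs[worst] = new, cost
    return memory[int(np.argmin(costs))]

rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=30.0, scale=8.0, size=500)   # synthetic data
mu_hat, beta_hat = harmony_search(lambda p: gumbel_nll(p, annual_maxima),
                                  bounds=[(0.0, 100.0), (0.1, 50.0)])
```

Note how the stochastic improvisation step needs only objective evaluations, which is the point the text makes: no Newton-Raphson iterations or derivative information are required.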
Size effect on strength and lifetime probability distributions of quasibrittle structures
Zdeněk P Bažant; Jia-Liang Le
2012-02-01
Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for a failure probability below $10^{-6}$-$10^{-7}$ per lifetime. The safety factors required to ensure this are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to the structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation-energy-controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.
Yakovenko, Victor M.
2012-01-01
This Chapter is written for the Festschrift celebrating the 70th birthday of the distinguished economist Duncan Foley from the New School for Social Research in New York. This Chapter reviews applications of statistical physics methods, such as the principle of entropy maximization, to the probability distributions of money, income, and global energy consumption per capita. The exponential probability distribution of wages, predicted by the statistical equilibrium theory of a labor market dev...
In some of the recent probabilistic safety assessments, discrete probability distributions (DPDs) have been developed to express, in a quantitative form, estimates of the uncertainty and conservatism in the point estimate source term values. In the DPD approach, estimates of distributed, discrete factors, which are multipliers on the point estimate values, are made based on available data, calculations, and engineering judgment. Initial application of the DPD approach to source terms for risk analysis was based largely on engineering judgment after review of applicable data. However, in more recent applications of the DPD approach, results from an extensive review of existing experimental data and applied calculations have been factored into the estimates. Programs currently in progress, largely sponsored by the NRC and EPRI, are beginning to yield significant new information upon which to base improved estimates of the magnitude of radionuclide source terms. The most extensive of the reviews of existing data for application to the DPD approach was that performed as part of the risk assessment for the proposed Sizewell B PWR. As part of the Seabrook risk study, DPD values specifically for that plant were developed based on the Sizewell approach. They represent a significant update to the Sizewell DPD values. In addition, DPD values were developed for associated release parameters which also affect the consequence calculations.
Various Models for Pion Probability Distributions from Heavy-Ion Collisions
Mekjian, A Z; Strottman, D D
1998-01-01
Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparisons of the pion laser model with Bose-Einstein condensation in a laser trap and with the thermal model are made. The thermal model and hydrodynamic model are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength $\eta$ of a Poisson emitter and a critical density $\eta_c$ are connected in a thermal model by $\eta/\eta_c = e^{-m/T} < 1$, and this fact reduces any Bose-Einstein correction effects in the number and number fluctuation of pions. Fluctuations can...
A new probability distribution model of turbulent irradiance based on Born perturbation theory
(no author listed)
2010-01-01
The subject of the PDF (probability density function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which denies the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. So, it is considered that the new model exactly reflects the Born perturbation theory. Simulated results prove the accuracy of this new model.
Effect of slope angle of an artificial pool on distributions of turbulence
Atefeh Fazlollahi; Hossein Afzalimehr; Jueyi Sui
2015-01-01
Experiments were carried out over a two-dimensional pool with a constant length of 1.5 m and four different slopes. The distributions of velocity, Reynolds stress and turbulence intensities have been studied in this paper. Results show that as flow continues up the exit slope, the flow velocity increases near the channel bed and decreases near the water surface. Flow separation was not observed by ADV at the crest of the bed-form. In addition, the length of the separation zone increases with increasing entrance and exit slopes. The largest slope angle causes the maximum normalized shear stress. Based on the experiments, it is concluded that the shape of the Reynolds stress distribution is generally dependent on the entrance and exit slopes of the pool, and is affected by both decelerating and accelerating flows. Additionally, with the increase in slope angle, secondary currents develop and become more stable. Results of the quadrant analysis show that the momentum between flow and bed-form is mostly transferred by sweep and ejection events.
Burkhart, Blakesley; Ossenkopf, V.; Lazarian, A.; Stutzki, J.
2013-07-01
We study the effects of radiative transfer on the probability distribution functions (PDFs) of simulations of magnetohydrodynamic turbulence in the widely studied 13CO 2-1 transition. We find that the integrated intensity maps generally follow a log-normal distribution, with the cases that have τ ≈ 1 best matching the PDF of the column density. We fit a two-dimensional variance-sonic Mach number relationship to our logarithmic PDFs of the form $\sigma_{\ln(\Sigma/\Sigma_0)}^2 = A \times \ln(1 + b^2 \mathcal{M}_s^2)$ and find that, for parameter b = 1/3, parameter A depends on the radiative transfer environment. We also explore the variance, skewness, and kurtosis of the linear PDFs finding that higher moments reflect both higher sonic Mach number and lower optical depth. Finally, we apply the Tsallis incremental PDF function and find that the fit parameters depend on both Mach numbers, but also are sensitive to the radiative transfer parameter space, with the τ ≈ 1 case best fitting the incremental PDF of the true column density. We conclude that, for PDFs of low optical depth cases, part of the gas is always subthermally excited so that the spread of the line intensities exceeds the spread of the underlying column densities and hence the PDFs do not reflect the true column density. Similarly, PDFs of optically thick cases are dominated by the velocity dispersion and therefore do not represent the true column density PDF. Thus, in the case of molecules like carbon monoxide, the dynamic range of intensities, structures observed, and, consequently, the observable PDFs are less determined by turbulence and more often determined by radiative transfer effects.
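The quoted variance-Mach relation is easy to evaluate directly; the snippet below assumes only the functional form given in the abstract, with A and b as free parameters.

```python
import numpy as np

def sigma_ln_sq(mach_s, A=1.0, b=1.0 / 3.0):
    # sigma^2_{ln(Sigma/Sigma_0)} = A * ln(1 + b^2 * M_s^2)
    return A * np.log(1.0 + b**2 * mach_s**2)

# Lognormal width grows monotonically with sonic Mach number.
mach = np.array([1.0, 5.0, 10.0])
widths = np.sqrt(sigma_ln_sq(mach))
```

For a lognormal with a fixed mean, the log-mean must compensate the width, <ln(Σ/Σ₀)> = -σ²/2, so a fitted A effectively encodes how strongly the radiative transfer environment broadens or narrows the observable PDF at a given Mach number.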
Multiple-streaming and the Probability Distribution of Density in Redshift Space
Hui, L; Shandarin, S F; Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
1999-01-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple-streaming using the Zel'dovich approximation (ZA), and compute the average number of streams in real and redshift space. It is found that multiple-streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude ($\sigma < 1$). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which operate even when the real-space density field is quite linear, could suppress the classic compression of redshift-structures predicted by linear theory (Kaiser 1987). We also compute using the ZA the probability distribution function (PDF) of density, as well as $S_3$, in real and redshift space, and compare it with the PD...
Probability distribution functions of gas in M31 and M51
Berkhuijsen, Elly M
2015-01-01
We present probability distribution functions (PDFs) of the surface densities of ionized and neutral gas in the nearby spiral galaxies M31 and M51, as well as of dust emission and extinction Av in M31. The PDFs are close to lognormal and those for HI and Av in M31 are nearly identical. However, the PDFs for H2 are wider than the HI PDFs and the M51 PDFs have larger dispersions than those for M31. We use a simple model to determine how the PDFs are changed by variations in the line-of-sight (LOS) pathlength L through the gas, telescope resolution and the volume filling factor of the gas, f_v. In each of these cases the dispersion sigma of the lognormal PDF depends on the variable with a negative power law. We also derive PDFs of mean LOS volume densities of gas components in M31 and M51. Combining these with the volume density PDFs for different components of the ISM in the Milky Way (MW), we find that sigma decreases with increasing length L with an exponent of -0.76 +/- 0.06, which is steeper than expected. ...
Turbulence-Induced Relative Velocity of Dust Particles III: The Probability Distribution
Pan, Liubin; Scalo, John
2014-01-01
Motivated by its important role in the collisional growth of dust particles in protoplanetary disks, we investigate the probability distribution function (PDF) of the relative velocity of inertial particles suspended in turbulent flows. Using the simulation from our previous work, we compute the relative velocity PDF as a function of the friction timescales, tau_p1 and tau_p2, of two particles of arbitrary sizes. The friction time of particles included in the simulation ranges from 0.1 tau_eta to 54 T_L, with tau_eta and T_L the Kolmogorov time and the Lagrangian correlation time of the flow, respectively. The relative velocity PDF is generically non-Gaussian, exhibiting fat tails. For a fixed value of tau_p1, the PDF is the fattest for equal-size particles (tau_p2 ~ tau_p1), and becomes thinner for both tau_p2 << tau_p1 and tau_p2 >> tau_p1. Defining f as the friction time ratio of the smaller particle to the larger one, we find that, at a given f, ... (tau_p >> T_L). These features are successfully explained by the Pan & Padoan model. Usin...
Particle size distribution models of small angle neutron scattering pattern on ferro fluids
The Fe3O4 ferrofluid samples were synthesized by a co-precipitation method. The investigation of the ferrofluid microstructure is known to be one of the most important problems because the presence of aggregates and their internal structure greatly influence the properties of ferrofluids. The size and the size dispersion of particles in the ferrofluids were determined assuming a log-normal distribution of particle radius. The scattering patterns measured by small angle neutron scattering were fitted by the theoretical scattering functions of two limiting models: a log-normal sphere distribution and a fractal aggregate. Two types of particle are detected, which are presumably primary particles of 30 Å in radius and secondary fractal aggregates of 200 Å with a polydispersity of 0.47 up to 0.53.
Wang, Kaiti; Lin, Ching-Huei; Wang, Lu-Yin; Hada, Tohru; Nishimura, Yukitoshi; Turner, Drew L.; Angelopoulos, Vassilis
2014-12-01
Changes in pitch angle distributions of electrons with energies from a few eV to 1 MeV at dipolarization sites in Earth's magnetotail are investigated statistically to determine the extent to which adiabatic acceleration may contribute to these changes. Forty-two dipolarization events from 2008 and 2009 observed by Time History of Events and Macroscale Interactions during Substorms probes covering the inner plasma sheet from 8 RE to 12 RE during geomagnetic activity identified by the AL index are analyzed. The number of observed events with cigar-type distributions (peaks at 0° and 180°) decreases sharply below 1 keV after dipolarization because in many of these events, electron distributions became more isotropized. From above 1 keV to a few tens of keV, however, the observed number of cigar-type events increases after dipolarization and the number of isotropic events decreases. These changes can be related to the ineffectiveness of Fermi acceleration below 1 keV (at those energies, dipolarization time becomes comparable to electron bounce time). Model-calculated pitch angle distributions after dipolarization with the effect of betatron and Fermi acceleration tested indicate that these adiabatic acceleration mechanisms can explain the observed patterns of event number changes over a large range of energies for cigar events and isotropic events. Other factors still need to be considered to assess the observed increase in cigar events around 2 keV. Indeed, preferential directional increase/loss of electron fluxes, which may contribute to the formation of cigar events, was observed. Nonadiabatic processes to accelerate electrons in a parallel direction may also be important for future study.
Clark, G.; Paranicas, C.; Santos-Costa, D.; Livi, S.; Krupp, N.; Mitchell, D. G.; Roussos, E.; Tseng, W.-L.
2014-12-01
We provide a global view of ~20 to 800 keV electron pitch angle distributions (PADs) close to Saturn's current sheet using observations from the Cassini MIMI/LEMMS instrument. Previous work indicated that the nature of pitch angle distributions in Saturn's inner to middle magnetosphere changes near the radial distance of 10 R_S. This work confirms the existence of a PAD transition region. Here we go further and develop a new technique to statistically quantify the spatial profile of butterfly PADs, as well as present new spatial trends in the isotropic PAD. Additionally, we perform a case study analysis and show that the PADs exhibit strong energy-dependent features throughout this transition region. We also present a diffusion theory model based on adiabatic transport, Coulomb interactions with Saturn's neutral gas torus, and an energy-dependent radial diffusion coefficient. A data-model comparison reveals that adiabatic transport is the dominant transport mechanism between ~8 and 12 R_S; however, interactions with Saturn's neutral gas torus become dominant inside ~7 R_S and govern the flux level of ~20 to 800 keV electrons. We have also found that field-aligned fluxes were not well reproduced by our modeling approach. We suggest that wave-particle interactions and/or a polar source of the energetic particles need further investigation.
Okei, K.; Takahashi, N.; Nakatsuka, T.
The Molière simultaneous distribution between the deflection angle and the lateral displacement is derived by applying numerical Fourier transforms to the solution for the frequency distribution acquired through the Kamata-Nishimura formulation of Molière theory. The differences between our result and that under the Gaussian approximation, as well as the basic properties of our distribution, are investigated closely.
The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known. In this context, the asymptotic cost rate has a limited utility. This paper presents the derivation of the probability distribution of maintenance cost, when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, then the discrete Fourier transform of the characteristic function leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
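The inversion step described above, from characteristic function to probability distribution via a discrete Fourier transform, can be sketched on an integer lattice. A Poisson characteristic function stands in for the one the paper derives from its renewal equation; the lattice size and rate are illustrative assumptions.

```python
import numpy as np
from math import exp, factorial

# Sample the characteristic function on t_k = 2*pi*k/N and apply a DFT;
# for a distribution supported on {0, 1, ..., N-1} this recovers the pmf
# exactly (up to wrap-around of any tail mass beyond N-1).
lam = 4.0        # stand-in rate: a Poisson(4) count of renewals/repairs
N = 64
t = 2.0 * np.pi * np.arange(N) / N
phi = np.exp(lam * (np.exp(1j * t) - 1.0))   # Poisson characteristic function
pmf = np.real(np.fft.fft(phi)) / N           # DFT inversion on the lattice

# Direct Poisson pmf for comparison.
direct = np.array([exp(-lam) * lam**k / factorial(k) for k in range(N)])
```

Once the full pmf is in hand, prediction limits follow from the cumulative sum, which is exactly the finite-time information that the asymptotic cost rate alone cannot provide.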
Boots, Nam Kyoo; Shahabuddin, Perwez
2001-01-01
This paper deals with estimating small tail probabilities of the steady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite-horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into th...