Feldman, David V
2008-01-01
We use a probabilistic interpretation of solid angles to generalize the well-known fact that the inner angles of a triangle sum to 180 degrees. For the 3-dimensional case, we show that the sum of the solid inner vertex angles of a tetrahedron T, divided by 2*pi, gives the probability that an orthogonal projection of T onto a random 2-plane is a triangle. More generally, it is shown that the sum of the (solid) inner vertex angles of an n-simplex S, normalized by the area of the unit (n-1)-hemisphere, gives the probability that an orthogonal projection of S onto a random hyperplane is an (n-1)-simplex. Applications to more general polytopes are treated briefly, as is the related Perles-Shephard proof of the classical Gram-Euler relations.
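The tetrahedron case of this result is easy to check numerically. Below is a sketch (my own illustration, not the paper's code) that computes the solid vertex angles with the Van Oosterom–Strackee formula and compares their normalized sum with the empirical fraction of random orthogonal projections that are triangles; the particular tetrahedron and sample size are arbitrary choices.

```python
import math, random

def solid_angle(v, a, b, c):
    # Van Oosterom-Strackee: solid angle at v subtended by triangle (a, b, c)
    r1 = [a[i] - v[i] for i in range(3)]
    r2 = [b[i] - v[i] for i in range(3)]
    r3 = [c[i] - v[i] for i in range(3)]
    def dot(x, y): return sum(x[i] * y[i] for i in range(3))
    def norm(x): return math.sqrt(dot(x, x))
    cross = [r2[1]*r3[2] - r2[2]*r3[1],
             r2[2]*r3[0] - r2[0]*r3[2],
             r2[0]*r3[1] - r2[1]*r3[0]]
    num = abs(dot(r1, cross))
    n1, n2, n3 = norm(r1), norm(r2), norm(r3)
    den = n1*n2*n3 + dot(r1, r2)*n3 + dot(r1, r3)*n2 + dot(r2, r3)*n1
    return 2.0 * math.atan2(num, den)

def is_triangle_projection(pts2d):
    # The shadow of a tetrahedron is a triangle iff one projected vertex
    # lies strictly inside the triangle formed by the other three.
    def sign(p, q, r):
        return (q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])
    for i in range(4):
        tri = [pts2d[j] for j in range(4) if j != i]
        s = [sign(tri[0], tri[1], pts2d[i]),
             sign(tri[1], tri[2], pts2d[i]),
             sign(tri[2], tri[0], pts2d[i])]
        if all(x > 0 for x in s) or all(x < 0 for x in s):
            return True
    return False

random.seed(1)
T = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0.2, 0.3, 1.0]]
angle_sum = sum(solid_angle(T[i], *[T[j] for j in range(4) if j != i])
                for i in range(4))

hits, n = 0, 20000
for _ in range(n):
    # random unit normal, then an orthonormal basis (u, w) of the plane
    while True:
        g = [random.gauss(0, 1) for _ in range(3)]
        gn = math.sqrt(sum(x*x for x in g))
        if gn > 1e-9:
            break
    nv = [x / gn for x in g]
    t = [1, 0, 0] if abs(nv[0]) < 0.9 else [0, 1, 0]
    u = [t[1]*nv[2]-t[2]*nv[1], t[2]*nv[0]-t[0]*nv[2], t[0]*nv[1]-t[1]*nv[0]]
    un = math.sqrt(sum(x*x for x in u))
    u = [x / un for x in u]
    w = [nv[1]*u[2]-nv[2]*u[1], nv[2]*u[0]-nv[0]*u[2], nv[0]*u[1]-nv[1]*u[0]]
    proj = [(sum(p[i]*u[i] for i in range(3)),
             sum(p[i]*w[i] for i in range(3))) for p in T]
    hits += is_triangle_projection(proj)

print(angle_sum / (2 * math.pi), hits / n)  # the two values should agree
```

The two printed numbers estimate the same probability from the two sides of the identity.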
Probability distribution relationships
Directory of Open Access Journals (Sweden)
Yousry Abdelkader
2013-05-01
Full Text Available In this paper we present the best-known probability distributions and their relations to other distributions in a set of collected diagrams. Four diagrams are sketched as networks. The first concerns the continuous distributions and their relations. The second presents the discrete distributions. The third depicts the famous limiting distributions. Finally, the fourth diagram shows the Balakrishnan skew-normal density and its relationship to the other distributions.
Exact Probability Distribution versus Entropy
Directory of Open Access Journals (Sweden)
Kerstin Andersson
2014-10-01
Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
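For small alphabets the average number of guesses under the decreasing-probability strategy can be computed exactly by brute force, which makes the need for approximations at realistic sizes obvious. A sketch (mine, not the paper's code):

```python
import itertools, math

def expected_guesses(letter_probs, word_len):
    """Average number of guesses when words are tried in decreasing
    order of probability (ties broken arbitrarily)."""
    words = itertools.product(range(len(letter_probs)), repeat=word_len)
    probs = sorted((math.prod(letter_probs[c] for c in w) for w in words),
                   reverse=True)
    # expected rank of the true word under the optimal guessing order
    return sum(rank * p for rank, p in enumerate(probs, start=1))

# Skewed 3-letter alphabet, words of length 4: far fewer guesses on average
print(expected_guesses([0.6, 0.3, 0.1], 4))

# Uniform alphabet of size a: every order is optimal, mean is (a**L + 1) / 2
print(expected_guesses([0.25] * 4, 6))
```

The brute force enumerates all a^L words, which is exactly what becomes infeasible for alphabet and word sizes around 100.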
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method requires only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
ASYMPTOTIC QUANTIZATION OF PROBABILITY DISTRIBUTIONS
Institute of Scientific and Technical Information of China (English)
Klaus Pötzelberger
2003-01-01
We give a brief introduction to results on the asymptotics of quantization errors. The topics discussed include the quantization dimension, asymptotic distributions of sets of prototypes, asymptotically optimal quantizations, approximations and random quantizations.
The Multivariate Gaussian Probability Distribution
DEFF Research Database (Denmark)
Ahrendt, Peter
2005-01-01
This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical ...
Pre-aggregation for Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....
Pre-Aggregation with Probability Distributions
DEFF Research Database (Denmark)
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
Probability Distributions for a Surjective Unimodal Map
Institute of Scientific and Technical Information of China (English)
Hongyan SUN; Long WANG
1996-01-01
In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types (δ-function, asymmetric and symmetric) by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.
Asymmetry of the work probability distribution
Saha, Arnab; Bhattacharjee, J. K.
2006-01-01
We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.
The Pauli equation for probability distributions
International Nuclear Information System (INIS)
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
The Pauli equation for probability distributions
Energy Technology Data Exchange (ETDEWEB)
Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man'ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man'ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it
2001-04-27
The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)
Qualitative criterion for atom sputtering angle distributions
International Nuclear Information System (INIS)
A model is introduced to explain the shape of atom polar emission angle distributions for monocomponent targets sputtered by normally incident keV-energy ions. Analytical expressions are obtained from the model which make it possible to identify three known kinds of angle distributions (subcosinus, isotropic and supracosinus) for given ion energies and target-ion pairs. Furthermore a fourth, hybrid false-isotropic distribution is found, which is a superposition of supracosinus and subcosinus distributions. The theoretical predictions of the angle distribution shapes agree with numerical modeling of the sputtering of carbon and platinum by 0.1-10 keV Ar+ ions
The Pauli Equation for Probability Distributions
Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.
2000-01-01
The "marginal" distributions for a measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for a spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.
The Pauli Equation for Probability Distributions
Mancini, S; Man'ko, V I; Tombesi, P
2001-01-01
The "marginal" distributions for a measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for a spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.
Learning a Probability Distribution Efficiently and Reliably
Laird, Philip; Gamble, Evan
1988-01-01
A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
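The abstract does not reproduce the algorithm, but its core idea, sampling a finite distribution by inverting its cumulative distribution function, is standard; here is a hedged sketch of that inversion step (names and structure are mine, not Laird and Gamble's, and the learning side, estimating the CDF from data to a given accuracy and confidence, is omitted):

```python
import bisect, itertools, random

def make_cdf(probs):
    # cumulative distribution over outcomes 0 .. len(probs)-1
    return list(itertools.accumulate(probs))

def sample(cdf, rng):
    # invert the CDF: first index whose cumulative mass exceeds u
    return bisect.bisect_left(cdf, rng.random())

rng = random.Random(0)
probs = [0.5, 0.25, 0.125, 0.125]
cdf = make_cdf(probs)
counts = [0] * len(probs)
for _ in range(100_000):
    counts[sample(cdf, rng)] += 1
freqs = [c / 100_000 for c in counts]
print(freqs)  # close to [0.5, 0.25, 0.125, 0.125]
```

With an empirical CDF estimated from data in place of `probs`, the same inversion reproduces the learned distribution.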
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to "Sub-committee on st......
Scaling of misorientation angle distributions
DEFF Research Database (Denmark)
Hughes, D.A.; Chrzan, D.C.; Liu, Q.;
1998-01-01
for the small to large strain regimes for aluminum, 304L stainless steel, nickel, and copper (taken from the literature) appear to be identical. Hence the distributions may be "universal." These results have significant implications for the development of dislocation-based deformation models.
Evolution of the jet opening angle distribution in holographic plasma
Rajagopal, Krishna; van der Schee, Wilke
2016-01-01
We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as for jets in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates; (ii) the opening angle distribution for jets emerging from the plasma within any specified range of energies is pushed toward smaller angles, compared to pp jets with the same energies. The second effect arises because small-angle jets suffer less energy loss and because jets with a higher initial energy are less probable in the ensemble. We illustrate both effects in a simple two-parameter model, and find that their consequence in sum is that the opening angle distribution for jets in any range of energies contains fewer narrow and wide jets. Either effect can dominate in the mean opening angle, for not unreasonable values o...
Using L/E Oscillation Probability Distributions
Aguilar-Arevalo, A A; Bugel, L; Cheng, G; Church, E D; Conrad, J M; Dharmapalan, R; Djurcic, Z; Finley, D A; Ford, R; Garcia, F G; Garvey, G T; Grange, J; Huelsnitz, W; Ignarra, C; Imlay, R; Johnson, R A; Karagiorgi, G; Katori, T; Kobilarcik, T; Louis, W C; Mariani, C; Marsh, W; Mills, G B; Mirabal, J; Moore, C D; Mousseau, J; Nienaber, P; Osmanov, B; Pavlovic, Z; Perevalov, D; Polly, C C; Ray, H; Roe, B P; Russell, A D; Shaevitz, M H; Spitz, J; Stancu, I; Tayloe, R; Van de Water, R G; White, D H; Wickremasinghe, D A; Zeller, G P; Zimmerman, E D
2014-01-01
This paper explores the use of $L/E$ oscillation probability distributions to compare experimental measurements and to evaluate oscillation models. In this case, $L$ is the distance of neutrino travel and $E$ is a measure of the interacting neutrino's energy. While comparisons using allowed and excluded regions for oscillation model parameters are likely the only rigorous method for these comparisons, the $L/E$ distributions are shown to give qualitative information on the agreement of an experiment's data with a simple two-neutrino oscillation model. In more detail, this paper also outlines how the $L/E$ distributions can be best calculated and used for model comparisons. Specifically, the paper presents the $L/E$ data points for the final MiniBooNE data samples and, in the Appendix, explains and corrects the mistaken analysis published by the ICARUS collaboration.
Generating pseudo-random discrete probability distributions
Energy Technology Data Exchange (ETDEWEB)
Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica
2015-08-15
The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
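One well-known route to an unbiased probability vector, i.e. a uniform sample from the (d-1)-simplex, is to normalize iid standard exponentials; whether this coincides exactly with the paper's "normalization method" is an assumption on my part, and the sketch below is my own:

```python
import math, random

def random_prob_vector(d, rng):
    # Normalizing iid Exp(1) variates gives a uniform (Dirichlet(1,...,1))
    # sample from the simplex; naively normalizing iid uniforms does NOT.
    e = [-math.log(1.0 - rng.random()) for _ in range(d)]
    s = sum(e)
    return [x / s for x in e]

rng = random.Random(42)
d, trials = 5, 50_000
mean = [0.0] * d
for _ in range(trials):
    p = random_prob_vector(d, rng)
    for i in range(d):
        mean[i] += p[i] / trials
print(mean)  # each coordinate should be near 1/d = 0.2
```

By symmetry every coordinate has expectation 1/d, which the Monte Carlo average confirms.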
Probability Distribution for Flowing Interval Spacing
International Nuclear Information System (INIS)
The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fractured spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Scoring Rules for Subjective Probability Distributions
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
Evolution of the Jet Opening Angle Distribution in Holographic Plasma.
Rajagopal, Krishna; Sadofyev, Andrey V; van der Schee, Wilke
2016-05-27
We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates and (ii) because wide-angle jets lose more energy, energy loss combined with the steeply falling perturbative spectrum serves to filter wide jets out of the ensemble at any given energy. Even though every jet widens, jets with a given energy can have a smaller mean opening angle after passage through the plasma than jets with that energy would have had in vacuum, as experimental data may indicate. PMID:27284647
Polarization Mode Dispersion Probability Distribution for Arbitrary Mode Coupling
Institute of Scientific and Technical Information of China (English)
Anonymous
2003-01-01
The probability distribution of the differential group delay for arbitrary mode coupling is simulated with the Monte Carlo method. Fitting the simulation results, we obtain the probability distribution function for arbitrary mode coupling.
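In the strong-coupling limit the differential group delay is known to follow a Maxwellian distribution; a crude Monte Carlo in that spirit sums many randomly oriented birefringent-segment vectors and checks a Maxwellian moment ratio. The segment model and parameters below are my own simplification (in particular it ignores the rotation of the accumulated PMD vector in the full concatenation rule), not the paper's simulation:

```python
import math, random

def dgd_sample(n_segments, dtau, rng):
    # approximate the PMD vector as a sum of n_segments independent,
    # randomly oriented vectors of length dtau; return its magnitude
    v = [0.0, 0.0, 0.0]
    for _ in range(n_segments):
        z = 2.0 * rng.random() - 1.0            # uniform direction on sphere
        phi = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        v[0] += dtau * r * math.cos(phi)
        v[1] += dtau * r * math.sin(phi)
        v[2] += dtau * z
    return math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)

rng = random.Random(7)
samples = [dgd_sample(50, 1.0, rng) for _ in range(20_000)]
mean = sum(samples) / len(samples)
rms = math.sqrt(sum(x * x for x in samples) / len(samples))
# Maxwellian prediction: mean / rms = sqrt(8 / (3*pi)) ~ 0.921
print(mean / rms)
```

By the central limit theorem the three components of the summed vector become Gaussian, so the magnitude approaches a Maxwellian, the expected strong-coupling result.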
How to Read Probability Distributions as Statements about Process
Frank, Steven A.
2014-01-01
Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
Eliciting Subjective Probability Distributions with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
2015-01-01
We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment....
Directory of Open Access Journals (Sweden)
Xiao Liu
2016-01-01
Full Text Available Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that accounts for the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.
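The quantities in the abstract (LOS rate, closing speed, proportional navigation) are standard; a minimal planar true-PN sketch against a non-maneuvering target is below. The geometry, gain, and the absence of any bias term are my own assumptions, so this is plain PN for orientation, not the paper's BPN law:

```python
import math

def pn_intercept(N=4.0, dt=0.001):
    # planar engagement: missile applies lateral accel a = N * Vc * lambda_dot
    mx, my, mv, mhead = 0.0, 0.0, 400.0, 0.0       # missile position/speed/heading
    tx, ty, tvx, tvy = 10_000.0, 0.0, 0.0, 50.0    # crossing target
    miss = float("inf")
    for _ in range(100_000):
        rx, ry = tx - mx, ty - my
        r = math.hypot(rx, ry)
        miss = min(miss, r)
        if r < 1.0:
            break
        mvx, mvy = mv * math.cos(mhead), mv * math.sin(mhead)
        vrx, vry = tvx - mvx, tvy - mvy
        lam_dot = (rx * vry - ry * vrx) / (r * r)  # LOS angle rate
        vc = -(rx * vrx + ry * vry) / r            # closing speed
        a_cmd = N * vc * lam_dot                   # PN acceleration command
        mhead += (a_cmd / mv) * dt                 # turn; speed held constant
        mx += mvx * dt; my += mvy * dt
        tx += tvx * dt; ty += tvy * dt
    return miss

print(pn_intercept())  # small closest-approach distance -> intercept
```

A biased law like the paper's adds a shaping term to `a_cmd` to enforce the terminal angle and LOS-rate constraints; the structure of the loop is otherwise the same.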
Semi-stable distributions in free probability theory
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through the free cumulant transform and develop free semi-stability and the domain of semi-stable attraction in free probability theory.
Probability plots based on student’s t-distribution
Hooft, R.W.W.; Straver, L.H.; Spek, A.L.
2009-01-01
The validity of the normal distribution as an error model is commonly tested with a (half) normal probability plot. Real data often contain outliers. The use of t-distributions in a probability plot to model such data more realistically is described. It is shown how a suitable value of the parameter
Probability distributions in risk management operations
Artikis, Constantinos
2015-01-01
This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...
Distribution of angles in hyperbolic lattices
DEFF Research Database (Denmark)
Risager, Morten Skarsholm; Truelsen, Jimi Lee
2010-01-01
We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result from the study by Boca.
Distribution of Angles in Hyperbolic Lattices
DEFF Research Database (Denmark)
S. Risager, Morten; L. Truelsen, Jimi
2008-01-01
We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result due to F. P. Boca.
Negative Binomial and Multinomial States: probability distributions and coherent states
Fu, Hong-Chen; Sasaki, Ryu
1996-01-01
Following the relationship between probability distribution and coherent states, for example the well-known Poisson distribution and the ordinary coherent states and the relatively less known one of the binomial distribution and the su(2) coherent states, we propose an "interpretation" of su(1,1) and su(r,1) coherent states "in terms of probability theory". They will be called the "negative binomial" ("multinomial") "states", which correspond to the "negative" binomial (multinomial)...
Some explicit expressions for the probability distribution of force magnitude
Indian Academy of Sciences (India)
Saralees Nadarajah
2008-08-01
Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the first time, I provide explicit expressions for the probability distribution of the force magnitude. Both two-dimensional and three-dimensional cases are considered.
International Nuclear Information System (INIS)
Using the analogy between Brownian motion and quantum mechanics, we study the winding angle θ of planar Brownian curves around a given point, say the origin O. In particular, we compute the characteristic function for the probability distribution of θ and recover Spitzer's law in the limit of infinitely large times. Finally, we study the (large) change in the winding angle distribution when we add a repulsive potential at the origin
How Can Histograms Be Useful for Introducing Continuous Probability Distributions?
Derouet, Charlotte; Parzysz, Bernard
2016-01-01
The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Energy Technology Data Exchange (ETDEWEB)
Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)
2014-06-19
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented together with two solution strategies: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are used in the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution
Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati
2014-06-01
This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented together with two solution strategies: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are used in the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
Information-theoretic methods for estimating of complicated probability distributions
Zong, Zhi
2006-01-01
Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur
Net baryon number probability distribution near the chiral phase transition
Morita, Kenji; Skokov, Vladimir; Friman, Bengt; Redlich, Krzysztof
2014-01-01
We discuss the properties of the net baryon number probability distribution near the chiral phase transition to explore the effect of critical fluctuations. Our studies are performed within Landau theory, where the coefficients of the polynomial potential are parametrized so as to reproduce the mean-field (MF), Z(2), and O(4) scaling behaviors of the cumulants of the net baryon number. We show that in the critical region the structure of the probability distribution changes, dependi...
Application-dependent Probability Distributions for Offshore Wind Speeds
Morgan, E. C.; Lackner, M.; Vogel, R. M.; Baise, L. G.
2010-12-01
The higher wind speeds of the offshore environment make it an attractive setting for future wind farms. With sparser field measurements, the theoretical probability distribution of short-term wind speeds becomes more important in estimating values such as average power output and fatigue load. While previous studies typically compare the accuracy of probability distributions using R2, we show that validation based on this metric is not consistent with validation based on engineering parameters of interest, namely turbine power output and extreme wind speed. Thus, in order to make the most accurate estimates possible, the probability distribution that an engineer picks to characterize wind speeds should depend on the design parameter of interest. We introduce the Kappa and Wakeby probability distribution functions to wind speed modeling, and show that these two distributions, along with the Biweibull distribution, fit wind speed samples better than the more widely accepted Weibull and Rayleigh distributions based on R2. Additionally, out of the 14 probability distributions we examine, the Kappa and Wakeby give the most accurate and least biased estimates of turbine power output. The fact that the 2-parameter Lognormal distribution estimates extreme wind speeds (i.e. fits the upper tail of wind speed distributions) with least error indicates that no single distribution performs satisfactorily for all applications. Our use of a large dataset composed of 178 buoys (totaling ~72 million 10-minute wind speed observations) makes these findings highly significant, both in terms of large sample size and broad geographical distribution across various wind regimes. Figure caption: Boxplots of R2 from the fit of each of the 14 distributions to the 178 buoy wind speed samples; distributions are ranked from left to right by ascending median R2, with the Biweibull having the closest median to 1.
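The R2-based ranking described above can be sketched with common scipy distributions. The Kappa, Wakeby and Biweibull distributions of the paper are not available in scipy, so Weibull, Rayleigh and lognormal stand in, and a synthetic sample replaces the buoy data; this is an illustrative sketch, not the paper's analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for 10-minute wind speeds (m/s); not the buoy data.
speeds = stats.weibull_min.rvs(2.1, scale=9.0, size=5000, random_state=rng)

def qq_r2(dist, data):
    """R^2 of the quantile-quantile fit, the metric used to rank distributions."""
    params = dist.fit(data, floc=0)          # location fixed at 0 for speeds
    probs = (np.arange(1, len(data) + 1) - 0.5) / len(data)
    theo = dist.ppf(probs, *params)
    emp = np.sort(data)
    ss_res = np.sum((emp - theo) ** 2)
    ss_tot = np.sum((emp - emp.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

for dist in (stats.weibull_min, stats.rayleigh, stats.lognorm):
    print(f"{dist.name:12s} R2 = {qq_r2(dist, speeds):.4f}")
```

As the abstract warns, a high R2 on the bulk of the sample says little about the upper tail that drives extreme wind speed estimates.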
Fitness Probability Distribution of Bit-Flip Mutation.
Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique
2015-01-01
Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
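For the easy linear problem named above (Onemax), the exact fitness distribution under uniform bit-flip mutation reduces to a convolution of two binomials, which can be sketched directly; the parameters below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy import stats

def onemax_mutation_pmf(n, f, p):
    """Exact pmf of the Onemax fitness after uniform bit-flip mutation with
    rate p, for a length-n parent of fitness f.  Flips among the f one-bits
    (Bin(f, p)) lower the fitness; flips among the n-f zero-bits raise it."""
    down = stats.binom.pmf(np.arange(f + 1), f, p)
    up = stats.binom.pmf(np.arange(n - f + 1), n - f, p)
    pmf = np.zeros(n + 1)
    for i, pi in enumerate(down):
        for j, pj in enumerate(up):
            pmf[f - i + j] += pi * pj
    return pmf

pmf = onemax_mutation_pmf(n=10, f=6, p=0.1)
print(pmf.round(4))
```

Each entry of the pmf is a polynomial in p, consistent with the general result the paper proves via Krawtchouk polynomials.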
Most probable degree distribution at fixed structural entropy
Indian Academy of Sciences (India)
Ginestra Bianconi
2008-06-01
The structural entropy is the entropy of the ensemble of uncorrelated networks with a given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (or half-edges) randomly through the nodes of the network while keeping the structural entropy fixed. This degree distribution is found to decay as a Poisson distribution when the entropy is maximized and to have a power-law tail with an exponent approaching 2 when the entropy is minimized.
ProbOnto: ontology and knowledge base of probability distributions
Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala
2016-01-01
Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608
International Nuclear Information System (INIS)
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
Evidence for Truncated Exponential Probability Distribution of Earthquake Slip
Thingbaijam, Kiran K. S.
2016-07-13
Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
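The truncated exponential fit described above can be sketched as a one-parameter maximum-likelihood estimate; a hypothetical synthetic slip sample stands in for the SRCMOD data, and the score equation follows from the truncated-exponential log-likelihood:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)

def fit_truncexp_rate(slip, s_max):
    """MLE of the rate lam of a truncated exponential on [0, s_max]:
    pdf(x) = lam * exp(-lam*x) / (1 - exp(-lam*s_max)).
    Setting the score to zero gives 1/lam - s_max/(exp(lam*s_max)-1) = mean(slip)."""
    xbar = slip.mean()
    def score(lam):
        return 1.0 / lam - s_max / np.expm1(lam * s_max) - xbar
    return brentq(score, 1e-8, 50.0)

# Synthetic slip sample (m) from a known truncated exponential, via inverse CDF.
lam_true, s_max = 0.8, 10.0
u = rng.uniform(size=20000)
slip = -np.log1p(-u * (-np.expm1(-lam_true * s_max))) / lam_true
print(fit_truncexp_rate(slip, s_max))
```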
Applications of the Dirichlet distribution to forensic match probabilities.
Lange, K
1995-01-01
The Dirichlet distribution provides a convenient conjugate prior for Bayesian analyses involving multinomial proportions. In particular, allele frequency estimation can be carried out with a Dirichlet prior. If data from several distinct populations are available, then the parameters characterizing the Dirichlet prior can be estimated by maximum likelihood and then used for allele frequency estimation in each of the separate populations. This empirical Bayes procedure tends to moderate extreme multinomial estimates based on sample proportions. The Dirichlet distribution can also be employed to model the contributions from different ancestral populations in computing forensic match probabilities. If the ancestral populations are in genetic equilibrium, then the product rule for computing match probabilities is valid conditional on the ancestral contributions to a typical person of the reference population. This fact facilitates computation of match probabilities and tight upper bounds to match probabilities.
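The conjugate-prior moderation described above can be sketched as follows; the counts and the uniform Dirichlet prior are illustrative assumptions, not values from the paper:

```python
import numpy as np

def posterior_allele_freqs(counts, alpha):
    """Posterior mean allele frequencies under a Dirichlet(alpha) prior.
    The Dirichlet is conjugate to the multinomial, so the posterior is
    Dirichlet(alpha + counts) with mean (alpha_i + x_i) / (sum(alpha) + n)."""
    counts = np.asarray(counts, float)
    alpha = np.asarray(alpha, float)
    return (alpha + counts) / (alpha.sum() + counts.sum())

# A small sample where one allele was never observed: the raw estimate 0
# is moderated toward the prior rather than taken at face value.
counts = [18, 2, 0]          # observed alleles in a sample of 20
alpha = [1.0, 1.0, 1.0]      # illustrative uniform prior (an assumption)
print(posterior_allele_freqs(counts, alpha))
```

This is the moderation of extreme multinomial estimates that the abstract refers to: the unobserved third allele receives a small positive frequency instead of zero.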
NORMALLY DISTRIBUTED PROBABILITY MEASURE ON THE METRIC SPACE OF NORMS
Institute of Scientific and Technical Information of China (English)
Á.G. HORVÁTH
2013-01-01
In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, we first introduce the notion of the thinness of a body. We then show the existence of a measure with the property that its pushforward by the thinness function is a probability measure with a truncated normal distribution. Finally, we improve this method to find a measure satisfying some important properties in geometric measure theory.
Probability distributions for Poisson processes with pile-up
Sevilla, Diego J R
2013-01-01
In this paper, two parametric probability distributions capable of describing the statistics of X-ray photon detection by a CCD are presented. They are formulated from simple models that account for the pile-up phenomenon, in which two or more photons are counted as one. These models are based on the Poisson process, but they have an extra parameter, which subsumes all the detailed mechanisms of the pile-up process and must be fitted to the data statistics simultaneously with the rate parameter. The new probability distributions, one for the number of counts per time bin (Poisson-like) and the other for waiting times (exponential-like), are tested by fitting them to statistics of real data and against each other through numerical simulations, and their results are analyzed and compared. The probability distributions presented here can be used as background statistical models to derive likelihood functions for statistical methods in signal analysis.
Uniform distribution of initial states: The physical basis of probability
Zhang Kechen
1990-02-01
For repetitive experiments performed on a deterministic system with initial states restricted to a certain region in phase space, the relative frequency of an event has a definite value, insensitive to the preparation of the experiments, only if the initial states leading to that event are distributed uniformly in the prescribed region. Mechanical models of coin tossing and roulette spinning, and the equal a priori probability hypothesis in statistical mechanics, are considered in the light of this principle. Probabilities that arise from uniform distributions of initial states do not necessarily obey Kolmogorov's axioms of probability. In the finite-dimensional case, a uniform distribution in phase space, either in the coarse-grained sense or in the limit sense, can be formulated in a unified way.
Assigning probability distributions to input parameters of performance assessment models
Energy Technology Data Exchange (ETDEWEB)
Mishra, Srikanta [INTERA Inc., Austin, TX (United States)
2002-02-01
This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically,three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
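Two of the fitting techniques named above, maximum likelihood and the method of moments, can be sketched for a lognormal input parameter. The data are synthetic, not values from the Yucca Mountain study; the moment-matching formulas are the standard lognormal mean/variance relations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = stats.lognorm.rvs(s=0.5, scale=np.exp(1.0), size=4000, random_state=rng)

# Maximum likelihood: for the lognormal, the MLEs are simply the mean and
# standard deviation of log(data).
logs = np.log(data)
mu_mle, sigma_mle = logs.mean(), logs.std()

# Method of moments: match the sample mean m and variance v, using
# mean = exp(mu + sigma^2/2) and var = (exp(sigma^2) - 1) * exp(2*mu + sigma^2).
m, v = data.mean(), data.var()
sigma2_mom = np.log(1.0 + v / m**2)
mu_mom = np.log(m) - sigma2_mom / 2.0
print(mu_mle, sigma_mle, mu_mom, np.sqrt(sigma2_mom))
```

With a well-behaved sample the two techniques agree closely; the report's point is that the choice matters more when data are scarce or censored.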
Augmenting momentum resolution with well tuned probability distributions
Landi, Gregorio
2016-01-01
The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimates, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard (least squares) fits using two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase in magnetic field and signal-to-noise ratio required for the standard-fit reconstructions to overlap with those of our best distributions. The data and the simulations are tuned to the tracker of a running experiment and its double-sided microstrip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor of 1.5 for the least squares based on the eta-a...
Probability distribution of extreme share returns in Malaysia
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to investigate suitable probability distributions to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitted distributions to represent the weekly and monthly maximum share returns, respectively, in the Malaysian stock market during the studied period.
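Fitting an extreme value distribution to block maxima can be sketched as follows; scipy's maximum-likelihood fit stands in for the L-moments estimation used in the paper, and the returns are synthetic, not Bursa Malaysia data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Stand-in for weekly maxima of daily returns: block maxima of normal noise.
daily = rng.normal(0.0, 0.01, size=(600, 5))   # 600 weeks x 5 trading days
weekly_max = daily.max(axis=1)

# MLE fit of the GEV (the paper itself uses L-moments, which scipy lacks).
shape, loc, scale = stats.genextreme.fit(weekly_max)
print(shape, loc, scale)

# Kolmogorov-Smirnov goodness of fit, one of the usual criteria.
ks = stats.kstest(weekly_max, "genextreme", args=(shape, loc, scale))
print(ks.pvalue)
```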
Contact pressure distribution and support angle optimization of kiln tyre
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
According to the shearing force characteristics and the deformation coordination condition of the shell at the supports, mathematical models were set up to calculate the contact angle and the contact pressure distribution between tyre and shell, and formulae for the bending moment and bending stress of the tyre were obtained. Taking the maximum tyre fatigue life as the optimization objective, an optimization model for the tyre support angle was built. The computational results show that with the traditional support angle of 30°, tyre life is far shorter than with the optimal angle of 35.6°, so stipulating a 30° support angle in traditional design is unsuitable. The larger the load, the smaller the increment in the nominal stress amplitude of the tyre, and the more favorable the tyre fatigue life, when the support angle is optimal.
Probability Measure of Navigation Pattern Prediction using Poisson Distribution Analysis
Directory of Open Access Journals (Sweden)
Dr.V.Valli Mayil
2012-06-01
Full Text Available The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web usage mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data collected by web servers contain information about every click users make on the web documents of the site. This log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test the performance of the web application. The analysis of large volumes of clickstream data demands the employment of data mining methods. Conducting data mining on web server logs involves determining frequently occurring access sequences. The Poisson distribution gives the frequency probability of specific events when the average probability of a single occurrence is known; it is a discrete distribution, used in this paper to find the frequency probability with which a particular page is visited by the user.
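The Poisson frequency-probability computation described above can be sketched in a few lines; the per-session visit rate is an illustrative assumption, not a figure from the paper:

```python
from scipy import stats

# If a page is visited on average 4 times per session (illustrative rate),
# the Poisson model gives the frequency probability of exactly k visits.
lam = 4.0
p_exactly_3 = stats.poisson.pmf(3, lam)   # P(exactly 3 visits)
p_at_most_3 = stats.poisson.cdf(3, lam)   # P(at most 3 visits)
print(p_exactly_3, p_at_most_3)
```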
Log-concave Probability Distributions: Theory and Statistical Testing
DEFF Research Database (Denmark)
An, Mark Yuing
1996-01-01
This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...
On Probability Distributions for Trees: Representations, Inference and Learning
Denis, François; Gilleron, Rémi; Tommasi, Marc; Gilbert, Édouard
2008-01-01
We study probability distributions over free algebras of trees. Probability distributions can be seen as particular (formal power) tree series [Berstel et al 82, Esik et al 03], i.e. mappings from trees to a semiring K. A widely studied class of tree series is the class of rational (or recognizable) tree series, which can be defined either in an algebraic way or by means of multiplicity tree automata. We argue that the algebraic representation is very convenient for modelling probability distributions over a free algebra of trees. First, as in the string case, the algebraic representation makes it possible to design learning algorithms for the whole class of probability distributions defined by rational tree series. Note that learning algorithms for rational tree series correspond to learning algorithms for weighted tree automata where both the structure and the weights are learned. Second, the algebraic representation can easily be extended to deal with unranked trees (like XML trees, where a symbol may have an unbounded num...
Probability distribution function for a solid with vacancies
Metlov, Leonid S.
2011-01-01
An expression for the probability distribution is obtained that takes into account the presence and removal of degeneracy of the microstates. Its application makes it possible to describe the melting of solids as a saltatory (first-order) phase transition without invoking the concept of an order parameter.
Energy Technology Data Exchange (ETDEWEB)
Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)
1996-03-01
A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions
Lancic, Alen; Antulov-Fantulin, Nino; Sikic, Mile; Stefancic, Hrvoje
2009-01-01
Disease spreading on complex networks is studied within the SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on an m-ary tree is investigated both analytically a...
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Varshney, Lav R.; GOYAL, VIVEK K.
2011-01-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bay...
Testing for the maximum cell probabilities in multinomial distributions
Institute of Scientific and Technical Information of China (English)
XIONG; Shifeng
2005-01-01
This paper investigates one-sided hypothesis testing for p[1], the largest cell probability of a multinomial distribution. The small-sample test of Ethier (1982) is extended to the general case. Based on an estimator of p[1], a kind of large-sample test is proposed. The asymptotic power of the above tests under local alternatives is derived. An example is presented at the end of this paper.
Outage probability of distributed beamforming with co-channel interference
Yang, Liang
2012-03-01
In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.
Analytical theory of the probability distribution function of structure formation
Anderson, Johan; Kim, Eun-Jin
2009-01-01
The probability distribution function (PDF) tails of zonal flow structure formation and the PDF tails of the momentum flux, incorporating the effect of a shear flow in ion-temperature-gradient (ITG) turbulence, are computed in the present paper. The bipolar vortex soliton (modon) is assumed to be the coherent structure responsible for the bursty and intermittent events driving the PDF tails. It is found that stronger zonal flows are generated in ITG turbulence than in Hasegawa-Mima (HM) turbulence as w...
Estimating probable flaw distributions in PWR steam generator tubes
Energy Technology Data Exchange (ETDEWEB)
Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)
1997-02-01
This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.
The Probability Distribution Model of Wind Speed over East Malaysia
Directory of Open Access Journals (Sweden)
Nurulkamal Masseran
2013-07-01
Full Text Available Many studies have found that wind speed is the most significant parameter of wind power. Thus, an accurate determination of the probability distribution of wind speed is an important step before estimating the wind energy potential over a particular region. Utilizing an accurate distribution will minimize the uncertainty in wind resource estimates and improve the site assessment phase of planning. In general, different regions have different wind regimes, so it is reasonable to expect that different wind distributions will be found for different regions. Accordingly, nine different statistical distributions have been fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia for the period from 2000 to 2009. The values of the Kolmogorov-Smirnov statistic, Akaike's Information Criterion, Bayesian Information Criterion and the R2 correlation coefficient were compared across the distributions to determine the best fit for describing the observed data. A good fit for most of the stations in East Malaysia was found using the Gamma and Burr distributions, though no clear pattern was observed across all regions of East Malaysia. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.
Methods for fitting a parametric probability distribution to most probable number data.
Williams, Michael S; Ebel, Eric D
2012-07-01
Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganisms per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two...
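The MPN technique itself, which underlies both proposed methods, can be sketched as a maximum-likelihood estimate from dilution-series counts; the Poisson tube-positivity model is the standard MPN assumption, and the 5-tube design and counts below are illustrative, not data from the paper:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mpn_estimate(volumes, n_tubes, positives):
    """MLE of concentration lam (organisms/mL) from a dilution series:
    under the Poisson model, each tube of volume v is positive with
    probability 1 - exp(-lam * v)."""
    v = np.asarray(volumes, float)
    n = np.asarray(n_tubes, float)
    k = np.asarray(positives, float)
    def negloglik(log_lam):
        lam = np.exp(log_lam)
        p = -np.expm1(-lam * v)                  # 1 - exp(-lam*v)
        return -np.sum(k * np.log(p) + (n - k) * (-lam * v))
    res = minimize_scalar(negloglik, bounds=(-10, 10), method="bounded")
    return np.exp(res.x)

# Classic 3-dilution design: 10, 1 and 0.1 mL, five tubes each.
print(mpn_estimate([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 1]))
```

Optimizing over log(lam) keeps the search stable across the wide range of plausible concentrations.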
Research on probability distribution of port cargo throughput
Institute of Scientific and Technical Information of China (English)
SUN Liang; TAN De-rong
2008-01-01
In order to more accurately examine developing trends in gross cargo throughput, we have modeled the probability distribution of cargo throughput. Gross cargo throughput is determined by the time spent by cargo ships in the port and the operating efficiency of handling equipment. Gross cargo throughput is the sum of all compound variables determining each aspect of cargo throughput for every cargo ship arriving at the port. Probability distribution was determined using the Wald equation. The results show that the variability of gross cargo throughput primarily depends on the different times required by different cargo ships arriving at the port. This model overcomes the shortcoming of previous models: inability to accurately determine the probability of a specific value of future gross cargo throughput. Our proposed model of cargo throughput depends on the relationship between time required by a cargo ship arriving at the port and the operational capacity of handling equipment at the port. At the same time, key factors affecting gross cargo throughput are analyzed. In order to test the efficiency of the model, the cargo volume of a port in Shandong Province was used as an example. In the case study the actual results matched our theoretical analysis.
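The Wald equation used above, which states that the expected sum over a random number of arrivals equals the expected number of arrivals times the expected contribution per arrival, can be checked with a quick simulation; the arrival rate and tonnage distribution are illustrative assumptions, not figures from the Shandong case study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Daily throughput S is the sum, over a random number N of arriving ships,
# of iid per-ship tonnages X.  Wald's equation gives E[S] = E[N] * E[X].
days = 50000
mean_ships, mean_tonnage = 12.0, 3.0
n_ships = rng.poisson(mean_ships, size=days)
totals = np.array([rng.exponential(mean_tonnage, n).sum() for n in n_ships])
print(totals.mean(), mean_ships * mean_tonnage)
```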
Probability Distribution Functions of Cosmological Lensing: Convergence, Shear, and Magnification
Takahashi, Ryuichi; Sato, Masanori; Hamana, Takashi
2011-01-01
We perform high resolution ray-tracing simulations to investigate probability distribution functions (PDFs) of lensing convergence, shear, and magnification on distant sources up to the redshift of $z_s=20$. We pay particular attention to the shot noise effect in $N$-body simulations by explicitly showing how it affects the variance of the convergence. We show that the convergence and magnification PDFs are closely related with each other via the approximate relation $\\mu=(1-\\kappa)^{-2}$, which can reproduce the behavior of PDFs surprisingly well up to the high magnification tail. The mean convergence is found to be systematically negative, rather than zero as often assumed, and is correlated with the convergence variance. We provide simple analytical formulae for the PDFs, which reproduce simulated PDFs reasonably well for a wide range of redshifts and smoothing sizes. As explicit applications of our ray-tracing simulations, we examine the strong lensing probability and the magnification effects on the lumi...
Monsoonal differences and probability distribution of PM(10) concentration.
Md Yusof, Noor Faizah Fitri; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Sansuddin, Nurulilyana; Ghazali, Nurul Adyani; Al Madhoun, Wesam
2010-04-01
There are many factors that influence PM(10) concentration in the atmosphere. This paper looks at PM(10) concentration in relation to the wet season (north-east monsoon) and dry season (south-west monsoon) in Seberang Perai, Malaysia from 2000 to 2004. PM(10) is expected to peak during the south-west monsoon, when the weather becomes dry, and this study confirms that the highest PM(10) concentrations from 2000 to 2004 were recorded during this monsoon. Two probability distributions, the Weibull and the lognormal, were used to model the PM(10) concentration, and the best model for prediction was selected based on performance indicators. The lognormal distribution represents the data better than the Weibull model for 2000, 2001, and 2002; however, for 2003 and 2004, the Weibull distribution performs better than the lognormal. The proposed distributions were successfully used to estimate exceedances and predict the return periods for subsequent years. PMID:19365611
Brownian Motion on a Sphere: Distribution of Solid Angles
Krishna, M. M. G.; Samuel, Joseph; Sinha, Supurna
2000-01-01
We study the diffusion of Brownian particles on the surface of a sphere and compute the distribution of solid angles enclosed by the diffusing particles. This function describes the distribution of geometric phases in two state quantum systems (or polarised light) undergoing random evolution. Our results are also relevant to recent experiments which observe the Brownian motion of molecules on curved surfaces like micelles and biological membranes. Our theoretical analysis agrees well with the...
Maximum-entropy probability distributions under Lp-norm constraints
Dolinar, S.
1991-01-01
Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and an unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real-valued) continuous random variables and for integer-valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight-line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed-form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer-valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer-valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Such understanding is useful in evaluating the performance of data compression schemes.
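For the unconstrained continuous case, the maximizing density under a pth-absolute-moment constraint is the generalized Gaussian $f(x) \propto \exp(-|x|^p)$ (with the scale set to one here for simplicity; rescaling shifts the entropy by the log of the scale). A minimal sketch of its closed-form entropy, using the standard gamma-function identities $Z = 2\Gamma(1/p)/p$ and $E|x|^p = 1/p$:

```python
import math

def maxent_entropy(p):
    """Differential entropy of f(x) = exp(-|x|^p) / Z, the maximum-entropy
    density among densities with the matching p-th absolute moment.
    Since Z = 2*Gamma(1/p)/p and E|x|^p = 1/p, H = ln(Z) + E|x|^p."""
    Z = 2.0 * math.gamma(1.0 / p) / p
    return math.log(Z) + 1.0 / p

# p = 2 recovers the Gaussian exp(-x^2) (variance 1/2); p = 1 the Laplace.
h_gauss = maxent_entropy(2.0)
h_laplace = maxent_entropy(1.0)
```

Both limiting cases match the textbook entropies (½ln(πe) for the Gaussian with variance ½, and ln 2 + 1 for the unit Laplace), consistent with the straight-line entropy-versus-log-norm relationship the abstract describes.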
Landslide Probability Assessment by the Derived Distributions Technique
Muñoz, E.; Ochoa, A.; Martínez, H.
2012-12-01
Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides detonated by rain, commonly known as "soil-slip", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in suction when a humid front enters, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model
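The derived-distribution idea (push random storm properties through a deterministic failure model to get the FOS distribution) can be sketched by Monte Carlo. The failure model and the parameters c and k below are invented for illustration only; the paper uses a physically based infiltration and slope-stability model:

```python
import random

random.seed(42)

def factor_of_safety(intensity, duration, c=10.0, k=0.08):
    """Toy deterministic failure model: FOS falls as total rain depth grows.
    c and k are hypothetical slope/soil parameters, not from the paper."""
    depth = intensity * duration          # total storm depth (mm)
    return c / (0.5 * c + k * depth)

# Rectangular-pulse storms: independent exponential intensity and duration.
n = 100_000
failures = 0
for _ in range(n):
    i = random.expovariate(1.0 / 20.0)    # mean intensity 20 mm/h
    d = random.expovariate(1.0 / 5.0)     # mean duration 5 h
    if factor_of_safety(i, d) < 1.0:      # FOS < 1 -> slope failure
        failures += 1
p_failure = failures / n
```

Given the per-storm failure probability and the Poisson storm arrival rate, the exceedance probability and return period follow directly, which is the attraction of the technique.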
Directory of Open Access Journals (Sweden)
Daniel Ting
2010-04-01
Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included, and whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
Probability Distribution and Projected Trends of Daily Precipitation in China
Institute of Scientific and Technical Information of China (English)
CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER
2013-01-01
Based on observed daily precipitation data of 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that except for the western Qinghai-Tibetan Plateau and South China, distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increases of maximum daily rainfall and of the longest non-precipitation period during the flood season in the aforementioned regions also show increasing trends of droughts and floods in the next 40 years.
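The kurtosis and skewness compared above are ordinary method-of-moments statistics, computable with the standard library alone (a generic sketch, with a made-up toy sample standing in for a precipitation series):

```python
def skew_kurtosis(xs):
    """Method-of-moments skewness and (Pearson) kurtosis of a sample.
    Kurtosis is 3 for a normal distribution; heavier tails push it higher."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2

# A right-skewed toy 'daily precipitation' sample (mm): one extreme wet
# day drives both skewness and kurtosis up, as in the projected extremes.
skew, kurt = skew_kurtosis([0.0, 1.0, 2.0, 1.0, 0.0, 30.0, 2.0, 1.0])
```

Rising skewness and kurtosis of daily totals are exactly the signatures the abstract ties to more frequent extremes.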
Institute of Scientific and Technical Information of China (English)
Anonymous
2009-01-01
[Example sentences] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably; perhaps", it indicates a strong likelihood, usually an affirmative inference or judgment based on the present situation.
Non-Gaussian probability distributions of solar wind fluctuations
Directory of Open Access Journals (Sweden)
E. Marsch
Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scales τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.
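The increment statistics described here are straightforward to compute from any sampled series. A generic sketch: the flatness (normalized fourth moment) of the increments is a common intermittency diagnostic, equal to 3 for Gaussian increments and larger for the spiky short-lag distributions the abstract reports:

```python
def increments(x, tau):
    """Field differences dx(tau) = x(t + tau) - x(t) for a sampled series."""
    return [x[t + tau] - x[t] for t in range(len(x) - tau)]

def flatness(dx):
    """Normalized fourth moment of the increments; 3 for a Gaussian,
    larger for spiky, narrow (intermittent) increment distributions."""
    n = len(dx)
    mean = sum(dx) / n
    m2 = sum((d - mean) ** 2 for d in dx) / n
    m4 = sum((d - mean) ** 4 for d in dx) / n
    return m4 / m2 ** 2
```

Plotting flatness(increments(x, tau)) against tau for a real series shows the growth toward small lags that signals intermittency.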
Spin-Orbit angle distribution and the origin of (mis)aligned hot Jupiters
Crida, Aurélien
2014-01-01
For 61 transiting hot Jupiters, the projection of the angle between the orbital plane and the stellar equator (called the spin-orbit angle) has been measured. For about half of them, a significant misalignment is detected, and retrograde planets have been observed. This challenges scenarios of the formation of hot Jupiters. In order to better constrain formation models, we relate the distribution of the real spin-orbit angle $\Psi$ to the projected one $\beta$. Then, a comparison with the observations is relevant. We analyse the geometry of the problem to link analytically the projected angle $\beta$ to the real spin-orbit angle $\Psi$. The distribution of $\Psi$ expected in various models is taken from the literature, or derived with a simplified model and Monte-Carlo simulations in the case of the disk-torquing mechanism. An easy formula to compute the probability density function (PDF) of $\beta$ knowing the PDF of $\Psi$ is provided. All models tested here look compatible with the observed distribution be...
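The projection step can be sketched with generic geometry: given two axes in 3D, the observer measures only the angle between their projections onto the sky plane. This is not the paper's analytic formula or its disk-torquing model, just a Monte-Carlo illustration of how a real-angle distribution maps to a projected-angle distribution:

```python
import math
import random

random.seed(1)

def random_unit_vector():
    """Isotropic direction: uniform in cos(theta) and in azimuth."""
    z = random.uniform(-1.0, 1.0)
    phi = random.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def projected_angle(v1, v2):
    """Angle between the projections of two vectors onto the sky plane
    (x, y), for an observer looking along the z axis."""
    a1 = math.atan2(v1[1], v1[0])
    a2 = math.atan2(v2[1], v2[0])
    d = abs(a1 - a2) % (2.0 * math.pi)
    return min(d, 2.0 * math.pi - d)

# Monte-Carlo sample of the projected-angle PDF for isotropic axis pairs.
betas = [projected_angle(random_unit_vector(), random_unit_vector())
         for _ in range(10_000)]
mean_beta = sum(betas) / len(betas)
```

Replacing the isotropic draw with axes separated by a model-specific $\Psi$ distribution turns this into the forward map from $\Psi$ to $\beta$ that the paper inverts analytically.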
Cosmological constraints from the convergence 1-point probability distribution
Patton, Kenneth; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J; Suchyta, Eric
2016-01-01
We examine the cosmological information available from the 1-point probability distribution (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the $\Omega_m$-$\sigma_8$ plane from the convergence PDF with $188\ \mathrm{arcmin}^2$ pixels compared to the cosmic shear power spectrum with an equivalent number of modes ($\ell < 886$). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape-measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of $2-3$, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Varshney, Lav R.; Goyal, Vivek K.
2012-09-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.
Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing
Rhim, Joong Bum; Goyal, Vivek K
2011-01-01
This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.
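The notion of mean Bayes risk error under prior quantization can be illustrated with a deliberately stripped-down toy: a single agent, no observation, 0-1 cost, and a uniform midpoint quantizer (the paper designs optimal and diverse quantizers, which behave differently):

```python
def bayes_risk(true_prior, used_prior):
    """0-1 cost, no observation: the MAP decision is made with used_prior,
    while the cost is incurred under the true prior of hypothesis H1."""
    decide_h1 = used_prior >= 0.5
    return (1.0 - true_prior) if decide_h1 else true_prior

def mean_bayes_risk_error(K, n_grid=10_000):
    """Mean excess risk over a uniform ensemble of priors when the prior is
    replaced by the midpoint of one of K uniform quantizer cells on [0, 1]."""
    total = 0.0
    for i in range(n_grid):
        p = (i + 0.5) / n_grid
        cell = min(int(p * K), K - 1)
        q = (cell + 0.5) / K              # midpoint representation
        total += bayes_risk(p, q) - min(p, 1.0 - p)
    return total / n_grid
```

Even this toy shows that cell placement matters as much as cell count: K = 2 puts a boundary exactly at the decision threshold 0.5 and incurs zero excess risk, while K = 3 straddles the threshold with one cell and does worse, which is the kind of effect that makes optimal quantizer design nontrivial.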
Gesture Recognition Based on the Probability Distribution of Arm Trajectories
Wan, Khairunizam; Sawada, Hideyuki
The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures are used as non-verbal media for humans to communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified by using a fuzzy technique. Experimental results show that the use of features extracted from arm trajectories works effectively for the recognition of dynamic human gestures, and gives a good performance in classifying various gesture patterns.
Probability distribution function for reorientations in Maier-Saupe potential
Sitnitsky, A. E.
2016-06-01
An exact analytic solution for the probability distribution function of the non-inertial rotational diffusion equation, i.e., of the Smoluchowski equation, in a symmetric Maier-Saupe uniaxial potential of mean torque is obtained via the confluent Heun function. Both the ordinary Maier-Saupe potential and the double-well one with variable barrier width are considered. Thus, the present article substantially extends the scope of potentials amenable to treatment by reducing the Smoluchowski equation to the confluent Heun equation. The solution is uniformly valid for any barrier height. We use it for the calculation of the mean first passage time. The higher eigenvalues for the relaxation decay modes in the case of the ordinary Maier-Saupe potential are also calculated. The results obtained are in full agreement with those of the approach developed by Coffey, Kalmykov, Déjardin and their coauthors over the whole range of barrier heights.
Seismic pulse propagation with constant Q and stable probability distributions
Mainardi, Francesco
2010-01-01
The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type) in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with index of stability determined by the order of the fractional time derivative in the evolution equation.
Characterizing the Lyman-$\alpha$ forest flux probability distribution function using Legendre polynomials
Cieplak, Agnieszka M
2016-01-01
The Lyman-$\alpha$ forest is a highly non-linear field with a lot of information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the $n$-th coefficient can be expressed as a linear combination of the first $n$ moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over the mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured, with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities.
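The expansion is easy to demonstrate for a density supported on [-1, 1], where the coefficients are $c_n = \frac{2n+1}{2}\int_{-1}^{1} f(x)\,P_n(x)\,dx$. A self-contained sketch (the flux PDF itself would first be mapped onto this interval); the example density is chosen so the exact coefficients are known:

```python
def legendre(n, x):
    """P_n(x) via the Bonnet recurrence (k+1)P_{k+1} = (2k+1)xP_k - kP_{k-1}."""
    p_prev, p_curr = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p_curr = p_curr, ((2 * k + 1) * x * p_curr - k * p_prev) / (k + 1)
    return p_curr

def legendre_coeff(f, n, steps=20_000):
    """c_n = (2n+1)/2 * integral of f * P_n over [-1, 1], midpoint rule."""
    h = 2.0 / steps
    total = 0.0
    for i in range(steps):
        x = -1.0 + (i + 0.5) * h
        total += f(x) * legendre(n, x)
    return 0.5 * (2 * n + 1) * total * h

# Example PDF f(x) = 3(1 - x^2)/4, which equals P_0/2 - P_2/2 exactly,
# so the coefficients should come out near [0.5, 0, -0.5, 0].
f = lambda x: 0.75 * (1.0 - x * x)
coeffs = [legendre_coeff(f, n) for n in range(4)]
```

Because each $c_n$ is a fixed linear combination of the first $n$ moments, a smooth PDF is compressed into a handful of well-measured numbers, which is the advantage the abstract argues for.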
Seto, Naoki
2014-01-01
We analytically discuss the probability distribution function (PDF) for inclinations of merging compact binaries whose gravitational waves are coherently detected by a network of ground-based interferometers. The PDF would be useful for studying prospects of (1) simultaneously detecting electromagnetic signals (such as gamma-ray bursts) associated with binary mergers and (2) statistically constraining the related theoretical models from the actual observational data of multi-messenger astronomy. Our approach is similar to Schutz (2011), but we explicitly include the dependence on the polarization angles of the binaries, based on the concise formulation given in Cutler and Flanagan (1994). We find that the overall profiles of the PDFs are similar for any network composed of the second-generation detectors (Advanced LIGO, Advanced Virgo, KAGRA, LIGO-India). For example, 5.1% of detected binaries would have inclination angles less than 10 degrees, with at most 0.1% differences between the potential networks. A perturb...
Insights from probability distribution functions of intensity maps
Breysse, Patrick C; Behroozi, Peter S; Dai, Liang; Kamionkowski, Marc
2016-01-01
In the next few years, intensity-mapping surveys that target lines such as CO, Ly$\alpha$, and CII stand to provide powerful probes of high-redshift astrophysics. However, these line emissions are highly non-Gaussian, and so the typical power-spectrum methods used to study these maps will leave out a significant amount of information. We propose a new statistic, the probability distribution of voxel intensities, which can access this extra information. Using a model of a CO intensity map at $z\sim3$ as an example, we demonstrate that this voxel intensity distribution (VID) provides substantial constraining power beyond what is obtainable from the power spectrum alone. We find that a future survey similar to the planned COMAP Full experiment could constrain the CO luminosity function to order $\sim10\%$. We also explore the effects of contamination from continuum emission, interloper lines, and gravitational lensing on our constraints and find that the VID statistic retains significant constraining power even ...
Characterization of cast metals with probability distribution functions
International Nuclear Information System (INIS)
Characterization of microstructure using a probability distribution function (PDF) provides a means for extracting useful information about material properties. In the extension of classical PDF methods developed in this research, material characteristics are evolved by propagating an initial PDF through time, using growth laws derived from consideration of heat flow and species diffusion, constrained by the Gibbs-Thomson law. A model is described here that allows for nucleation, followed by growth of nominally spherical grains according to a stable or unstable growth law. Results are presented for the final average grain size as a function of cooling rate for various nucleation parameters. In particular the authors show that the model describes linear variation of final grain size with the inverse cube root of cooling rate. Within a subset of casting parameters, the stable-to-unstable transition manifests itself as a bimodal distribution of final grain size. Calculations with the model are described for the liquid to epsilon phase transition in a plutonium 1 weight percent gallium alloy
Bhaduri, Susmita; Ghosh, Dipak
2016-08-01
There are numerous existing works investigating the dynamics of the particle production process in ultrarelativistic nuclear collisions. In the past, fluctuation of the spatial pattern has been analyzed in terms of the scaling behavior of voids, but the scaling behavior of voids in a fractal scenario has not been explored yet. In this work, we have analyzed the fractality of the void probability distribution with a completely different and rigorous method called visibility graph analysis, analyzing the void data produced from the fluctuation of pions in 32S-AgBr interactions at 200 GeV in pseudo-rapidity (η) and azimuthal angle (ϕ) space. The power of scale-freeness of the visibility graph, denoted by PSVG, is a measure of fractality which can be used as a quantitative parameter for assessing the state of a chaotic system. As the behavior of the particle production process depends on the target excitation, we can examine the void probability distribution in the event-wise fluctuation resulting from the high energy interaction for different degrees of target excitation with respect to the fractal scenario, and analyze the scaling behavior of the voids. From the analysis of the PSVG parameter, we have observed that the scaling behavior of the void probability distribution in multipion production changes with increasing target excitation. Since the visibility graph method is a classic method of complex network analysis that has been applied successfully to fractional Brownian motion (fBm) and fractional Gaussian noise (fGn) to measure the fractality and long-range dependence of a time series, we can quantitatively confirm that the fractal behavior of the void probability distribution in the particle production process depends on the target excitation.
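The natural visibility graph construction at the core of this analysis is compact to state: two samples are linked if the straight line between them clears every intermediate sample. A minimal O(n³) sketch of the standard criterion (estimating PSVG from the degree-distribution power law is a further step, omitted here):

```python
def visibility_edges(y):
    """Natural visibility graph of a series: nodes i < j are linked when
    every intermediate point c lies strictly below the straight line
    joining (i, y[i]) and (j, y[j])."""
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[c] < y[j] + (y[i] - y[j]) * (j - c) / (j - i)
                   for c in range(i + 1, j)):
                edges.append((i, j))
    return edges

def degree_sequence(y):
    """Node degrees; for long series their distribution (and its power-law
    exponent, the basis of PSVG) can then be estimated."""
    deg = [0] * len(y)
    for i, j in visibility_edges(y):
        deg[i] += 1
        deg[j] += 1
    return deg
```

Applied to the void-size series of each event class, the change of the degree-distribution exponent with target excitation is what the PSVG parameter quantifies.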
The Lyman alpha forest flux probability distribution at z>3
Calura, F; D'Odorico, V; Viel, M; Cristiani, S; Kim, T -S; Bolton, J S
2012-01-01
We present a measurement of the Lyman alpha flux probability distribution function (PDF) measured from a set of eight high resolution quasar spectra with emission redshifts at 3.3 < z < 3.8. We carefully study the effect of metal absorption lines on the shape of the PDF. Metals have a larger impact on the PDF measurements at lower redshift, where there are fewer Lyman alpha absorption lines. This may be explained by an increase in the number of metal lines which are blended with Lyman alpha absorption lines toward higher redshift, but may also be due to the presence of fewer metals in the intergalactic medium with increasing lookback time. We also provide a new measurement of the redshift evolution of the effective optical depth, tau_eff, at 2.8 < z < 3.6, and find no evidence for a deviation from a power law evolution in the log(tau_eff)-log(1+z) plane. The flux PDF measurements are furthermore of interest for studies of the thermal state of the intergalactic medium (IGM) at z ~ 3 . By comparing ...
Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar
Directory of Open Access Journals (Sweden)
Teng Long
2016-09-01
Full Text Available Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem, since the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are directly used for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fused result of each radar's estimation is fed to the extended Kalman filter (EKF) to finish the first filtering. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy is promoted dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.
Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar
Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le
2016-01-01
Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem, since the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are directly used for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fused result of each radar's estimation is fed to the extended Kalman filter (EKF) to finish the first filtering. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy is promoted dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method. PMID:27618058
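The SePDAF and EKF machinery is too involved for a snippet, but the recursive predict/update cycle both filtering stages rely on can be shown with a minimal scalar Kalman filter (a generic textbook sketch, not the paper's filter; all parameters are illustrative):

```python
def kalman_1d(measurements, q=0.01, r=1.0, x0=0.0, p0=10.0):
    """Scalar Kalman filter for a constant-state model: process noise q,
    measurement noise r, initial state x0 with variance p0."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: variance grows by process noise
        gain = p / (p + r)         # Kalman gain
        x = x + gain * (z - x)     # update: move toward the measurement
        p = (1.0 - gain) * p
        estimates.append(x)
    return estimates
```

In the paper's scheme, the first (EKF) pass produces exactly this kind of state estimate, which then serves as the prior for the second, data-association pass that resolves the ambiguous angles.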
Tools for Bramwell-Holdsworth-Pinton Probability Distribution
Mirela Danubianu; Tiberiu Socaciu; Ioan Maxim
2009-01-01
This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.
Institute of Scientific and Technical Information of China (English)
LI Hai-Xia; CHENG Chuan-Fu
2011-01-01
We study the light scattering of an orthogonal anisotropic rough surface with a secondary most-probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented along a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct a system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.
Tools for Bramwell-Holdsworth-Pinton Probability Distribution
Directory of Open Access Journals (Sweden)
Mirela Danubianu
2009-01-01
Full Text Available This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.
Measuring Robustness of Timetables in Stations using a Probability Distribution
DEFF Research Database (Denmark)
Jensen, Lars Wittrup; Landex, Alex
vulnerable to delays (less robust) while the other will be less vulnerable (more robust), but this cannot be measured by the above methods. In the light of this, the article will describe a new method where the complexity of a given station with a given timetable can be calculated based on a probability...
Conant, Darcy Lynn
2013-01-01
Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…
Probability distributions for measures of placental shape and morphology
International Nuclear Information System (INIS)
Birthweight at delivery is a standard cumulative measure of placental growth, but is a crude summary of other placental characteristics, such as, e.g., the chorionic plate size, and the shape and position of the umbilical cord insertion. Distributions of such measures across a cohort reveal information about the developmental history of the chorionic plate which is unavailable from an analysis based solely on the mean and standard deviation. Various measures were determined from digitized images of chorionic plates obtained from the pregnancy, infection, and nutrition study, a prospective cohort study of preterm birth in central North Carolina between 2002 and 2004. Centroids (geometric centers) and umbilical cord insertions were taken directly from the images. Chorionic plate outlines were obtained from an interpolation based on a Fourier series, while eccentricity (of the best-fit ellipse), skewness, and kurtosis were determined from the method of moments. Histograms of each variable were compared against the normal, lognormal, and Lévy distributions. Only a single measure (eccentricity) followed a normal distribution. All others followed lognormal or ‘heavy-tailed’ distributions for moderate to extreme deviations from the mean, where the relative likelihood far exceeded those of a normal distribution. (paper)
Beta-hypergeometric probability distribution on symmetric matrices
Hassairi, Abdelhamid; Masmoudi, Mouna
2012-01-01
Some remarkable properties of the beta distribution are based on relations involving independence between beta random variables such that a parameter of one among them is the sum of the parameters of another (see (1.1) and (1.2) below). Asci, Letac and Piccioni [6] have used the real beta-hypergeometric distribution on the real line to give a general version of these properties without the condition on the parameters. In the present paper, we extend the properties of the real beta to the beta...
Scaling Properties of the Probability Distribution of Lattice Gribov Copies
Lokhov, A Y; Roiesnel, C
2005-01-01
We study the problem of the Landau gauge fixing in the case of the SU(2) lattice gauge theory. We show that the probability to find a lattice Gribov copy increases considerably when the physical size of the lattice exceeds some critical value $\\approx2.75/\\sqrt{\\sigma}$, almost independent of the lattice spacing. The impact of the choice of the copy on Green functions is presented. We confirm that the ghost propagator depends on the choice of the copy whereas the gluon propagator is insensitive to it (within present statistical errors). The gluonic three-point functions are also insensitive to it. Finally we show that gauge copies which have the same value of the minimisation functional ($\\int d^4x (A^a_\\mu)^2$) are equivalent, up to a global gauge transformation, and yield the same Green functions.
Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals
Indian Academy of Sciences (India)
K R Parthasarathy
2007-11-01
By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.
Disoriented Chiral Condensates, Pion Probability Distributions and Parallels with Disordered System
Mekjian, A. Z.
1999-01-01
A general expression is discussed for pion probability distributions coming from relativistic heavy ion collisions. The general expression contains as limits: 1) the disoriented chiral condensate (DCC), 2) the negative binomial distribution and Pearson type III distribution, 3) a binomial or Gaussian result, and 4) a Poisson distribution. This general expression approximates other distributions such as a signal-to-noise laser distribution. Similarities and differences of the DCC distribution ...
Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf
2016-04-01
In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
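The fit-and-rank step described above can be sketched with scipy; the magnitudes below are synthetic, and the two-/three-parameter Weibull and Frechet candidates stand in for the study's Easyfit/Matlab workflow:

```python
# Sketch: fit two Weibull variants and a Frechet model to synthetic "magnitude"
# data, then rank them with the K-S goodness-of-fit statistic.
# The data are simulated, not the North Anatolian catalogue.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mags = 6.0 + stats.weibull_min.rvs(1.5, scale=0.8, size=200, random_state=rng)

candidates = {
    "weibull_2p": stats.weibull_min.fit(mags, floc=6.0),  # location fixed at M >= 6
    "weibull_3p": stats.weibull_min.fit(mags),            # shape, loc, scale all free
    "frechet":    stats.invweibull.fit(mags, floc=0.0),   # Frechet = inverse Weibull
}

results = {}
for name, params in candidates.items():
    dist = stats.weibull_min if name.startswith("weibull") else stats.invweibull
    ks = stats.kstest(mags, dist.cdf, args=params)
    results[name] = ks.statistic

best = min(results, key=results.get)  # smallest K-S statistic = best fit
```

A smaller K-S statistic (or a larger K-S p-value) marks the preferred model, mirroring the paper's selection criterion.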
Calculation of ruin probabilities for a dense class of heavy tailed distributions
DEFF Research Database (Denmark)
Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady
2015-01-01
of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite...
Generalized parton distributions and wide-angle exclusive scattering
Kroll, P
2004-01-01
The handbag mechanism for wide-angle exclusive scattering reactions is discussed and compared with other theoretical approaches. Its application to Compton scattering, meson photoproduction and two-photon annihilations into pairs of hadrons is reviewed.
The probability distribution of the predicted CFM-induced ozone depletion. [Chlorofluoromethane
Ehhalt, D. H.; Chang, J. S.; Bulter, D. M.
1979-01-01
It is argued from the central limit theorem that the uncertainty in model-predicted changes of the ozone column density is best represented by a normal probability density distribution. This conclusion is validated by comparison with a probability distribution generated by a Monte Carlo technique. In the case of the CFM-induced ozone depletion, and based on the estimated uncertainties in the reaction rate coefficients alone, the relative mean standard deviation of this normal distribution is estimated to be 0.29.
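The central-limit argument can be illustrated with a toy Monte Carlo: if the predicted change behaves roughly like a product of many independently uncertain rate coefficients, its logarithm is a sum of independent terms and tends to normality. All numbers below are illustrative assumptions, not the actual CFM reaction-rate uncertainties:

```python
# Toy central-limit check: a "prediction" built as a product of independent
# uncertain factors has an approximately normal (here, lognormal) distribution.
import numpy as np

rng = np.random.default_rng(1)
n_rates, n_samples = 12, 100_000
# multiplicative uncertainty factor for each rate coefficient (illustrative)
factors = rng.uniform(0.85, 1.15, size=(n_samples, n_rates))
prediction = factors.prod(axis=1)          # toy stand-in for the model output
log_pred = np.log(prediction)

# by the central limit theorem, log(prediction) should be close to normal
skew = ((log_pred - log_pred.mean()) ** 3).mean() / log_pred.std() ** 3
rel_sd = prediction.std() / prediction.mean()
```

The near-zero skewness of `log_pred` is the Monte Carlo signature of the normal limit the abstract describes.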
Constructing the probability distribution function for the total capacity of a power system
Energy Technology Data Exchange (ETDEWEB)
Vasin, V.P.; Prokhorenko, V.I.
1980-01-01
The difficulties involved in constructing the probability distribution function for the total capacity of a power system consisting of numerous power plants are discussed. A method is considered for the approximate determination of such a function by a Monte Carlo method and by exact calculation based on special recursion formulas on a particular grid of argument values. It is shown that there may be significant deviations between the true probability distribution and a normal distribution.
A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures
Institute of Scientific and Technical Information of China (English)
李典庆; 张圣坤; 唐文勇
2003-01-01
There exists model uncertainty in the probability of detection for inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new POD model is proposed for the updating of the crack size distribution. Furthermore, a theoretical derivation shows that most existing POD models are special cases of the new model. The least squares method is adopted for determining the values of the parameters in the new POD model. The new model is also compared with the existing POD models, and the results indicate that it fits the inspection data better. This new POD model is then applied to the analysis of crack size updating for offshore structures. The Bayesian updating method is used to analyze the effect of POD models on the posterior distribution of crack size. The results show that different POD models generate different posterior distributions of crack size for offshore structures.
Some possible q-exponential type probability distribution in the non-extensive statistical physics
Chung, Won Sang
2016-08-01
In this paper, we present two exponential type probability distributions which are different from Tsallis's case, which we call Type I: one given by $p_i = \frac{1}{Z_q}[e_q(E_i)]^{-\beta}$ (Type IIA) and another given by $p_i = \frac{1}{Z_q}[e_q(-\beta)]^{E_i}$ (Type IIIA). Starting with the Boltzmann-Gibbs entropy, we obtain the different probability distributions by using the Kolmogorov-Nagumo average for the microstate energies. We present the first-order differential equations related to Types I, II and III. For the three types of probability distributions, we discuss the quantum harmonic oscillator, the two-level problem and the spin-1/2 paramagnet.
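The two forms above can be evaluated numerically, assuming the standard Tsallis q-exponential $e_q(x) = [1+(1-q)x]^{1/(1-q)}$ (which reduces to $e^x$ as $q \to 1$); the energy levels and parameter values below are toy choices, not the paper's:

```python
# Sketch of the Type IIA and Type IIIA probability forms built from the Tsallis
# q-exponential; Z_q is obtained by normalizing over a toy discrete spectrum.
import numpy as np

def eq(x, q):
    """Tsallis q-exponential e_q(x) = [1+(1-q)x]^(1/(1-q)); e_q -> exp as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    # the cutoff e_q(x) = 0 applies where 1 + (1-q)x <= 0
    return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

def prob_iia(E, beta, q):
    w = eq(E, q) ** (-beta)        # p_i proportional to [e_q(E_i)]^(-beta)
    return w / w.sum()

def prob_iiia(E, beta, q):
    w = eq(-beta, q) ** E          # p_i proportional to [e_q(-beta)]^(E_i)
    return w / w.sum()

E = np.arange(5, dtype=float)      # toy energy levels E_i = 0, 1, ..., 4
p_typeIIA = prob_iia(E, beta=1.0, q=0.9)
p_typeIIIA = prob_iiia(E, beta=1.0, q=0.9)
```

Both forms are normalized and monotonically decreasing in energy for these parameters, as expected of a thermal-like occupation.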
Average Consensus Analysis of Distributed Inference with Uncertain Markovian Transition Probability
Won Il Kim; Rong Xiong; Qiuguo Zhu; Jun Wu
2013-01-01
The average consensus problem of distributed inference in a wireless sensor network under a Markovian communication topology with uncertain transition probability is studied. A sufficient condition for average consensus of a linear distributed inference algorithm is presented. Based on linear matrix inequalities and numerical optimization, a design method for fast distributed inference is provided.
A measure of mutual divergence among a number of probability distributions
Directory of Open Access Journals (Sweden)
J. N. Kapur
1987-01-01
major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained of the mutual divergence among two or more probability distributions.
Noise figure and photon probability distribution in Coherent Anti-Stokes Raman Scattering (CARS)
Dimitropoulos, D.; Solli, D. R.; Claps, R.; Jalali, B.
2006-01-01
The noise figure and photon probability distribution are calculated for coherent anti-Stokes Raman scattering (CARS), where an anti-Stokes signal is converted to Stokes. We find that the minimum noise figure is ~3 dB.
Energy Technology Data Exchange (ETDEWEB)
Nakatsuka, Takao [Okayama Shoka University, Laboratory of Information Science, Okayama (Japan); Okei, Kazuhide [Kawasaki Medical School, Dept. of Information Sciences, Kurashiki (Japan); Iyono, Atsushi [Okayama university of Science, Dept. of Fundamental Science, Faculty of Science, Okayama (Japan); Bielajew, Alex F. [Univ. of Michigan, Dept. Nuclear Engineering and Radiological Sciences, Ann Arbor, MI (United States)
2015-12-15
Simultaneous distribution between the deflection angle and the lateral displacement of fast charged particles traversing through matter is derived by applying numerical inverse Fourier transforms on the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show the simultaneous Gaussian distribution at the region of both small deflection angle and lateral displacement, though they show the characteristic contour patterns of probability density specific to the single and the double scatterings at the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and the efficiency of experimental analyses and simulation studies relating to charged particle transports. (orig.)
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2012-01-01
to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known the resulting...... uncertainty can be calculated. The possibility approach is particular well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straight forward application of fuzzy arithmetic in general results in overestimation of interval...
Probability distributions in conservative energy exchange models of multiple interacting agents
Energy Technology Data Exchange (ETDEWEB)
Scafetta, Nicola [Department of Physics, Duke University, Durham, NC 27708 (United States); West, Bruce J [Department of Physics, Duke University, Durham, NC 27708 (United States)
2007-02-14
Herein we study energy exchange models of multiple interacting agents that conserve energy in each interaction. The models differ regarding the rules that regulate the energy exchange and boundary effects. We find a variety of stochastic behaviours that manifest energy equilibrium probability distributions of different types and interaction rules that yield not only the exponential distributions such as the familiar Maxwell-Boltzmann-Gibbs distribution of an elastically colliding ideal particle gas, but also uniform distributions, truncated exponential distributions, Gaussian distributions, Gamma distributions, inverse power law distributions, mixed exponential and inverse power law distributions, and evolving distributions. This wide variety of distributions should be of value in determining the underlying mechanisms generating the statistical properties of complex phenomena including those to be found in complex chemical reactions.
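One of the simplest members of this model family can be simulated in a few lines: two randomly chosen agents pool and uniformly redistribute their energy, which conserves total energy at every step and drives the single-agent distribution toward the exponential (Boltzmann-Gibbs) limit named above. The agent count and step count are illustrative choices:

```python
# Toy conservative energy-exchange model: random pairwise pooling with a
# uniform split. The equilibrium single-agent energy distribution approaches
# the exponential (Boltzmann-Gibbs) form with the same mean energy.
import numpy as np

rng = np.random.default_rng(3)
n_agents, steps = 1000, 100_000
energy = np.ones(n_agents)            # start with equal energies, mean = 1

for _ in range(steps):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    pool = energy[i] + energy[j]      # each interaction conserves energy
    r = rng.random()
    energy[i], energy[j] = r * pool, (1 - r) * pool
```

For an exponential distribution with mean 1, the fraction of agents below the mean equilibrates near 1 - e^(-1) ≈ 0.63; changing the split rule (e.g. fixed fractions, saving propensities) is what generates the other distribution types listed in the abstract.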
Predicting the probability of slip in gait: methodology and distribution study.
Gragg, Jared; Yang, James
2016-01-01
The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
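The single-integral form and its trapezoidal evaluation can be sketched as follows: a slip occurs when the required friction R exceeds the available friction A, so P(slip) = P(A < R) = ∫ f_R(r) F_A(r) dr. The friction distributions below are illustrative normal choices (used here only because they admit a closed-form cross-check), not the paper's gait data:

```python
# Trapezoidal evaluation of the single-integral slip probability
#   P(slip) = integral of f_required(r) * F_available(r) dr
# with hypothetical normal friction distributions for illustration.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

avail = stats.norm(loc=0.50, scale=0.08)   # available friction (hypothetical)
req = stats.norm(loc=0.22, scale=0.05)     # required friction (hypothetical)

r = np.linspace(0.0, 1.0, 2001)
p_slip = trapezoid(req.pdf(r) * avail.cdf(r), r)

# cross-check: for two normals, A - R is normal, so P(A < R) has a closed form
exact = stats.norm.cdf(0.0, loc=0.50 - 0.22, scale=np.hypot(0.08, 0.05))
```

Swapping `avail` or `req` for any other scipy distribution object leaves the integral unchanged, which is the point of the method: it does not require normality of either friction.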
Mahanti, P.; Robinson, M. S.; Boyd, A. K.
2013-12-01
Craters ~20-km diameter and above significantly shaped the lunar landscape, and the statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was
Energy Technology Data Exchange (ETDEWEB)
Koltsov, A. V., E-mail: koltsov@x4u.lebedev.ru; Serov, A. V., E-mail: serov@x4u.lebedev.ru [Russian Academy of Sciences, Lebedev Physical Institute (Russian Federation)
2011-10-15
The spatial distributions of transition radiation from relativistic particles entering and exiting the edge of a dihedral angle formed by perfectly conducting flat surfaces have been investigated. The angular distributions of the radiation intensity in dihedral angles with various opening angles have been calculated. The angular distributions of forward radiation (when the particle exits the dihedral angle) and backward radiation (when the particle enters the dihedral angle) are shown to differ significantly.
A Class of Chaotic Sequences with Gauss Probability Distribution for Radar Mask Jamming
Institute of Scientific and Technical Information of China (English)
Ni-Ni Rao; Yu-Chuan Huang; Bin Liu
2007-01-01
A simple generation approach for chaotic sequences with a Gauss probability distribution is proposed. Theoretical analysis and simulation based on the Logistic chaotic model show that the approach is feasible and effective. The distribution characteristics of the novel chaotic sequence are comparable to those of the standard normal distribution. Its mean and variance can be changed to the desired values. The novel sequences also have good randomness. The applications for radar mask jamming are analyzed.
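The abstract does not spell out the construction, so the following is one standard route (an assumption here, not necessarily the authors' scheme) from logistic-map iterates to approximately Gaussian values: the fully chaotic map x → 4x(1-x) has the arcsine invariant density, so u = (2/π)·arcsin(√x) is uniform on (0,1), and summing blocks of 12 uniforms gives the classic central-limit approximation to N(0,1):

```python
# Logistic-map chaos -> uniform (via the arcsine-density conjugacy) ->
# approximately Gaussian values (via the 12-uniform central-limit sum).
import numpy as np

def logistic_orbit(x0, n, burn=1000):
    x = x0
    for _ in range(burn):                  # discard the transient
        x = 4.0 * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        out[i] = x
    return out

x = logistic_orbit(0.1234, 120_000)
u = (2.0 / np.pi) * np.arcsin(np.sqrt(x))  # uniform on (0, 1)
g = u.reshape(-1, 12).sum(axis=1) - 6.0    # ~N(0, 1): 12 uniforms, variance 12/12
```

Successive tent-map (and hence transformed logistic) iterates are uncorrelated, which is why the block sums come out with unit variance; a desired mean m and standard deviation s are then obtained as `m + s * g`.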
Score distributions of gapped multiple sequence alignments down to the low-probability tail
Fieth, Pascal; Hartmann, Alexander K.
2016-08-01
Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via the knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.
Dutta, S.; Chan, A. H.; Oh, C. H.
2012-08-01
This paper studies the multiplicity distribution of hadrons produced in p-p collisions at 0.9 and 2.36 TeV using ALICE as a detector. The multiplicity distribution exhibits enhanced void probability and is also found to satisfy the void probability scaling. The scaling of χ with $\bar{n}\bar{k}_2$ is studied using the generalized hypergeometric model. The variation of the parameter "a" of the hypergeometric model with energy and type of events is also studied. The parameter "a" distinguishes between various theoretical models, e.g. Lorentz/Catalan, negative binomial, geometric distribution, etc. Finally a comparison is made with p-$\bar{p}$ collisions at 200, 546 and 900 GeV. It is observed that, for both p-p and p-$\bar{p}$ data, the value of "a" decreases with increasing collision energy and approaches the upper bound, the NB model, of the void probability scaling.
Institute of Scientific and Technical Information of China (English)
Xian-min Geng; Shu-chen Wan
2011-01-01
The compound negative binomial model introduced in this paper is a discrete-time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of $T$, $U(T-1)$, $|U(T)|$ and $\inf_{0 \le n < T_1} U(n)$ (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin and the maximal deficit from ruin to recovery) and the distributions of some actuarial random vectors.
Impact on the Extreme Value of Ice Thickness of Conductors From Probability Distribution Models
Gao, Ke-Li; Yang, Jia-Lun; Zhu, Kuan-Jun; Zhang, Feng; Cheng, Yong-Feng; Liu, Bin; Liu, Cao-Lan; Gao, Zheng-Xu
The probability distribution model can affect the extreme value of standard ice thickness on conductors of transmission lines for different return periods. This paper discusses the impact of three probability distribution models on the calculation of standard ice thickness: the Pearson III distribution model (P-III), the generalized extreme value distribution model (GEV), and the generalized Pareto distribution model (GPD). Historic icing data have been collected on Lvcongpo Mountain in Hubei province, including icing data in the north-south direction, in the west-east direction, the larger data of the two directions, and the smaller data of the two directions. The analysis results indicate that GPD is more suitable than P-III and GEV for the icing data collected from Lvcongpo Mountain, which is helpful for reasonably determining the design ice thickness of conductors of transmission lines.
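The model comparison and return-period calculation can be sketched with scipy; the "ice thickness" values below are synthetic draws, not the Lvcongpo observations, and the 50-year return period is an arbitrary example:

```python
# Fit P-III, GEV, and GPD models to synthetic annual-maximum ice thickness and
# compare the extreme value (return level) each model implies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
ice = stats.genextreme.rvs(c=-0.1, loc=10.0, scale=3.0, size=60, random_state=rng)

dists = {"P-III": stats.pearson3, "GEV": stats.genextreme, "GPD": stats.genpareto}
fits = {name: dist.fit(ice) for name, dist in dists.items()}

# T-year return value from each fitted model: x with F(x) = 1 - 1/T, here T = 50
return_50 = {name: dists[name].ppf(1 - 1 / 50, *fits[name]) for name in dists}
```

The spread among the three `return_50` values is exactly the model-dependence of the design ice thickness that the abstract is concerned with.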
Probability distributions for directed polymers in random media with correlated noise.
Chu, Sherry; Kardar, Mehran
2016-07-01
The probability distribution for the free energy of directed polymers in random media (DPRM) with uncorrelated noise in d=1+1 dimensions satisfies the Tracy-Widom distribution. We inquire if and how this universal distribution is modified in the presence of spatially correlated noise. The width of the distribution scales as the DPRM length to an exponent β, in good (but not full) agreement with previous renormalization group and numerical results. The scaled probability is well described by the Tracy-Widom form for uncorrelated noise, but becomes symmetric with increasing correlation exponent. We thus find a class of distributions that continuously interpolates between Tracy-Widom and Gaussian forms. PMID:27575059
Fitting the distribution of dry and wet spells with alternative probability models
Deni, Sayang Mohd; Jemain, Abdul Aziz
2009-06-01
The development of rainfall occurrence models is of great importance not only for data-generation purposes, but also in providing informative resources for future advancements in water-related sectors, such as water resource management and the hydrological and agricultural sectors. Various kinds of probability models have been fitted to sequences of dry (wet) days by previous researchers in the field. Building on these models, the present study proposes three types of mixture distributions, namely, the mixture of two log series distributions (LSD), the mixture of the log series and Poisson distributions (MLPD), and the mixture of the log series and geometric distributions (MLGD), as alternative probability models to describe the distribution of dry (wet) spells in daily rainfall events. In order to test the performance of the proposed new models against the nine existing probability models, 54 data sets which had been published by several authors were reanalyzed in this study. In addition, new data sets of daily observations from six selected rainfall stations in Peninsular Malaysia for the period 1975-2004 were used. In determining the best-fitting distribution to describe the observed distribution of dry (wet) spells, a chi-square goodness-of-fit test was used. The results revealed that the newly proposed MLGD and MLPD showed a better fit, as more than half of the data sets were successfully fitted for the distribution of dry and wet spells. However, existing models, such as the truncated negative binomial and the modified LSD, were also among the successful probability models for representing the sequence of dry (wet) days in daily rainfall occurrence.
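The chi-square goodness-of-fit step can be sketched for the simplest candidate, a plain geometric model for dry-spell lengths; the spell data are synthetic, and the mixture models of the study would be tested the same way, just with more fitted parameters:

```python
# Chi-square goodness-of-fit of a geometric model for dry-spell lengths,
# with small-count classes pooled into a tail class (>= 8 days).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
spells = rng.geometric(p=0.3, size=500)        # synthetic dry-spell lengths (days)

p_hat = 1.0 / spells.mean()                    # MLE of the geometric parameter

# observed vs expected counts for lengths 1..7 plus a pooled tail class >= 8
ks = np.arange(1, 8)
observed = np.array([(spells == k).sum() for k in ks] + [(spells >= 8).sum()])
pmf = p_hat * (1 - p_hat) ** (ks - 1)          # P(X = k) = p (1-p)^(k-1)
expected = 500 * np.append(pmf, (1 - p_hat) ** 7)   # tail mass P(X >= 8)

chi2 = ((observed - expected) ** 2 / expected).sum()
dof = len(observed) - 1 - 1                    # classes - 1 - fitted parameters
p_value = stats.chi2.sf(chi2, dof)
```

A large p-value means the candidate distribution cannot be rejected for this data set; in the study, each of the twelve candidate models is scored this way on every data set.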
DEFF Research Database (Denmark)
Helles, Glennie; Fonseca, Rasmus
2009-01-01
Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and help reduce the search space significantly. However, random coil segments ...
Directory of Open Access Journals (Sweden)
Diogo de Carvalho Bezerra
2015-12-01
Full Text Available ABSTRACT Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.
Xu, Y.; Tan, L.; Cao, S. L.; Wang, Y. C.; Meng, G.; Qu, W. S.
2015-01-01
The influence of the blade angle distribution along the leading edge on the cavitation performance of centrifugal pumps is analysed in the present paper. Three sets of blade angle distributions along the leading edge for three blade inlet angles are chosen to design nine centrifugal pump impellers. The RNG k-epsilon turbulence model and the Zwart-Gerber-Belamri cavitation model are employed to simulate the cavitation flows in centrifugal pumps with the different impellers and the same volute. The numerical results are compared with the experimental data, and the comparison proves that the numerical simulation can accurately predict the cavitation performance of centrifugal pumps. On the basis of the numerical simulations, the variations of pump head with pump inlet pressure, and the flow details in the centrifugal pump, are revealed to demonstrate the influence of the blade angle distribution along the leading edge on the cavitation performance of centrifugal pumps.
Gulev, Sergey; Tilinina, Natalia; Belyaev, Konstantin
2015-04-01
Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using the framework of probability distributions for sensible and latent heat fluxes. For the approximation of probability distributions and the estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g. 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of probability distributions of surface fluxes and extreme surface flux values between different reanalyses are found in the western boundary current extension regions and at high latitudes, while the highest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions.
Importance measures for imprecise probability distributions and their sparse grid solutions
Institute of Scientific and Technical Information of China (English)
WANG; Pan; LU; ZhenZhou; CHENG; Lei
2013-01-01
For the imprecise probability distribution of a structural system, the variance-based importance measures (IMs) of the inputs are investigated, and three IMs are defined for the cases of random distribution parameters, interval distribution parameters, and a mixture of those two types of distribution parameters. The defined IMs can reflect the influence of the inputs on the output of the structural system with imprecise distribution parameters, respectively. Due to the large computational cost of the variance-based IMs, the sparse grid method is employed in this work to compute the variance-based IMs at each reference point of the distribution parameters. For the three imprecise distribution parameter cases, the sparse grid method and the combination of the sparse grid method with a genetic algorithm are used to compute the defined IMs. Numerical and engineering examples are employed to demonstrate the rationality of the defined IMs and the efficiency of the applied methods.
The probability distribution function for the sum of squares of independent random variables
Fateev, Yury; Dmitriev, Dmitry; Tyapkin, Valery; Kremez, Nikolai; Shaidurov, Vladimir
2016-08-01
In the present paper, the probability distribution function is derived for the sum of squares of random variables for nonzero expectations. This distribution function enables one to develop an efficient one-step algorithm for phase ambiguity resolution when determining the spatial orientation from signals of satellite radio-navigation systems. Threshold values for rejecting false solutions and statistical properties of the algorithm are obtained.
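For the special case of unit variances, the distribution derived above is the classical noncentral chi-square law: for independent X_i ~ N(μ_i, 1), the sum of squares has k degrees of freedom and noncentrality λ = Σμ_i². A quick Monte Carlo cross-check against scipy's implementation (the μ_i values are arbitrary examples):

```python
# Monte Carlo check: sum of squares of normals with nonzero means follows the
# noncentral chi-square distribution ncx2(df=k, nc=sum(mu_i^2)).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
mu = np.array([1.0, -0.5, 2.0])                  # nonzero expectations
k, lam = len(mu), float((mu ** 2).sum())         # dof and noncentrality

samples = ((rng.standard_normal((100_000, k)) + mu) ** 2).sum(axis=1)

x = 8.0
empirical = (samples <= x).mean()                # Monte Carlo CDF at x
analytic = stats.ncx2.cdf(x, df=k, nc=lam)
```

In the phase-ambiguity application, thresholds for rejecting false solutions correspond to quantiles of this distribution, e.g. `stats.ncx2.ppf(0.99, df=k, nc=lam)`.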
Analysis of low probability of intercept (LPI) radar signals using the Wigner Distribution
Gau, Jen-Yu
2002-01-01
Approved for public release, distribution is unlimited. The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code,...
The exact probability distribution of the rank product statistics for replicated experiments
Eisinga, R.N.; Breitling, R.; Heskes, T.M.
2013-01-01
The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product ...
Schürmann, Thomas
2015-01-01
We consider the nonparametric estimation problem of continuous probability distribution functions. For the integrated mean square error we provide the statistic corresponding to the best invariant estimator proposed by Aggarwal (1955) and Ferguson (1967). The table of critical values is computed and a numerical power comparison of the statistic with the traditional Cramér-von Mises statistic is done for several representative distributions.
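For reference, the classical one-sample Cramér-von Mises statistic against which the paper compares can be computed directly from its defining sum; a self-contained sketch (sample size and the two null hypotheses are illustrative, not the paper's test cases):

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def cramer_von_mises(sample, cdf):
    """Classical one-sample Cramer-von Mises statistic:
    W^2 = 1/(12n) + sum_i (F(x_(i)) - (2i-1)/(2n))^2."""
    xs = sorted(sample)
    n = len(xs)
    return 1.0 / (12 * n) + sum(
        (cdf(x) - (2 * i - 1) / (2 * n)) ** 2 for i, x in enumerate(xs, start=1)
    )

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(500)]
w2_good = cramer_von_mises(data, normal_cdf)                      # H0 true: small
w2_bad = cramer_von_mises(data, lambda x: normal_cdf(x, mu=2.0))  # wrong null: large
```

A badly misspecified null inflates the statistic by orders of magnitude.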
Various Models for Pion Probability Distributions from Heavy-Ion Collisions
Mekjian, A. Z.; Schlei, B. R.; Strottman, D.
1998-01-01
Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparison of the pion laser model and Bos...
Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.
2009-01-01
Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t
Institute of Scientific and Technical Information of China (English)
LU Wei-ji; CUI Wei
2001-01-01
In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value, and the other kind is proved to have no extreme values.
Measurements of gas hydrate formation probability distributions on a quasi-free water droplet
Maeda, Nobuo
2014-06-01
A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. It would ideally be desirable to be able to measure gas hydrate formation probability distributions of a single water droplet or mist that is freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, and wall-wetting hydrophobic liquid to avoid contact of a water droplet with the solid walls. Here we report the development of a second generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet which sits on a perfluorocarbon oil in a container that is coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.
BOOMSMA, A; MOLENAAR, IW
1994-01-01
This article reviews four packages for MS-DOS personal computers (Electronic Tables, PCalc, StaTable, and STAT-POWER), producing cumulative probabilities and quantiles for the most common statistical distributions. Some of them provide other results such as mathematical functions, confidence bounds,
Criticality of the net-baryon number probability distribution at finite density
Kenji Morita; Bengt Friman; Krzysztof Redlich
2014-01-01
We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\\mu/T
Criticality of the net-baryon number probability distribution at finite density
Morita, Kenji; Friman, Bengt; Redlich, Krzysztof
2015-01-01
We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ , at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T
Directory of Open Access Journals (Sweden)
A. B. Levina
2016-03-01
Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects; accordingly, error detection codes allow detecting such errors. There are two classes of error detecting codes: classical codes and security-oriented codes. The classical codes detect a high percentage of errors; however, they have a high probability of missing an error introduced by algebraic manipulation. In contrast, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code when errors are injected into the encoding device. Likewise, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with lower computational complexity and a low probability of masking provide the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown in the paper that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and average values of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach how to measure the error masking
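For a linear code, an additive error pattern is masked exactly when it is itself a nonzero codeword, so under uniformly distributed nonzero errors the masking probability is (2^k − 1)/(2^n − 1). A toy enumeration of this fact (the [4,3] even-parity code below is a hypothetical example, not a code from the paper):

```python
from itertools import product

# Toy [n=4, k=3] even-parity code: codewords are all 4-bit words of even weight.
n, k = 4, 3
codewords = {bits for bits in product((0, 1), repeat=n) if sum(bits) % 2 == 0}

# For a linear code, an additive error pattern e is masked (undetected)
# exactly when e is itself a nonzero codeword.
masked = [e for e in product((0, 1), repeat=n) if any(e) and e in codewords]

q_enum = len(masked) / (2 ** n - 1)       # uniform nonzero errors, by enumeration
q_formula = (2 ** k - 1) / (2 ** n - 1)   # (2^k - 1) / (2^n - 1)
```

Both routes give 7/15 for this code; security-oriented designs aim to flatten the masking probability over error patterns rather than minimize this uniform average.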
Pitch Angle Distribution Evolution of Energetic Electrons by Whistler-Mode Chorus
Institute of Scientific and Technical Information of China (English)
ZHENG Hui-Nan; SU Zhen-Peng; XIONG Ming
2008-01-01
We develop a two-dimensional momentum and pitch angle code to solve the typical Fokker-Planck equation which governs wave-particle interaction in space plasmas. We carry out detailed calculations of momentum and pitch angle diffusion coefficients, and of the temporal evolution of the pitch angle distribution, for a band of chorus frequencies distributed over a standard Gaussian spectrum, particularly in the heart of the Earth's radiation belt at L = 4.5, where peaks of the electron phase space density are observed. We find that whistler-mode chorus can produce significant acceleration of electrons at large pitch angles, and can enhance the phase space density for energies of 0.5-1 MeV by a factor of 10 or above after about 24 h. This result can account for the observation of significant enhancement in the flux of energetic electrons during the recovery phase of a geomagnetic storm.
Distributed fiber surface plasmon resonance sensor based on the incident angle adjusting method.
Liu, Zhihai; Wei, Yong; Zhang, Yu; Liu, Chunlan; Zhang, Yaxun; Zhao, Enming; Yang, Jun; Liu, Chunyu; Yuan, Libo
2015-10-01
We propose and demonstrate a distributed surface plasmon resonance (SPR) fiber sensor based on a novel, simple, and effective incident angle adjusting method. For normal fiber SPR sensors it is hard to realize distributed sensing, because it is difficult to produce two dynamic ranges (resonance wavebands) with a great difference between them. The dynamic range depends on the incident angle, and therefore we propose an incident angle adjusting method, implemented by grinding an eccentric-core fiber to different angles, which helps to produce SPR wavebands with a great difference between them, thus realizing distributed sensing. In our two cascaded distributed configuration, with a refractive index range of 1.333-1.385, the fiber grind angles are 9° and 17°, the testing wavelength ranges are 613-760 nm and 745-944 nm, and the average testing sensitivities are 2826 nm/RIU and 4738 nm/RIU, respectively. Larger resonance wavelengths are associated with larger testing sensitivities. This distributed fiber sensor has important significance in the fields of multichannel liquid refractive indices and temperature self-reference measurements. PMID:26421554
Audio analysis of statistically instantaneous signals with mixed Gaussian probability distributions
Naik, Ganesh R.; Wang, Wenwu
2012-10-01
In this article, a novel method is proposed to measure the separation qualities of statistically instantaneous audio signals with mixed Gaussian probability distributions. This study evaluates the impact of the Probability Distribution Function (PDF) of the mixed signals on the outcomes of both sub- and super-Gaussian distributions. Different Gaussian measures are evaluated by using various spectral-distortion measures. It aims to compare the different audio mixtures from both super-Gaussian and sub-Gaussian perspectives. Extensive computer simulation confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures. The result based on the objective measures demonstrates the effectiveness of source separation in improving the quality of the separated audio sources.
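Super- and sub-Gaussian character is conventionally diagnosed by the sign of the excess kurtosis (positive for super-Gaussian, negative for sub-Gaussian). A small illustration on synthetic samples, not the article's audio mixtures (Laplace is super-Gaussian, uniform is sub-Gaussian):

```python
import random

random.seed(42)

def excess_kurtosis(xs):
    # Sample excess kurtosis: m4 / m2^2 - 3 (zero for a Gaussian).
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

N = 100_000
# Laplace sample: exponential magnitude with a random sign (super-Gaussian).
laplace = [random.expovariate(1.0) * random.choice((-1, 1)) for _ in range(N)]
uniform = [random.uniform(-1.0, 1.0) for _ in range(N)]   # sub-Gaussian
gauss = [random.gauss(0.0, 1.0) for _ in range(N)]        # reference

k_super = excess_kurtosis(laplace)  # theory: +3 for Laplace
k_sub = excess_kurtosis(uniform)    # theory: -1.2 for uniform
k_gauss = excess_kurtosis(gauss)    # theory: 0
```

The signs alone separate the two regimes, which is the property the spectral-distortion comparison in the article rests on.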
Improving quality of sample entropy estimation for continuous distribution probability functions
Miśkiewicz, Janusz
2016-05-01
Entropy is one of the key parameters characterizing the state of a system in statistical physics. Although entropy is defined for systems described by discrete and continuous probability distribution functions (PDF), in numerous applications the sample entropy is estimated from a histogram, which in fact means that the continuous PDF is represented by a set of probabilities. Such a procedure may lead to ambiguities and even misinterpretation of the results. Within this paper, two possible general algorithms based on continuous PDF estimation are discussed in application to the Shannon and Tsallis entropies. It is shown that the proposed algorithms may improve entropy estimation, particularly in the case of small data sets.
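The histogram issue discussed above is easy to demonstrate: a histogram estimate of differential entropy must be corrected by the bin width, and even then it only approximates the analytic value. A sketch for a Gaussian sample (bin count and sample size are illustrative, and this is the naive estimator, not the paper's proposed algorithms):

```python
import math
import random

random.seed(7)
sigma = 1.0
data = [random.gauss(0.0, sigma) for _ in range(100_000)]

def histogram_entropy(xs, bins=60):
    """Naive differential-entropy estimate from a histogram:
    H ~ -sum_i p_i * ln(p_i) + ln(bin_width)."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(xs)
    h = 0.0
    for c in counts:
        if c:
            p = c / n
            h -= p * math.log(p)
    return h + math.log(width)

h_hist = histogram_entropy(data)
# Analytic differential entropy of N(0, sigma): 0.5 * ln(2*pi*e*sigma^2).
h_exact = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
```

Dropping the `ln(width)` term, as the paper warns, would silently turn the estimate into a discrete entropy that depends on the binning.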
Institute of Scientific and Technical Information of China (English)
RAN Hong-liu
2004-01-01
In recent years, some researchers have studied the paleoearthquakes along the Haiyuan fault and revealed a lot of paleoearthquake events. All available information allows a more reliable analysis of earthquake recurrence intervals and earthquake rupture patterns along the Haiyuan fault. Based on this paleoseismological information, the recurrence probability and magnitude distribution for M≥6.7 earthquakes in the next 100 years along the Haiyuan fault can be obtained through weighted computation by using Poisson and Brownian passage time models and considering different rupture patterns. The result shows that the recurrence probability of MS≥6.7 earthquakes is about 0.035 in the next 100 years along the Haiyuan fault.
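Under the Poisson branch of such a weighted computation, the probability of at least one event in a window of length T is 1 − exp(−T/τ) for a mean recurrence interval τ. A sketch (the interval below is a hypothetical value chosen to reproduce a probability near 0.035, not a figure from the paper):

```python
import math

def poisson_prob(window_years, mean_interval_years):
    """P(at least one event in a window) for a Poisson process:
    P = 1 - exp(-T / tau)."""
    return 1.0 - math.exp(-window_years / mean_interval_years)

# Hypothetical mean recurrence interval (illustrative only).
tau = 2800.0
p100 = poisson_prob(100.0, tau)   # ~0.035 for tau near 2800 yr
```

The Brownian passage time branch replaces this memoryless probability with one that depends on the elapsed time since the last event, which is why the paper combines the two by weighting.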
Storm-Time Evolution of Energetic Electron Pitch Angle Distributions by Wave-Particle Interaction
Institute of Scientific and Technical Information of China (English)
XIAO Fuliang; HE Huiyong; ZHOU Qinghua; WU Guanhong; SHI Xianghua
2008-01-01
The quasi-pure pitch-angle scattering of energetic electrons driven by field-aligned propagating whistler-mode waves during the 9-15 October 1990 magnetic storm at L ≈ 3-4 is studied, and numerical calculations for energetic electrons in gyroresonance with a band of frequencies of whistler-mode waves distributed over a standard Gaussian spectrum are performed. It is found that the whistler-mode waves can efficiently drive energetic electrons from larger pitch angles into the loss cone, and lead to a flat-top distribution during the main phase of geomagnetic storms. This result perhaps presents a feasible interpretation for the observation of the time evolution of the quasi-isotropic pitch-angle distribution by the Combined Release and Radiation Effects Satellite (CRRES) spacecraft at L ≈ 3-4.
Zhou, Alice; O'Hern, Corey; Regan, Lynne
2012-02-01
With the long-term goal to improve the design of protein-protein interactions, we develop a simple hard sphere model for dipeptides that can predict the side-chain dihedral angle distributions of Val and Thr in both the α-helix and β-sheet backbone conformations. We find that it is essential to include the non-polar hydrogens in the model; indeed interatomic clashes involving the non-polar hydrogens largely determine the form of side-chain dihedral angle distributions. Further, we are able to explain key differences in the side-chain dihedral angle distributions for Val and Thr from intra-residue steric clashes rather than inter-residue steric clashes or hydrogen bonding. These results are the crucial first step in developing computational models that can predict the side chain conformations of residues at protein-peptide interfaces.
Gulev, S.
2015-12-01
Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using the framework of probability distributions for sensible and latent heat fluxes. For the approximation of probability distributions and the estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g., 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of probability distributions of surface fluxes and in extreme surface flux values between different reanalyses are found in the western boundary current extension regions and at high latitudes, while the highest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes, where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions. Our assessment also discriminated different reanalyses and satellite products with respect to their ability to quantify the role of extreme surface turbulent fluxes in forming ocean heat release in different regions.
Study on Probability Distributions of Multi-Timescale Aerosol Optical Depth Using AERONET Data
Institute of Scientific and Technical Information of China (English)
WU Lin; ZENG Qing-Cun
2011-01-01
The probability distribution analysis is performed for multi-timescale aerosol optical depth (AOD) using AErosol RObotic NETwork (AERONET) level 2.0 data. The maximum likelihood estimation is employed to determine the best-fit probability density function (PDF), and the statement that the fitted Weibull distribution will be light-tailed is proved true for these AOD samples. The best-fit PDF results for multi-site data show that the PDF of AOD samples with longer timescales at most sites tends to be stably represented by the lognormal distribution, while the Weibull distribution is a better fit for AOD samples with short timescales. The reason for this difference is analyzed through the tail characteristics of the two distributions, and an indicator for the selection between the Weibull and lognormal distributions is suggested and validated. The result of this research is helpful for determining the most accurate AOD statistics for a given site and a given timescale and for validating the retrieved AOD through its PDF.
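A maximum-likelihood comparison of lognormal and Weibull fits, of the kind used to select a best-fit PDF, can be sketched as follows. The data are synthetic, not AERONET samples; lognormal MLE is closed-form, while the Weibull shape is obtained by bisection on the standard profile-likelihood equation:

```python
import math
import random

random.seed(3)
# Synthetic positive "AOD-like" sample drawn from a lognormal law.
data = [random.lognormvariate(-1.5, 0.6) for _ in range(2000)]

# Lognormal MLE is closed-form: mu, sigma are the mean and std of the logs.
logs = [math.log(x) for x in data]
mu = sum(logs) / len(logs)
sig = math.sqrt(sum((l - mu) ** 2 for l in logs) / len(logs))

def loglik_lognormal(xs, mu, sig):
    return sum(-math.log(x * sig * math.sqrt(2 * math.pi))
               - (math.log(x) - mu) ** 2 / (2 * sig ** 2) for x in xs)

def weibull_mle(xs):
    """Weibull MLE: solve the profile equation for the shape k by bisection
    (the left side is monotone increasing in k), then the scale is closed-form."""
    n = len(xs)
    mean_log = sum(math.log(x) for x in xs) / n
    def g(k):
        num = sum(x ** k * math.log(x) for x in xs)
        den = sum(x ** k for x in xs)
        return num / den - 1.0 / k - mean_log
    lo, hi = 0.05, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(x ** k for x in xs) / n) ** (1.0 / k)
    return k, lam

def loglik_weibull(xs, k, lam):
    return sum(math.log(k / lam) + (k - 1) * math.log(x / lam) - (x / lam) ** k
               for x in xs)

k, lam = weibull_mle(data)
ll_ln = loglik_lognormal(data, mu, sig)
ll_wb = loglik_weibull(data, k, lam)  # lognormal should win on lognormal data
```

On data actually drawn from a lognormal, the lognormal log-likelihood comes out higher, mirroring the paper's model-selection step.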
Institute of Scientific and Technical Information of China (English)
Li Wei; Hai-liang Yang
2004-01-01
In this paper we first consider a risk process in which claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve of u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line, and the integrand is not of an oscillating kind. Then we show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.
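The paper's solution is analytic; as a rough cross-check of the same model class, a finite-horizon Monte Carlo estimate of the ruin probability with Erlang(2) inter-arrival times and Pareto claims can be sketched as follows (premium rate, Pareto parameters, and horizon are illustrative assumptions, not the paper's numerical cases):

```python
import random

random.seed(11)

def pareto_claim(alpha=2.5, xm=1.0):
    # Inverse-transform sample from a Pareto(alpha, x_m) claim-size law.
    return xm / (1.0 - random.random()) ** (1.0 / alpha)

def ruin_probability(u, c, n_claims=300, n_paths=4000, rate=1.0):
    """Finite-horizon Monte Carlo estimate of the ruin probability for a
    surplus process u + c*t - sum(claims), with Erlang(2) inter-arrival
    times (sum of two exponentials) and Pareto claim sizes."""
    ruined = 0
    for _ in range(n_paths):
        reserve = u
        for _ in range(n_claims):
            dt = random.expovariate(rate) + random.expovariate(rate)  # Erlang(2)
            reserve += c * dt - pareto_claim()
            if reserve < 0:
                ruined += 1
                break
    return ruined / n_paths

p_small_u = ruin_probability(u=2.0, c=1.0)
p_large_u = ruin_probability(u=20.0, c=1.0)  # more initial reserve, less ruin
```

With these parameters the mean premium income per claim cycle (2.0) exceeds the mean claim (about 1.67), so the estimated ruin probability falls well below one and decreases with the initial reserve, as the analytic solution requires.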
Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence
Directory of Open Access Journals (Sweden)
C. C. Wu
2011-04-01
Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.
Maadooliat, Mehdi
2012-08-27
Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
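The Longuet-Higgins (1963) theory represents the weakly non-Gaussian surface elevation by a Gram-Charlier expansion; at leading order the density is the Gaussian multiplied by (1 + λ₃/6 · He₃(x)). A sketch of this leading-order form (the skewness value is illustrative, not a measured one):

```python
import math

def hermite3(x):
    # Probabilists' Hermite polynomial He_3.
    return x ** 3 - 3 * x

def gram_charlier_pdf(x, skewness):
    """Leading-order Gram-Charlier density for a weakly non-Gaussian surface
    elevation (normalized to zero mean, unit variance), in the spirit of the
    Longuet-Higgins (1963) expansion."""
    phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
    return phi * (1.0 + skewness / 6.0 * hermite3(x))

# With small positive skewness, large crests (x = +2.5) become more probable
# than equally large troughs (x = -2.5), as observed for wind waves.
p_crest = gram_charlier_pdf(2.5, skewness=0.2)
p_trough = gram_charlier_pdf(-2.5, skewness=0.2)

# The skewness correction integrates to zero, so normalization is preserved.
dx = 0.001
total = sum(gram_charlier_pdf(-8.0 + i * dx, 0.2) * dx for i in range(16000))
```

The crest/trough asymmetry produced by the He₃ term is exactly the departure from Gaussianity that the laboratory comparison probes.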
Learning algorithms and probability distributions in feed-forward and feed-back networks
Hopfield, J J
1987-01-01
Learning algorithms have been used both on feed-forward deterministic networks and on feed-back statistical networks to capture input-output relations and do pattern classification. These learning algorithms are examined for a class of problems characterized by noisy or statistical data, in which the networks learn the relation between input data and probability distributions of answers. In simple but nontrivial networks the two learning rules are closely related. Under some circumstances the...
Wigner Function and Phase Probability Distribution of q-Analogue of Squeezed One-Photon State
Institute of Scientific and Technical Information of China (English)
FANG Jian-Shu; MENG Xiang-Guo; ZHANG Xiang-Ping; WANG Ji-Suo; LIANG Bao-Long
2008-01-01
In this paper, in terms of the technique of integration within an ordered product (IWOP) of operators and the properties of the inverses of q-deformed annihilation and creation operators, a normalizable q-analogue of the squeezed one-photon state, which is quite different from the one introduced by Song and Fan [Int. J. Theor. Phys. 41 (2002) 695], is constructed. Moreover, the Wigner function and phase probability distribution of the q-analogue of the squeezed one-photon state are examined.
Dai, Mi; Wang, Yun
2015-01-01
In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdf) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the Joint Lightcurve Analysis (JLA) data set of SNe Ia, we find that sampl...
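The MCMC machinery underlying such lightcurve fits can be reduced to a few lines; the following is a generic random-walk Metropolis sketch sampling a standard normal target, not the JLA analysis code:

```python
import math
import random

random.seed(5)

def metropolis(log_pdf, x0, n_steps, step=0.5):
    """Minimal random-walk Metropolis sampler: propose a Gaussian step and
    accept with probability min(1, pdf(proposal)/pdf(current))."""
    x, lp = x0, log_pdf(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + random.gauss(0.0, step)
        lp_prop = log_pdf(prop)
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Target: standard normal log-density (up to an additive constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=3.0, n_steps=50_000)
burned = chain[5_000:]           # discard burn-in, as in any MCMC analysis
mean = sum(burned) / len(burned)
var = sum((x - mean) ** 2 for x in burned) / len(burned)
```

After burn-in the chain reproduces the target's moments; in the paper's setting, the same chain samples the lightcurve-parameter pdf instead of a toy normal.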
Optimal design of unit hydrographs using probability distribution and genetic algorithms
Indian Academy of Sciences (India)
Rajib Kumar Bhattacharjya
2004-10-01
A nonlinear optimization model is developed to transmute a unit hydrograph into a probability distribution function (PDF). The objective function is to minimize the sum of the square of the deviation between predicted and actual direct runoff hydrograph of a watershed. The predicted runoff hydrograph is estimated by using a PDF. In a unit hydrograph, the depth of rainfall excess must be unity and the ordinates must be positive. Incorporation of a PDF ensures that the depth of rainfall excess for the unit hydrograph is unity, and the ordinates are also positive. Unit hydrograph ordinates are in terms of intensity of rainfall excess on a discharge per unit catchment area basis, the unit area thus representing the unit rainfall excess. The proposed method does not have any constraint. The nonlinear optimization formulation is solved using binary-coded genetic algorithms. The number of variables to be estimated by optimization is the same as the number of probability distribution parameters; gamma and log-normal probability distributions are used. The existing nonlinear programming model for obtaining optimal unit hydrograph has also been solved using genetic algorithms, where the constrained nonlinear optimization problem is converted to an unconstrained problem using penalty parameter approach. The results obtained are compared with those obtained by the earlier LP model and are fairly similar.
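The core idea, representing unit-hydrograph ordinates by a PDF so that unit rainfall-excess depth and nonnegativity hold by construction, can be sketched with a gamma distribution (parameter values are illustrative, not calibrated to any watershed):

```python
import math

def gamma_uh(t, shape, scale):
    """Gamma-pdf unit-hydrograph ordinate at time t; by construction the
    ordinates are nonnegative and integrate to a unit depth of rainfall
    excess, the two constraints the optimization must honor."""
    if t <= 0.0:
        return 0.0
    return (t ** (shape - 1) * math.exp(-t / scale)
            / (math.gamma(shape) * scale ** shape))

# Illustrative parameters (these are what the genetic algorithm would tune).
shape, scale = 3.0, 2.0
dt = 0.01
ordinates = [gamma_uh(i * dt, shape, scale) for i in range(1, 20_000)]
depth = sum(o * dt for o in ordinates)   # -> 1 (unit rainfall-excess depth)
peak_time = (1 + max(range(len(ordinates)), key=ordinates.__getitem__)) * dt
# For a gamma pdf the peak (mode) sits at (shape - 1) * scale = 4.0 here.
```

Because depth and nonnegativity are automatic, the optimizer only has to match the shape of the observed hydrograph, which is why the formulation needs no explicit constraints.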
Probability distributions in a two-parameter scaling theory of localization
Heinrichs, J.
1988-06-01
Probability distributions for the resistance of two- and three-dimensional disordered conductors are studied using a Migdal-Kadanoff-type scaling transformation together with the author's previously derived distributions in one dimension. The present treatment differs from earlier work in two respects: On one hand, it includes the effect of an average potential barrier V experienced by an electron originating from the perfect leads which connect the conductor to a constant-voltage source; on the other hand, the input distribution for one-dimensional systems is based on an exact solution for the effect of the random potential on the complex reflection amplitude of an electron at a certain energy. The scaling equation for probability distributions and for their successive moments are parametrized in terms of the mean resistance, ρ¯, and of a fixed parameter γ related to V. Hence they correspond to a special form of two-parameter scaling. A mobility edge, ρ¯≡ρc, exists only for d>2 and, for d=3, detailed results for ρc, for the conductivity exponent ν, and for the fixed resistance distribution at ρc as a function of γ are presented. The asymptotic distribution of resistance away from the mobility edge for d=3, and in both small- and large-resistance regimes for d=2 are also studied. In the metallic regime for d>2 our treatment yields two distinct distributions, one of which is characterized by Ohm's law for the mean resistance and the other one by Ohm's law for the mean conductance. In the latter case the fluctuations of conductivity are independent of sample size for large samples. The calculated distributions are generally broad and in the localized regime, for d=3 and d=2, the rms values of resistance dominate the mean values in the infinite-sample limit.
A laser speckle sensor to measure the distribution of static torsion angles of twisted targets
DEFF Research Database (Denmark)
Rose, B.; Imam, H.; Hanson, Steen Grüner;
1998-01-01
A novel method for measuring the distribution of static torsion angles of twisted targets is presented. The method is based on Fourier transforming the scattered field in the direction perpendicular to the twist axis, while performing an imaging operation in the direction parallel to the axis. A cylindrical lens serves to image the closely spaced lateral positions of the target along the twist axis onto corresponding lines of the two-dimensional image sensor. Thus, every single line of the image sensor measures the torsion angle of the corresponding surface position along the twist axis of the target. Experimentally, we measure the distribution of torsion angles in both uniform and non-uniform deformation zones. It is demonstrated both theoretically and experimentally that the measurements are insensitive to object shape and target distance if the image sensor is placed in the Fourier plane. A straightforward...
Criticality of the net-baryon number probability distribution at finite density
Morita, Kenji; Redlich, Krzysztof
2014-01-01
We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\\mu/T<1$, the model exhibits the chiral crossover transition which belongs to the universality class of the $O(4)$ spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, $P(N)$. By considering ratios of $P(N)$ to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to $O(4)$ criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine $O(4)$ criticality in the context of binomial and negative-binomial distributions for the net proton number.
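The Skellam baseline used in these ratios is the distribution of the difference of two independent Poisson counts; its pmf involves a modified Bessel function. A self-contained sketch (the Poisson means below are illustrative, not fitted to RHIC data):

```python
import math

def bessel_i(n, z, terms=60):
    # Modified Bessel function I_n(z), integer n >= 0, via its power series.
    s = 0.0
    for m in range(terms):
        s += (z / 2.0) ** (2 * m + n) / (math.factorial(m) * math.factorial(m + n))
    return s

def skellam_pmf(N, b_plus, b_minus):
    """Skellam pmf for a net count N = N_plus - N_minus of two independent
    Poisson variables with means b_plus and b_minus -- the baseline against
    which the O(4) features of P(N) are isolated."""
    return (math.exp(-(b_plus + b_minus))
            * (b_plus / b_minus) ** (N / 2.0)
            * bessel_i(abs(N), 2.0 * math.sqrt(b_plus * b_minus)))

b_plus, b_minus = 5.0, 4.0
probs = {N: skellam_pmf(N, b_plus, b_minus) for N in range(-30, 41)}
total = sum(probs.values())
mean = sum(N * p for N, p in probs.items())                  # b_plus - b_minus
var = sum(N * N * p for N, p in probs.items()) - mean ** 2   # b_plus + b_minus
```

Matching the mean and variance to the data, as in the paper, fixes both Poisson parameters, so any remaining structure in the ratio P(N)/Skellam signals non-Poissonian (here, critical) physics.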
Photometric Redshift Probability Distributions for Galaxies in the SDSS DR8
Sheldon, Erin S; Mandelbaum, Rachel; Brinkmann, J; Weaver, Benjamin A
2011-01-01
We present redshift probability distributions for galaxies in the SDSS DR8 imaging data. We used the nearest-neighbor weighting algorithm presented in Lima et al. 2008 and Cunha et al. 2009 to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We then estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training set redshifts. We derived P(z)s for individual objects using the same technique, but limiting it to training set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy, rather than an ensemble N(z), can reduce the statistical error in measurements t...
Directory of Open Access Journals (Sweden)
Tong Yifei
2014-01-01
Full Text Available Crane is a mechanical device used widely to move materials in modern production. It is reported that the energy consumption of China is at least 5-8 times that of other developing countries. Thus, energy consumption becomes an unavoidable topic. There are several reasons influencing the energy loss, and the camber of the girder is one not to be neglected. In this paper, the problem of the deflections induced by a moving payload in the girder of an overhead travelling crane is examined. The evaluation of a camber giving a counterdeflection of the girder is proposed in order to obtain minimum energy consumption for the trolley moving along a nonstraight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads involved in other research. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The research results provide a design reference for the reasonable camber achieving the least energy consumption for climbing corresponding to different P0; thus energy-saving design can be achieved.
Criticality of the net-baryon number probability distribution at finite density
Directory of Open Access Journals (Sweden)
Kenji Morita
2015-02-01
Full Text Available We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T<1, the model exhibits the chiral crossover transition which belongs to the universality class of the O(4) spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, P(N). By considering ratios of P(N) to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to O(4) criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine O(4) criticality in the context of binomial and negative-binomial distributions for the net proton number.
Energy Technology Data Exchange (ETDEWEB)
Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor
2015-01-01
SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I was not known as a function of angle of incidence, wavelength and polarization of the incident light. The targeted application area of this sensor is next-generation PET detector modules, where it will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm broad wavelength interval around the characteristic emission peak (λ=420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced, appearing only above 50°.
Vinogradov, S
2011-01-01
Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown limited by strong negative feedback. SSPMs can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their high importance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals, based on the Borel distribution as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement...
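A minimal numeric check of the Borel model described above (a sketch under stated assumptions, not the author's code): the Borel pmf P(n) = (λn)^(n−1) e^(−λn)/n! gives the total avalanche size of a branching Poisson process with branching parameter λ, has mean 1/(1−λ), and its second-moment excess noise factor works out to 1/(1−λ):

```python
import math

def borel_pmf(n, lam):
    """Borel distribution: total avalanche size in a branching Poisson
    process with branching parameter lam (0 <= lam < 1)."""
    if n < 1:
        return 0.0
    return math.exp((n - 1) * math.log(lam * n) - lam * n - math.lgamma(n + 1))

lam = 0.2                                  # illustrative crosstalk parameter
ns = range(1, 400)
probs = [borel_pmf(n, lam) for n in ns]

mean = sum(n * p for n, p in zip(ns, probs))
var = sum((n - mean) ** 2 * p for n, p in zip(ns, probs))
enf = 1.0 + var / mean ** 2                # ENF = <m^2>/<m>^2
```

Both the mean and the ENF should numerically match the closed form 1/(1−λ) = 1.25 for λ = 0.2.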
The probability distribution functions of emission line flux measurements and their ratios
Wesson, R; Scicluna, P
2016-01-01
Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking thi...
Kurugol, Sila; Freiman, Moti; Afacan, Onur; Perez-Rossello, Jeannette M; Callahan, Michael J; Warfield, Simon K
2016-08-01
Quantitative diffusion-weighted MR imaging (DW-MRI) of the body enables characterization of the tissue microenvironment by measuring variations in the mobility of water molecules. The diffusion signal decay model parameters are increasingly used to evaluate various diseases of abdominal organs such as the liver and spleen. However, previous signal decay models (i.e., mono-exponential, bi-exponential intra-voxel incoherent motion (IVIM) and stretched exponential models) only provide insight into the average of the distribution of the signal decay rather than explicitly describe the entire range of diffusion scales. In this work, we propose a probability distribution model of incoherent motion that uses a mixture of Gamma distributions to fully characterize the multi-scale nature of diffusion within a voxel. Further, we improve the robustness of the distribution parameter estimates by integrating spatial homogeneity prior into the probability distribution model of incoherent motion (SPIM) and by using the fusion bootstrap solver (FBM) to estimate the model parameters. We evaluated the improvement in quantitative DW-MRI analysis achieved with the SPIM model in terms of accuracy, precision and reproducibility of parameter estimation in both simulated data and in 68 abdominal in-vivo DW-MRIs. Our results show that the SPIM model not only substantially reduced parameter estimation errors by up to 26%; it also significantly improved the robustness of the parameter estimates (paired Student's t-test, p < 0.0001) by reducing the coefficient of variation (CV) of estimated parameters compared to those produced by previous models. In addition, the SPIM model improves the parameter estimates reproducibility for both intra- (up to 47%) and inter-session (up to 30%) estimates compared to those generated by previous models. Thus, the SPIM model has the potential to improve accuracy, precision and robustness of quantitative abdominal DW-MRI analysis for clinical applications. PMID
Impact of spike train autostructure on probability distribution of joint spike events.
Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl
2013-05-01
The discussion whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability for false positives if a different process type is assumed as was actually present. We also determine how manipulations of such spike trains, here dithering, used for the generation of surrogate data change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the structure of the spike trains as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing. PMID:23470124
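The coincidence statistics discussed above can be sketched with a small Monte Carlo (an illustrative construction, not the authors' code; rate, gamma shape, and bin width are assumed values): simulate pairs of independent gamma renewal processes, count bins in which both fire, and compute the Fano factor of the coincidence counts.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_train(rate, shape, T, rng):
    """Renewal spike train with gamma ISIs; mean rate `rate`, CV = 1/sqrt(shape)."""
    n = int(rate * T * 3) + 50                     # generous ISI budget
    isi = rng.gamma(shape, 1.0 / (rate * shape), size=n)
    t = np.cumsum(isi)
    return t[t < T]

def coincidence_count(t1, t2, binw, T):
    """Number of bins in which both trains fire at least once."""
    bins = np.arange(0.0, T + binw, binw)
    c1 = np.histogram(t1, bins)[0] > 0
    c2 = np.histogram(t2, bins)[0] > 0
    return int(np.sum(c1 & c2))

T, binw, rate, shape = 1.0, 0.005, 20.0, 2.0       # shape=2 -> CV ~ 0.71
counts = np.array([coincidence_count(gamma_train(rate, shape, T, rng),
                                     gamma_train(rate, shape, T, rng),
                                     binw, T)
                   for _ in range(2000)])
ffc = counts.var(ddof=1) / counts.mean()           # Fano factor of coincidences
```

Repeating this while varying `shape` (and hence the CV) traces out the nonmonotonic FFc-versus-CV dependence the study reports.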
Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case
Energy Technology Data Exchange (ETDEWEB)
Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas
2004-08-01
The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it is not accurate for representing some wind regimes, as in the case of La Ventosa, Mexico. The analysis of wind data shows that the capacity factor for wind power plants to be installed in La Ventosa must be computed by means of a bimodal PDF instead of the typical Weibull PDF. Otherwise, the capacity factor will be underestimated. (author)
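The bimodal Weibull-and-Weibull PDF has a simple closed form: a weighted mixture of two two-parameter Weibull densities. The sketch below uses illustrative parameters (a calm-dominated mode plus a strong-wind mode), not the fitted La Ventosa values, and checks normalization and the resulting mean wind speed:

```python
import numpy as np

def weibull_pdf(v, k, c):
    """Two-parameter Weibull PDF with shape k and scale c."""
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

def bimodal_pdf(v, w, k1, c1, k2, c2):
    """Weighted mixture of two Weibull PDFs (weight w on the first mode)."""
    return w * weibull_pdf(v, k1, c1) + (1 - w) * weibull_pdf(v, k2, c2)

# Illustrative parameters only
w, k1, c1, k2, c2 = 0.4, 2.0, 4.0, 3.0, 11.0
v = np.linspace(1e-6, 40.0, 8000)
dv = v[1] - v[0]
pdf = bimodal_pdf(v, w, k1, c1, k2, c2)

area = pdf.sum() * dv              # should integrate to ~1
mean_v = (v * pdf).sum() * dv      # mean wind speed under the mixture
```

A capacity-factor calculation would integrate the turbine power curve against this PDF instead of a single Weibull, which is exactly where the underestimation mentioned above arises.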
Study of the SEMG probability distribution of the paretic tibialis anterior muscle
Cherniz, Analía S.; Bonell, Claudia E.; Tabernig, Carolina B.
2007-11-01
The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In subjects with lesions of the upper motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function is the one that best fits the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.
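The Gaussian-versus-Laplacian comparison can be made concrete with a maximum-likelihood contest on amplitude samples. The data below are surrogate draws (an assumption for illustration; real SEMG amplitudes would replace `x`):

```python
import numpy as np

rng = np.random.default_rng(1)

# Surrogate "SEMG amplitude" samples, drawn Laplacian for illustration
x = rng.laplace(loc=0.0, scale=1.0, size=20000)

# Maximum-likelihood fits of the two candidate models
mu, sigma = x.mean(), x.std()          # Gaussian: mean and std
m = np.median(x)                       # Laplacian: median ...
b = np.mean(np.abs(x - m))             # ... and mean absolute deviation

ll_gauss = np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                  - (x - mu) ** 2 / (2 * sigma ** 2))
ll_laplace = np.sum(-np.log(2 * b) - np.abs(x - m) / b)
```

The model with the higher total log-likelihood is the better amplitude-distribution candidate for that data segment; for Laplacian-like data, `ll_laplace` wins.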
Choi, Sung-Hwan; Kim, Seong-Jin; Lee, Kee-Joon; Sung, Sang-Jin; Chun, Youn-Sic
2016-01-01
Objective The purpose of this study was to analyze stress distributions in the roots, periodontal ligaments (PDLs), and bones around cylindrical and tapered miniscrews inserted at different angles using a finite element analysis. Methods We created a three-dimensional (3D) maxilla model of a dentition with extracted first premolars and used 2 types of miniscrews (tapered and cylindrical) with 1.45-mm diameters and 8-mm lengths. The miniscrews were inserted at 30°, 60°, and 90° angles with respect to the bone surface. A simulated horizontal orthodontic force of 2 N was applied to the miniscrew heads. Then, the stress distributions, magnitudes during miniscrew placement, and force applications were analyzed with a 3D finite element analysis. Results Stresses were primarily absorbed by cortical bone. Moreover, very little stress was transmitted to the roots, PDLs, and cancellous bone. During cylindrical miniscrew insertion, the maximum von Mises stress increased as insertion angle decreased. Tapered miniscrews exhibited greater maximum von Mises stress than cylindrical miniscrews. During force application, maximum von Mises stresses increased in both groups as insertion angles decreased. Conclusions For both cylindrical and tapered miniscrew designs, placement as perpendicular to the bone surface as possible is recommended to reduce stress in the surrounding bone. PMID:27478796
EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS
Institute of Scientific and Technical Information of China (English)
WANG Qingyuan (王清远); N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI
2003-01-01
Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials,loading,and environmental conditions.A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e.the pit nucleation and growth,pit-crack transition,short- and long-crack propagation).The probabilistic model investigated considers the uncertainties in the initial pit size,corrosion pitting current,and material properties due to the scatter found in the experimental data.Monte Carlo simulations were performed to define the failure probability distribution.Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.
Inhomogeneous broadening of PAC spectra with Vzz and η joint probability distribution functions
Evenson, W. E.; Adams, M.; Bunker, A.; Hodges, J.; Matheson, P.; Park, T.; Stufflebeam, M.; Zacate, M. O.
2013-05-01
The perturbed angular correlation (PAC) spectrum, G2(t), is broadened by the presence of randomly distributed defects in crystals due to a distribution of electric field gradients (EFGs) experienced by probe nuclei. Heuristic approaches to fitting spectra that exhibit such inhomogeneous broadening (ihb) consider only the distribution of EFG magnitudes Vzz, but the physical effect actually depends on the joint probability distribution function (pdf) of Vzz and the EFG asymmetry parameter η. The difficulty in determining the joint pdf leads us to more appropriate representations of the EFG coordinates, and to express the joint pdf as the product of two approximately independent pdfs describing each coordinate separately. We have pursued this case in detail using as an initial illustration of the method a simple point defect model with nuclear spin I = 5/2 in several cubic lattices, where G2(t) is primarily induced by a defect trapped in the first neighbor shell of a probe and broadening is due to defects distributed at random outside the first neighbor shell. Effects such as lattice relaxation are ignored in this simple test of the method. The simplicity of our model is suitable for gaining insight into ihb with more than Vzz alone. We simulate ihb in this simple case by averaging the net EFGs of 20,000 random defect arrangements, resulting in a broadened average G2(t). The 20,000 random cases provide a distribution of EFG components which are first transformed to Czjzek coordinates and then further into the full Czjzek half plane by conformal mapping. The topology of this transformed space yields an approximately separable joint pdf for the EFG components. We then fit the nearly independent pdfs and reconstruct G2(t) as a function of defect concentration. We report results for distributions of defects on simple cubic, face-centered cubic, and body-centered cubic lattices. The method explored here for analyzing ihb is applicable to more realistic cases.
Liquid-crystal variable-focus lenses with spatially distributed tilt angles.
Honma, Michinori; Nose, Toshiaki; Yanase, Satoshi; Yamaguchi, Rumiko; Sato, Susumu
2009-06-22
A pretilt angle control method based on the density of rubbings made with a tiny stylus is proposed. Control of the surface pretilt angle is achieved by rubbing a side-chain-type polyimide film for homeotropic alignment. A smooth liquid crystal (LC) director distribution in the bulk layer is successfully obtained despite the rough surface orientation. This approach is applied to LC cylindrical and rectangular lenses with a variable-focusing function. The distribution profile of the rubbing pitch (the reciprocal of the rubbing density) for small aberration is determined to be quadratic. The variable-focusing function is successfully achieved in the LC rectangular lens, and the voltage dependence of the focal length is explained in terms of the LC molecular reorientation behavior. PMID:19550499
Directory of Open Access Journals (Sweden)
Limin Wang
2015-06-01
Full Text Available As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions once proposed and widely used for a Bayesian network are not appropriate for a Bayesian classifier, in which the class variable C is considered as a distinguished one. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of joint probability distribution. By establishing the mapping relationship between conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of the Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically, rather than rigidly. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.
International Nuclear Information System (INIS)
The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework
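A one-dimensional caricature of the mechanism described above can be simulated directly (an illustrative sketch under stated assumptions: a single scalar SDE with linear damping and state-dependent, CAM-style noise; all parameters are invented for the demonstration). The multiplicative part of the noise makes the stationary density skewed, mimicking the observed departures from Gaussianity:

```python
import numpy as np

rng = np.random.default_rng(2)

# dx = -lam*x dt + sigma*(1 + b*x) dW, integrated by Euler-Maruyama
lam, sigma, b = 1.0, 0.5, 0.5
dt, nsteps, burn = 0.01, 200_000, 10_000

x = 0.0
samples = np.empty(nsteps)
noise = rng.normal(0.0, np.sqrt(dt), nsteps)
for i in range(nsteps):
    x += -lam * x * dt + sigma * (1.0 + b * x) * noise[i]
    samples[i] = x

xs = samples[burn:]                                  # discard transient
skew = np.mean((xs - xs.mean()) ** 3) / xs.std() ** 3
```

With b > 0 the noise amplitude grows with x, so excursions to positive x decay more noisily and the stationary distribution acquires a heavy right tail (positive skewness); setting b = 0 recovers a Gaussian.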
Directory of Open Access Journals (Sweden)
Rani K
2014-08-01
Full Text Available Photocopied documents are very common in everyday life. People are permitted to carry and present photocopied documents to avoid damage to the original documents. But this provision is misused for temporary benefit by fabricating fake photocopied documents. Fabrication of a fake photocopied document is possible only in the 2nd and higher recursive orders of photocopying. Whenever a photocopied document is submitted, it may be required to check its originality. When the document is a 1st-order photocopy, the chance of fabrication may be ignored. On the other hand, when the photocopy order is 2nd or above, the probability of fabrication may be suspected. Hence, when a photocopied document is presented, the recursive order number of the photocopy must be estimated to ascertain originality. This requirement demands methods to estimate the order number of a photocopy. In this work, a voting-based approach to detect the recursive order number of a photocopied document using the exponential, extreme value and lognormal probability distributions is proposed. A detailed experiment performed on a generated data set shows that the method achieves an efficiency close to 89%.
Wave Packet Dynamics in the Infinite Square Well with the Wigner Quasi-probability Distribution
Belloni, Mario; Doncheski, Michael; Robinett, Richard
2004-05-01
Over the past few years a number of authors have been interested in the time evolution and revivals of Gaussian wave packets in one-dimensional infinite wells and in two-dimensional infinite wells of various geometries. In all of these circumstances, the wave function is guaranteed to revive at a time related to the inverse of the system's ground state energy, if not sooner. To better visualize these revivals we have calculated the time-dependent Wigner quasi-probability distribution for position and momentum, P_W(x; p), for Gaussian wave packet solutions of this system. The Wigner quasi-probability distribution clearly demonstrates the short-term semi-classical time dependence, as well as longer-term revival behavior and the structure during the collapsed state. This tool also provides an excellent way of demonstrating the patterns of highly-correlated Schrödinger-cat-like `mini-packets' which appear at fractional multiples of the exact revival time. This research is supported in part by a Research Corporation Cottrell College Science Award (CC5470) and the National Science Foundation under contracts DUE-0126439 and DUE-9950702.
Probability distribution of the index in gauge theory on 2d non-commutative geometry
Aoki, Hajime; Nishimura, Jun; Susaki, Yoshiaki
2007-10-01
We investigate the effects of non-commutative geometry on the topological aspects of gauge theory using a non-perturbative formulation based on the twisted reduced model. The configuration space is decomposed into topological sectors labeled by the index ν of the overlap Dirac operator satisfying the Ginsparg-Wilson relation. We study the probability distribution of ν by Monte Carlo simulation of the U(1) gauge theory on 2d non-commutative space with periodic boundary conditions. In general the distribution is asymmetric under ν ↦ −ν, reflecting the parity violation due to non-commutative geometry. In the continuum and infinite-volume limits, however, the distribution turns out to be dominated by the topologically trivial sector. This conclusion is consistent with the instanton calculus in the continuum theory. However, it is in striking contrast to the known results in the commutative case obtained from lattice simulation, where the distribution is Gaussian in a finite volume, but the width diverges in the infinite-volume limit. We also calculate the average action in each topological sector, and provide deeper understanding of the observed phenomenon.
Difficulties arising from the representation of the measurand by a probability distribution
International Nuclear Information System (INIS)
This paper identifies difficulties associated with the concept of representing fixed unknown quantities by probability distributions. This concept, which we refer to as the distributed-measurand concept, is at the heart of the approach to the evaluation of measurement uncertainty described in Supplement 1 to the Guide to the Expression of Uncertainty in Measurement. The paper notes (i) the resulting lack of invariance of measurement results to nonlinear reparametrizations of the measurement problem, (ii) the potential undetected divergence of measurement estimates obtained by Monte Carlo evaluation, (iii) the potential failure of the methodology to give uncertainty intervals enclosing the values of the measurands with an acceptable frequency and (iv) the potential loss of measurement precision. The distributed-measurand concept is gaining popularity partly because of its association with analysis using the Monte Carlo principle. However, the Monte Carlo principle is also applicable without adopting the distributed-measurand concept. Accordingly, an alternative approach to the evaluation of measurement uncertainty is briefly described
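The lack of invariance under nonlinear reparametrization is easy to demonstrate with a short Monte Carlo (a generic illustration, not an example from the paper): if X is represented by an N(1, 0.5²) distribution and the measurand is re-expressed as Y = X², the mean of the distribution of Y differs from the square of the mean of X.

```python
import numpy as np

rng = np.random.default_rng(3)

# Distributed-measurand view: X represented by a probability distribution
x = rng.normal(loc=1.0, scale=0.5, size=1_000_000)

y = x ** 2                       # nonlinear reparametrization Y = X^2
est_from_dist = y.mean()         # estimate taken from the distribution of Y
est_from_x = x.mean() ** 2       # square of the estimate of X

# Analytically: E[X^2] = mu^2 + sigma^2 = 1.25, while (E[X])^2 = 1.0
```

The two "estimates of the same quantity" disagree by sigma², which is one concrete face of the non-invariance criticized in the paper.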
Institute of Scientific and Technical Information of China (English)
李军超; 杨芬芬; 周志强
2015-01-01
Although multi-stage incremental sheet forming has often been adopted instead of single-stage forming to form parts with a steep wall angle or to achieve high forming performance, it largely depends on empirical design. In order to research multi-stage forming further, the effects of the number of forming stages (n) and the angle interval between two adjacent stages (Δα) on thickness distribution were investigated. Firstly, a finite element method (FEM) model of multi-stage incremental forming was established and experimentally verified. Then, based on the proposed simulation model, different strategies were adopted to form a frustum of a cone with a wall angle of 30° to study the thickness distribution of multi-pass forming. It is shown that the minimum thickness increases considerably and the variance of sheet thickness decreases significantly as n grows. Further, with the increase of Δα, the minimum thickness first increases and then decreases, and the optimal thickness distribution is achieved with Δα of 10°. Additionally, a formula is deduced to estimate the sheet thickness after multi-stage forming and proved to be effective. The simulation results fit well with the experimental results.
The probability distribution functions of emission line flux measurements and their ratios
Wesson, R.; Stock, D. J.; Scicluna, P.
2016-07-01
Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12 126 spectra from the Sloan Digital Sky Survey (SDSS). The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking this effect into account, we derive an improved estimate of the intrinsic 5007/4959 ratio. We obtain a value of 3.012 ± 0.008, which is slightly but statistically significantly higher than the theoretical value of 2.98. We further investigate the suggestion that fluxes measured from emission lines in noisy spectra are strongly biased upwards. We were unable to detect this effect in the SDSS line flux measurements, and we could not reproduce the results of Rola and Pelat who first described this bias. We suggest that the magnitude of this effect may depend strongly on the specific fitting algorithm used.
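The skewness of a ratio with a noisy denominator is easy to reproduce by Monte Carlo (the line fluxes and uncertainties below are invented for illustration, not the SDSS values):

```python
import numpy as np

rng = np.random.default_rng(4)

# Stronger line (numerator) with small relative error; weaker line
# (denominator) with a larger relative error -- illustrative values only
A = rng.normal(3.0, 0.15, size=1_000_000)
B = rng.normal(1.0, 0.20, size=1_000_000)
r = A / B

mean_r = r.mean()
skew_r = np.mean((r - mean_r) ** 3) / r.std() ** 3

# To second order, E[A/B] ~ (muA/muB) * (1 + (sigmaB/muB)^2) ~ 3 * 1.04,
# i.e. the mean of the ratio is biased above the ratio of the means.
```

The sample distribution is right-skewed with a mode below its mean, which is why the conventional strong-over-weak convention for the [O III] ratio produces non-Gaussian, biased estimates.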
DEFF Research Database (Denmark)
Yura, Harold; Hanson, Steen Grüner
2012-01-01
Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
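The two-step recipe described (spectral coloring of white Gaussian noise, then a memoryless inverse transform of the marginal) can be sketched in one dimension as follows; the Lorentzian spectrum and the exponential target marginal are illustrative assumptions, not choices from the patent:

```python
import math
import numpy as np

rng = np.random.default_rng(5)
n = 1 << 15

# Step 1: white Gaussian noise, colored in the Fourier domain to impose
# an (assumed) Lorentzian power spectral density
white = rng.normal(size=n)
f = np.fft.rfftfreq(n)
amp = 1.0 / np.sqrt(1.0 + (f / 0.02) ** 2)       # assumed target |H(f)|
colored = np.fft.irfft(np.fft.rfft(white) * amp, n)
colored /= colored.std()                         # standardize the marginal

# Step 2: memoryless transform -- Gaussian CDF followed by the inverse
# CDF of the desired amplitude distribution (unit-mean exponential here)
u = 0.5 * (1.0 + np.vectorize(math.erf)(colored / math.sqrt(2.0)))
u = np.clip(u, 1e-12, 1.0 - 1e-12)
signal = -np.log(1.0 - u)                        # exponential marginal
```

The output keeps (approximately) the imposed correlation structure while its one-point statistics follow the target distribution, which is the essence of the disclosed method.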
Detection of two power-law tails in the probability distribution functions of massive GMCs
Schneider, N; Girichidis, P; Rayner, T; Motte, F; Andre, P; Russeil, D; Abergel, A; Anderson, L; Arzoumanian, D; Benedettini, M; Csengeri, T; Didelon, P; Francesco, J D; Griffin, M; Hill, T; Klessen, R S; Ossenkopf, V; Pezzuto, S; Rivera-Ingraham, A; Spinoglio, L; Tremblin, P; Zavagno, A
2015-01-01
We report the novel detection of complex high-column density tails in the probability distribution functions (PDFs) for three high-mass star-forming regions (CepOB3, MonR2, NGC6334), obtained from dust emission observed with Herschel. The low column density range can be fit with a lognormal distribution. A first power-law tail starts above an extinction (Av) of ~6-14. It has a slope of alpha=1.3-2 for the rho~r^-alpha profile for an equivalent density distribution (spherical or cylindrical geometry), and is thus consistent with free-fall gravitational collapse. Above Av~40, 60, and 140, we detect an excess that can be fitted by a flatter power-law tail with alpha>2. It correlates with the central regions of the cloud (ridges/hubs) of size ~1 pc and densities above 10^4 cm^-3. This excess may be caused by physical processes that slow down collapse and reduce the flow of mass towards higher densities. Possible causes are: (1) rotation, which introduces an angular momentum barrier; (2) increasing optical depth and weaker...
Directory of Open Access Journals (Sweden)
Han Liwei
2014-07-01
Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model. So the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. Then an improved algorithm of cloud probability distribution density based on a backward cloud generator was proposed. This was used to effectively convert parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such a qualitative description was expressed as the cloud numerical characteristics {Ex, En, He}, which can represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible; through it, the changing regularity of the piezometric tube's water level could be revealed and seepage damage in the dam body could be detected.
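In its common moment-based form (an assumption here; the paper's improved variant differs in detail), a backward cloud generator reduces to three estimates from the sample moments:

```python
import numpy as np

def backward_cloud(data):
    """Estimate cloud numerical characteristics {Ex, En, He} from samples.

    Moment-based backward cloud generator: Ex is the sample mean, En is
    sqrt(pi/2) times the mean absolute deviation, and He^2 is whatever
    of the sample variance remains after removing En^2.
    """
    data = np.asarray(data, dtype=float)
    ex = data.mean()
    en = np.sqrt(np.pi / 2.0) * np.mean(np.abs(data - ex))
    he2 = data.var(ddof=1) - en ** 2
    he = np.sqrt(max(he2, 0.0))      # clamp sampling noise below zero
    return ex, en, he

# Synthetic "piezometric level" readings (illustrative, Gaussian)
rng = np.random.default_rng(6)
ex, en, he = backward_cloud(rng.normal(10.0, 2.0, size=50000))
```

For purely Gaussian data, En recovers the standard deviation and He is near zero; a persistently large He signals the extra "hyper-entropy" uncertainty that the cloud model is designed to capture.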
Kundu, Prasun K
2015-01-01
Rainfall exhibits extreme variability at many space and time scales and calls for a statistical description. Based on an analysis of radar measurements of precipitation over the tropical oceans, we introduce a new probability law for the area-averaged rain rate constructed from the class of log-infinitely divisible distributions that accurately describes the frequency of the most intense rain events. The dependence of its parameters on the spatial averaging length L allows one to relate spatial statistics at different scales. In particular, it enables us to explain the observed power law scaling of the moments of the data and successfully predicts the continuous spectrum of scaling exponents expressing multiscaling characteristics of the rain intensity field.
EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS
Institute of Scientific and Technical Information of China (English)
王清远; N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI
2003-01-01
Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. pit nucleation and growth, pit-crack transition, short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.
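The Monte Carlo step can be illustrated with a deliberately simplified pit-growth model. The life equation, parameter values, and input distributions below are illustrative assumptions for the sketch, not the paper's calibrated model:

```python
import math
import random

random.seed(1)

def fatigue_life(pit_depth0, pit_current, crit_depth=1.0):
    """Toy pit-growth life model: Faraday-like growth gives
    depth(t)^3 = d0^3 + c*t, so life is the time for the pit to
    reach a critical depth. Names and units are illustrative."""
    return (crit_depth ** 3 - pit_depth0 ** 3) / pit_current

# Propagate uncertainty in initial pit size and pitting current,
# then read off the empirical failure-probability distribution.
lives = sorted(
    fatigue_life(random.lognormvariate(math.log(0.01), 0.3),
                 random.lognormvariate(math.log(2e-3), 0.5))
    for _ in range(10000)
)
median_life = lives[len(lives) // 2]  # 50% failure probability
```

Sorting the sampled lives gives the empirical CDF directly: the k-th smallest life corresponds to failure probability k/N.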
Andrade, Daniel
2012-01-01
We present a new method to propagate lower bounds on conditional probability distributions in conventional Bayesian networks. Our method is guaranteed to provide outer approximations of the exact lower bounds. A key advantage is that we can use any available algorithms and tools for Bayesian networks to represent and infer lower bounds. The new method yields results that are provably exact for trees with binary variables, and results that are competitive with existing approximations in credal networks for all other network structures. Our method is not limited to a specific kind of network structure and, in principle, is not restricted to a specific kind of inference, but we restrict our analysis to prognostic inference in this article. The computational complexity compares favourably with that of other existing approaches.
The HI Probability Distribution Function and the Atomic-to-Molecular Transition in Molecular Clouds
Imara, Nia
2016-01-01
We characterize the column density probability distribution functions (PDFs) of the atomic hydrogen gas, HI, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic HI Survey to derive column density maps and PDFs. We find that the peaks of the HI PDFs occur at column densities ranging from ~1-2$\times 10^{21}$ cm$^{-2}$ (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of $\sigma_{HI}\approx 10^{20}$ cm$^{-2}$ (~0.1 mag). We also investigate the HI-to-H$_2$ transition towards the cloud complexes and estimate HI surface densities ranging from 7-16 $M_\odot$ pc$^{-2}$ at the transition. We propose that the HI PDF is a fitting tool for identifying the HI-to-H$_2$ transition column in Galactic MCs.
The H I Probability Distribution Function and the Atomic-to-molecular Transition in Molecular Clouds
Imara, Nia; Burkhart, Blakesley
2016-10-01
We characterize the column-density probability distribution functions (PDFs) of the atomic hydrogen gas, H I, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic H I Survey to derive column-density maps and PDFs. We find that the peaks of the H I PDFs occur at column densities in the range ~1-2 × 10^21 cm^-2 (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of σ_HI ≈ 10^20 cm^-2 (~0.1 mag). We also investigate the H I-to-H2 transition toward the cloud complexes and estimate H I surface densities ranging from 7 to 16 M_⊙ pc^-2 at the transition. We propose that the H I PDF is a fitting tool for identifying the H I-to-H2 transition column in Galactic MCs.
Irreversible models with Boltzmann–Gibbs probability distribution and entropy production
International Nuclear Information System (INIS)
We analyze irreversible interacting spin models evolving according to a master equation with spin flip transition rates that do not obey detailed balance but obey global balance with a Boltzmann–Gibbs probability distribution. Spin flip transition rates with up–down symmetry are obtained for a linear chain, a square lattice, and a cubic lattice with a stationary state corresponding to the Ising model with nearest neighbor interactions. We show that these irreversible dynamics describe the contact of the system with particle reservoirs that cause a flux of particles through the system. Using a microscopic definition, we determine the entropy production rate of these irreversible models and show that it can be written as a macroscopic bilinear form in the forces and fluxes. Exact expressions for this property are obtained for the linear chain and the square lattice. In the latter case the entropy production rate displays a singularity at the phase transition point of the same type as the entropy itself.
Wenger, S.J.; Freeman, Mary C.
2008-01-01
Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, a zero-inflated modeling approach that accounts for incomplete detection yields a superior fit to the data relative to other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered whenever replicate count data are available.
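The core of such a combined model is the site-level likelihood, which mixes a zero-inflation (occupancy) term with an abundance term observed through imperfect detection. A minimal sketch, assuming a Poisson abundance and per-individual binomial detection; the authors' exact parameterization may differ:

```python
import math

def pois(n, lam):
    """Poisson pmf."""
    return math.exp(-lam) * lam ** n / math.factorial(n)

def binom(y, n, p):
    """Binomial pmf (0 if more individuals detected than present)."""
    if y > n:
        return 0.0
    return math.comb(n, y) * p ** y * (1 - p) ** (n - y)

def site_likelihood(counts, psi, lam, p, n_max=100):
    """Likelihood of replicate counts at one site.
    psi: occupancy probability, lam: mean abundance given occupancy,
    p: per-individual detection probability per replicate."""
    # Occupied site: marginalize over the latent true abundance N
    occ = sum(pois(n, lam) * math.prod(binom(y, n, p) for y in counts)
              for n in range(n_max + 1))
    # Unoccupied site contributes only if all replicate counts are zero
    zero = 1.0 if all(y == 0 for y in counts) else 0.0
    return (1 - psi) * zero + psi * occ

lik = site_likelihood([2, 0, 1], psi=0.6, lam=3.0, p=0.4)
```

Maximizing the product of such site likelihoods over sites yields joint estimates of occupancy, abundance, and detection probability.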
Muralisankar, S; Manivannan, A; Balasubramaniam, P
2015-09-01
The aim of this manuscript is to investigate delay-probability-distribution-dependent stability, in the mean-square sense, of neutral-type stochastic neural networks with time delays. The time delays are assumed to be interval time-varying and randomly occurring. Based on a new Lyapunov-Krasovskii functional and a stochastic analysis approach, a novel sufficient condition is obtained in the form of a linear matrix inequality such that the delayed stochastic neural networks are globally robustly asymptotically stable in the mean-square sense for all admissible uncertainties. Finally, the derived theoretical results are validated through numerical examples in which maximum allowable upper bounds are calculated for different lower bounds of the time delay.
Institute of Scientific and Technical Information of China (English)
熊峻江; 武哲; 高镇同
2002-01-01
According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed. Then, on the basis of the optimization method of the correlation coefficient, the parameter estimation formulas were derived and the generalized fatigue constant life curve with reliability level p was given. From the P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived. Three sets of tests of LY11 CZ corresponding to different mean stresses were then carried out using the two-dimensional up-down method. Finally, the methods were used to analyze the test results, and it was found that results of high precision can be obtained.
Ossenkopf, Volker; Schneider, Nicola; Federrath, Christoph; Klessen, Ralf S
2016-01-01
Probability distribution functions (PDFs) of column densities are an established tool to characterize the evolutionary state of interstellar clouds. Using simulations, we show to what degree their determination is affected by noise, line-of-sight contamination, field selection, and the incomplete sampling in interferometric measurements. We solve the integrals that describe the convolution of a cloud PDF with contaminating sources and study the impact of missing information on the measured column density PDF. The effect of observational noise can be easily estimated and corrected for if the root mean square (rms) of the noise is known. For $\sigma_{noise}$ values below 40% of the typical cloud column density, $N_{peak}$, this involves almost no degradation of the accuracy of the PDF parameters. For higher noise levels and narrow cloud PDFs the width of the PDF becomes increasingly uncertain. A contamination by turbulent foreground or background clouds can be removed as a constant shield if the PDF of the c...
Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.
2012-01-01
1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities.
Application of probability distribution functions in the ASTM RBCA framework for use in California
International Nuclear Information System (INIS)
Currently, Environmental Protection Agency (EPA, 1989b) and other conventional methodologies of risk assessment, such as the American Society for Testing and Materials risk-based corrective action (ASTM/RBCA) format, make use of deterministic, or point, numbers in making estimates of risk. The goal of risk assessment is to provide a systematic tool to evaluate hazards and exposures to assist in the management of society's activities. To properly do this, there must be an attempt by the regulator or the responsible party to use information as effectively as possible. The use of historical data and probability distribution functions is a suggested initial approach to dealing with LUFT sites in California, taking into account geophysical, societal, and health-based parameters particular to the State. These parameters may be based on results of the CalLUFT HCA, from California Census information, or from other sources, where appropriate. Because of the limitations involved with the use of point values in the ASTM/RBCA format, probability distribution functions can be used to give regulatory personnel and risk managers more understanding of the actual range of risks involved. Such information will allow the risk manager a higher comfort level in dealing with risks, and will, by detailing the residual risks involved, allow for the potential consequences of decisions to be better known. The above methodology effectively allows the risk manager to choose a level of health risk appropriate for the site, allows for a general prioritizing in regards to other sites, and removes some of the restrictions in applying remedial action necessitated by MCLs or deterministic risk estimates.
Institute of Scientific and Technical Information of China (English)
Jie Fu; Huaguang Zhang; Tiedong Ma
2009-01-01
The delay-probability-distribution-dependent robust stability problem for a class of uncertain stochastic neural networks (SNNs) with time-varying delay is investigated. The information of the probability distribution of the time delay is considered and transformed into parameter matrices of the transferred SNN model. Based on the Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is obtained in the linear matrix inequality (LMI) format such that delayed SNNs are robustly globally asymptotically stable in the mean-square sense for all admissible uncertainties. An important feature of the results is that the stability conditions are dependent on the probability distribution of the delay and the upper bound of the delay derivative, which is allowed to be greater than or equal to 1. Finally, numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed method.
Size effect on strength and lifetime probability distributions of quasibrittle structures
Indian Academy of Sciences (India)
Zdeněk P Bažant; Jia-Liang Le
2012-02-01
Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for failure probability < $10^{-6}-10^{-7}$ per lifetime. The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation energy controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.
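For contrast, the classical weakest-link (Weibull) baseline that the refined theory departs from can be written in a few lines; the stress, modulus, and volume values below are illustrative, not taken from the paper:

```python
import math

def weibull_failure_prob(stress, s0, m, volume_ratio=1.0):
    """Classical weakest-link strength cdf with a volume size effect:
    P_f = 1 - exp(-(V/V0) * (sigma/s0)^m).
    s0: scale strength, m: Weibull modulus, volume_ratio: V/V0.
    Quasibrittle structures deviate from this form in the tail."""
    return 1.0 - math.exp(-volume_ratio * (stress / s0) ** m)

# A structure 100x larger fails far more often at the same stress:
# in the pure Weibull limit, mean strength scales as (V0/V)^(1/m).
p_small = weibull_failure_prob(200.0, s0=400.0, m=10)
p_large = weibull_failure_prob(200.0, s0=400.0, m=10, volume_ratio=100.0)
```

The paper's point is that for quasibrittle materials the cdf itself changes shape with size, so no single (s0, m) pair describes all structure sizes.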
Yakovenko, Victor M.
2012-01-01
This Chapter is written for the Festschrift celebrating the 70th birthday of the distinguished economist Duncan Foley from the New School for Social Research in New York. This Chapter reviews applications of statistical physics methods, such as the principle of entropy maximization, to the probability distributions of money, income, and global energy consumption per capita. The exponential probability distribution of wages, predicted by the statistical equilibrium theory of a labor market dev...
Yang, Chang; Su, Zhenpeng; Xiao, Fuliang; Zheng, Huinan; Wang, Yuming; Wang, Shui; Spence, H. E.; Reeves, G. D.; Baker, D. N.; Blake, J. B.; Funsten, H. O.
2016-08-01
Van Allen radiation belt electrons exhibit complex dynamics during geomagnetically active periods. Investigation of electron pitch angle distributions (PADs) can provide important information on the dominant physical mechanisms controlling radiation belt behaviors. Here we report a storm time radiation belt event where energetic electron PADs changed from butterfly distributions to normal or flattop distributions within several hours. Van Allen Probes observations showed that the flattening of butterfly PADs was closely related to the occurrence of whistler-mode chorus waves. Two-dimensional quasi-linear STEERB simulations demonstrate that the observed chorus can resonantly accelerate the near-equatorially trapped electrons and rapidly flatten the corresponding electron butterfly PADs. These results provide a new insight on how chorus waves affect the dynamic evolution of radiation belt electrons.
International Nuclear Information System (INIS)
In some of the recent probabilistic safety assessments, discrete probability distributions (DPDs) have been developed to express, in a quantitative form, estimates of the uncertainty and conservatism in the point estimate source term values. In the DPD approach, estimates of discrete factors, which are multipliers on the point estimate values, are made based on available data, calculations, and engineering judgment. Initial application of the DPD approach to source terms for risk analysis was based largely on engineering judgment after review of applicable data. However, in more recent applications of the DPD approach, results from an extensive review of existing experimental data and applied calculations have been factored into the estimates. Programs currently in progress, largely sponsored by NRC and EPRI, are beginning to yield significant new information upon which to base improved estimates for the magnitude of radionuclide source terms. The most extensive of the reviews of existing data for application to the DPD approach was that performed as part of the risk assessment for the proposed Sizewell B PWR. As part of the Seabrook risk study, DPD values specifically for that plant were developed based on the Sizewell approach. They represent a significant update to the Sizewell DPD values. In addition, DPD values were developed for associated release parameters which also affect the consequence calculations.
Probability distribution functions for ELM bursts in a series of JET tokamak discharges
Energy Technology Data Exchange (ETDEWEB)
Greenhough, J [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Chapman, S C [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Dendy, R O [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Ward, D J [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom)
2003-05-01
A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour.
Various Models for Pion Probability Distributions from Heavy-Ion Collisions
Mekjian, A Z; Strottman, D D
1998-01-01
Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparisons of the pion laser model with Bose-Einstein condensation in a laser trap and with the thermal model are made. The thermal model and hydrodynamic model are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength $\eta$ of a Poisson emitter and a critical density $\eta_c$ are connected in a thermal model by $\eta/\eta_c = e^{-m/T} < 1$, and this fact reduces any Bose-Einstein correction effects in the number and number fluctuation of pions. Fluctuations can...
A new probability distribution model of turbulent irradiance based on Born perturbation theory
Institute of Scientific and Technical Information of China (English)
[No author listed]
2010-01-01
The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which denies the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. So, it is considered that the new model can exactly reflect the Born perturbation theory. Simulated results prove the accuracy of this new model.
The probability distribution for non-Gaussianity estimators constructed from the CMB trispectrum
Smith, Tristan L
2012-01-01
Considerable recent attention has focused on the prospects to use the cosmic microwave background (CMB) trispectrum to probe the physics of the early universe. Here we evaluate the probability distribution function (PDF) for the standard estimator of the amplitude tau_nl of the CMB trispectrum, both for the null hypothesis (i.e., for Gaussian maps with tau_nl = 0) and for maps with a non-vanishing trispectrum (|tau_nl| > 0). We find these PDFs to be highly non-Gaussian in both cases. We also evaluate the variance with which the trispectrum amplitude can be measured as a function of its underlying value, tau_nl, and find a strong dependence of this variance on tau_nl. We also find that the variance does not, given the highly non-Gaussian nature of the PDF, effectively characterize the distribution. Detailed knowledge of these PDFs will therefore be imperative in order to properly interpret the implications of any given trispectrum measurement. For example, if a CMB experiment with a maximum multipole ...
Effect of slope angle of an artificial pool on distributions of turbulence
Institute of Scientific and Technical Information of China (English)
Atefeh Fazlollahi; Hossein Afzalimehr; Jueyi Sui
2015-01-01
Experiments were carried out over a two-dimensional pool with a constant length of 1.5 m and four different slopes. The distributions of velocity, Reynolds stress and turbulence intensities have been studied in this paper. Results show that as flow continues up the exit slope, the flow velocity increases near the channel bed and decreases near the water surface. Flow separation was not observed by ADV at the crest of the bed-form. In addition, the length of the separation zone increases with increasing entrance and exit slopes. The largest slope angle causes the maximum normalized shear stress. Based on the experiments, it is concluded that the shape of the Reynolds stress distribution is generally dependent on the entrance and exit slopes of the pool and is affected by both decelerating and accelerating flows. Additionally, with the increase in the slope angle, secondary currents develop and become more stable. Results of the quadrant analysis show that the momentum between flow and bed-form is mostly transferred by sweep and ejection events.
The Pierre Auger Collaboration
2014-01-01
We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between $60^\\circ$ and $80^\\circ$. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E > 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_1^\\alpha =(4.4 \\pm 1.0){\\times}10^{-2}$, that has a chance probability $P(\\ge r_1^\\alpha)=6.4{\\times}10^{-5}$, reinforcing the hint previously reported with vertical events alone.
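A first-harmonic Rayleigh analysis of the kind described reduces to a few standard formulae; the actual Auger analysis additionally applies event weights and exposure corrections, which are omitted in this sketch:

```python
import math
import random

def rayleigh_first_harmonic(angles_rad):
    """Standard first-harmonic Rayleigh analysis of a list of
    angles (e.g. right ascensions in radians). Returns the
    amplitude r1, the phase, and the chance probability of
    obtaining an amplitude >= r1 from an isotropic distribution."""
    n = len(angles_rad)
    a = sum(math.cos(t) for t in angles_rad)
    b = sum(math.sin(t) for t in angles_rad)
    r1 = 2.0 * math.hypot(a, b) / n           # first-harmonic amplitude
    phase = math.atan2(b, a)                  # phase of the modulation
    p_chance = math.exp(-n * r1 ** 2 / 4.0)   # P(>= r1 | isotropy)
    return r1, phase, p_chance

# Isotropic sample: the recovered amplitude should be small.
random.seed(7)
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(5000)]
r1, phase, p = rayleigh_first_harmonic(iso)
```

A small chance probability, as in the quoted P(≥ r1) ≈ 6.4 × 10^-5, indicates that the measured amplitude is unlikely under isotropy.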
Particle size distribution models of small angle neutron scattering pattern on ferro fluids
International Nuclear Information System (INIS)
The Fe3O4 ferro fluid samples were synthesized by a co-precipitation method. The investigation of ferro fluid microstructure is known to be one of the most important problems because the presence of aggregates and their internal structure greatly influence the properties of ferro fluids. The size and the size dispersion of particles in the ferro fluids were determined assuming a log-normal distribution of particle radius. The scattering patterns measured by small angle neutron scattering were fitted by the theoretical scattering functions of two limiting models: a log-normal sphere distribution and a fractal aggregate. Two types of particle are detected, which are presumably primary particles of 30 angstrom in radius and secondary fractal aggregates of 200 angstrom with polydispersity of 0.47 up to 0.53. (author)
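Sampling a log-normal number distribution of radii, as assumed in the fit, can be sketched as follows; the median radius and log-dispersion used here are only illustrative stand-ins for the fitted values:

```python
import math
import random

random.seed(0)

def lognormal_radii(r0, sigma, n):
    """Sample n particle radii from a log-normal distribution with
    median r0 (angstrom) and log-dispersion sigma: ln(r) is normal
    with mean ln(r0) and standard deviation sigma."""
    return [r0 * math.exp(random.gauss(0.0, sigma)) for _ in range(n)]

radii = lognormal_radii(r0=30.0, sigma=0.5, n=20000)
mean_r = sum(radii) / len(radii)

# Analytic check: for a log-normal, mean = r0 * exp(sigma^2 / 2)
expected_mean = 30.0 * math.exp(0.5 ** 2 / 2)
```

The fitted polydispersity reported in the abstract corresponds to the width sigma of this log-normal form.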
Loo, K E
1998-01-01
We will extend Nonrelativistic Quantum Mechanics as a theory in $L\sp2$ to a theory in the space of distributions. We will provide three major extensions of Nonrelativistic Quantum Mechanics. First, we will extend the concept of an integral kernel for the evolution operator to a distribution kernel for the $L\sp2$ transition probability amplitude. Second, we will extend the $L\sp2$ Schrodinger's equation to a distributional Schrodinger's equation. Lastly, we will rigorously prove that Feynman's original formulation of the real-time, time-sliced path integral is well defined when formulated on the $L\sp2$ transition probability amplitude.
Burkhart, Blakesley; Ossenkopf, V.; Lazarian, A.; Stutzki, J.
2013-07-01
We study the effects of radiative transfer on the probability distribution functions (PDFs) of simulations of magnetohydrodynamic turbulence in the widely studied 13CO 2-1 transition. We find that the integrated intensity maps generally follow a log-normal distribution, with the cases that have τ ≈ 1 best matching the PDF of the column density. We fit a two-dimensional variance-sonic Mach number relationship to our logarithmic PDFs of the form $\sigma_{\ln(\Sigma/\Sigma_0)}^2 = A \ln(1 + b^2 \mathcal{M}_s^2)$ and find that, for parameter b = 1/3, parameter A depends on the radiative transfer environment. We also explore the variance, skewness, and kurtosis of the linear PDFs finding that higher moments reflect both higher sonic Mach number and lower optical depth. Finally, we apply the Tsallis incremental PDF function and find that the fit parameters depend on both Mach numbers, but also are sensitive to the radiative transfer parameter space, with the τ ≈ 1 case best fitting the incremental PDF of the true column density. We conclude that, for PDFs of low optical depth cases, part of the gas is always subthermally excited so that the spread of the line intensities exceeds the spread of the underlying column densities and hence the PDFs do not reflect the true column density. Similarly, PDFs of optically thick cases are dominated by the velocity dispersion and therefore do not represent the true column density PDF. Thus, in the case of molecules like carbon monoxide, the dynamic range of intensities, structures observed, and, consequently, the observable PDFs are less determined by turbulence and more often determined by radiative transfer effects.
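The fitted variance-Mach relation is straightforward to evaluate. Here A = 0.11 is an arbitrary illustrative value, since the paper's point is precisely that A depends on the radiative transfer environment:

```python
import math

def column_density_variance(mach_s, A, b=1.0 / 3.0):
    """Variance of the logarithmic column density as a function of
    sonic Mach number, in the fitted form
        sigma^2_{ln(Sigma/Sigma_0)} = A * ln(1 + b^2 * M_s^2).
    b = 1/3 follows the fit quoted in the abstract; A is a free
    parameter tied to the radiative transfer environment."""
    return A * math.log(1.0 + b ** 2 * mach_s ** 2)

# Higher sonic Mach number implies a broader (log) column density PDF.
sigma2_low = column_density_variance(mach_s=2.0, A=0.11)
sigma2_high = column_density_variance(mach_s=7.0, A=0.11)
```

Inverting this relation is one common way to infer the sonic Mach number of a cloud from an observed column density PDF width.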
Multiple-streaming and the Probability Distribution of Density in Redshift Space
Hui, L; Shandarin, S F; Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
1999-01-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple-streaming using the Zel'dovich approximation (ZA), and compute the average number of streams in real and redshift-space. It is found that multiple-streaming can be significant in redshift-space but negligible in real-space, even at moderate values of the linear fluctuation amplitude ($\sigma < 1$). Moreover, unlike their real-space counterparts, redshift-space multiple-streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which operate even when the real-space density field is quite linear, could suppress the classic compression of redshift-structures predicted by linear theory (Kaiser 1987). We also compute using the ZA the probability distribution function (PDF) of density, as well as $S_3$, in real and redshift-space, and compare it with the PD...
Schneider, N; Csengeri, T; Klessen, R; Federrath, C; Tremblin, P; Girichidis, P; Bontemps, S; Andre, Ph
2014-01-01
Column density maps of molecular clouds are one of the most important observables in the context of molecular cloud- and star-formation (SF) studies. With Herschel it is now possible to reveal rather precisely the column density of dust, which is the best tracer of the bulk of material in molecular clouds. However, line-of-sight (LOS) contamination from fore- or background clouds can lead to an overestimation of the dust emission of molecular clouds, in particular for distant clouds. This implies too high values for column density and mass, and a misleading interpretation of probability distribution functions (PDFs) of the column density. In this paper, we demonstrate by using observations and simulations how LOS contamination affects the PDF. We apply a first-order approximation (removing a constant level) to the molecular clouds of Auriga and Maddalena (low-mass star-forming), and Carina and NGC 3603 (both high-mass SF regions). In perfect agreement with the simulations, we find that the PDFs become broader, ...
ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning
Sadeh, I.; Abdalla, F. B.; Lahav, O.
2016-10-01
We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.
Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw
2011-07-01
Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5Ēd, where Ēd is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
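The depth trend described above, strongly right-skewed, heavy-tailed fluctuations near the surface relaxing toward near-Gaussian behavior at ~10 m, can be mimicked with synthetic samples. The lognormal and Gaussian parameters below are illustrative choices, not values fitted to the measurements.

```python
import math
import random

random.seed(2)

def sample_moments(xs):
    """Return (skewness, excess kurtosis) of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum(((x - mean) / sd) ** 3 for x in xs) / n
    excess_kurt = sum(((x - mean) / sd) ** 4 for x in xs) / n - 3.0
    return skew, excess_kurt

# Near-surface: wave focusing produces lognormal-like irradiance fluctuations.
near_surface = [random.lognormvariate(0.0, 0.7) for _ in range(50000)]
# At ~10 m depth: fluctuations are weak and approximately Gaussian.
deep = [random.gauss(1.0, 0.05) for _ in range(50000)]

skew_surface, kurt_surface = sample_moments(near_surface)
skew_deep, kurt_deep = sample_moments(deep)
```

With these (illustrative) parameters, the near-surface sample shows the large positive skewness and excess kurtosis the abstract reports, while the deep sample has both close to zero.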
Institute of Scientific and Technical Information of China (English)
Ren-Jie He; Zhen-Yu Yang
2012-01-01
Differential evolution (DE) has become a very popular and effective global optimization algorithm in the area of evolutionary computation. In spite of many advantages such as conceptual simplicity, high efficiency, and ease of use, DE has two main components, i.e., the mutation scheme and parameter control, which significantly influence its performance. In this paper we intend to improve the performance of DE by using carefully considered strategies for both of these components. We first design an adaptive mutation scheme, which adaptively makes use of the bias of superior individuals when generating new solutions. Although introducing such a bias is not a new idea, existing methods often use heuristic rules to control the bias. They can hardly maintain the appropriate balance between exploration and exploitation during the search process, because the preferred bias is often problem- and evolution-stage-dependent. Instead of using any fixed rule, a novel strategy is adopted in the new adaptive mutation scheme to adjust the bias dynamically based on the local fitness landscape captured by the current population. As for the other component, i.e., parameter control, we propose a mechanism using the Lévy probability distribution to adaptively control the scale factor F of DE. For every mutation in each generation, an Fi is produced from one of four different Lévy distributions according to their historical performance. With the adaptive mutation scheme and parameter control using the Lévy distribution as the main components, we present a new DE variant called Lévy DE (LDE). Experimental studies were carried out on a broad range of benchmark functions in global numerical optimization. The results show that LDE is very competitive, and that both of the two main components contribute to its overall performance. The scalability of LDE is also discussed by conducting experiments on some selected benchmark functions with dimensions from 30 to 200.
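A minimal sketch of the two ingredients, DE/rand/1/bin mutation plus a heavy-tailed scale factor, is given below. It draws Lévy-like variates with Mantegna's algorithm (a common stand-in for a true stable sampler) and minimizes a sphere test function; the population size, clipping of F, and all parameter values are illustrative assumptions, not the LDE implementation from the paper.

```python
import math
import random

random.seed(3)

def levy_step(alpha=1.5):
    """Heavy-tailed variate via Mantegna's algorithm (approximate Levy)."""
    sigma_u = (math.gamma(1 + alpha) * math.sin(math.pi * alpha / 2)
               / (math.gamma((1 + alpha) / 2) * alpha * 2 ** ((alpha - 1) / 2))
               ) ** (1 / alpha)
    return random.gauss(0.0, sigma_u) / abs(random.gauss(0.0, 1.0)) ** (1 / alpha)

def sphere(x):
    return sum(xi * xi for xi in x)

dim, pop_size, CR = 10, 30, 0.9
pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]

for generation in range(200):
    for i in range(pop_size):
        r1, r2, r3 = random.sample([j for j in range(pop_size) if j != i], 3)
        # Scale factor from a heavy-tailed (Levy-like) distribution, clipped
        # to a sane range -- an illustrative choice, not the paper's rule.
        F = min(abs(0.5 * levy_step()), 2.0)
        mutant = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(dim)]
        # Binomial crossover, then greedy selection.
        jrand = random.randrange(dim)
        trial = [mutant[d] if (random.random() < CR or d == jrand) else pop[i][d]
                 for d in range(dim)]
        if sphere(trial) <= sphere(pop[i]):
            pop[i] = trial

best = min(sphere(ind) for ind in pop)
```

Occasional large Lévy-drawn values of F act as restarts for exploration, while the bulk of small values drives exploitation, which is the balance the abstract's parameter-control mechanism targets.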
Boots, Nam Kyoo; Shahabuddin, Perwez
2001-01-01
This paper deals with estimating small tail probabilities of thesteady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into th
Directory of Open Access Journals (Sweden)
Jinhua Xu
Full Text Available Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.
Characterisation of seasonal flood types according to timescales in mixed probability distributions
Fischer, Svenja; Schumann, Andreas; Schulte, Markus
2016-08-01
When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks that differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that differentiate between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that information about parallel events based on their maximum values only is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of the statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
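The basic idea of combining per-type distributions into one annual-maximum distribution can be sketched as follows: if the type-specific maxima are assumed independent, the annual maximum is below x only if every type's maximum is, so the CDFs multiply. The Gumbel form and its parameters below are illustrative, not the paper's fitted mixing model (which additionally amends the parameters for overlaid events).

```python
import math

def gumbel_cdf(x, mu, beta):
    """CDF of a Gumbel (EV1) distribution, a common flood-frequency model."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Illustrative parameters for two flood types (e.g. short- and
# long-timescale summer events), discharge in m^3/s.
TYPES = [(120.0, 40.0), (200.0, 60.0)]

def annual_max_cdf(x):
    """P(annual maximum <= x) assuming the flood types are independent."""
    p = 1.0
    for mu, beta in TYPES:
        p *= gumbel_cdf(x, mu, beta)
    return p

# The mixed model's upper tail is dominated by the heavier-tailed type:
q99 = next(x for x in range(0, 5000) if annual_max_cdf(float(x)) >= 0.99)
```

Because each factor is at most one, the mixed CDF lies below every single-type CDF, i.e. the mixture always assigns larger exceedance probabilities than any one type alone.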
International Nuclear Information System (INIS)
For Anderson localization on the Cayley tree, we study the statistics of various observables as a function of the disorder strength W and the number N of generations. We first consider the Landauer transmission T_N. In the localized phase, its logarithm follows the traveling wave form ln T_N ≃ (ln T_N)-bar + ln t*, where (i) the disorder-averaged value moves linearly, (ln T_N)-bar ≃ -N/ξ_loc, and the localization length diverges as ξ_loc ∼ (W - W_c)^{-ν_loc} with ν_loc = 1, and (ii) t* is a fixed random variable with a power-law tail P*(t*) ∼ 1/(t*)^{1+β(W)} for large t* with 0 < β(W) < 1, so that the moments of T_N are governed by rare events. In the delocalized phase, the transmission T_N remains a finite random variable as N → ∞, and we measure near criticality the essential singularity (ln T_∞)-bar ∼ -|W_c - W|^{-κ_T} with κ_T ∼ 0.25. We then consider the statistical properties of normalized eigenstates Σ_x |ψ(x)|² = 1, in particular the entropy S = -Σ_x |ψ(x)|² ln |ψ(x)|² and the inverse participation ratios (IPR) I_q = Σ_x |ψ(x)|^{2q}. In the localized phase, the typical entropy diverges as S_typ ∼ (W - W_c)^{-ν_S} with ν_S ≃ 1.5, whereas it grows linearly as S_typ(N) ∼ N in the delocalized phase. Finally, for the IPR, we explain how closely related variables propagate as traveling waves in the delocalized phase. In conclusion, both the localized phase and the delocalized phase are characterized by the traveling wave propagation of some probability distributions, and the Anderson localization/delocalization transition then corresponds to a traveling/non-traveling critical point. Moreover, our results point toward the existence of several length scales that diverge with different exponents ν at criticality
Schneider, N.; Bontemps, S.; Motte, F.; Ossenkopf, V.; Klessen, R. S.; Simon, R.; Fechtenbaum, S.; Herpin, F.; Tremblin, P.; Csengeri, T.; Myers, P. C.; Hill, T.; Cunningham, M.; Federrath, C.
2016-03-01
The probability distribution function of column density (N-PDF) serves as a powerful tool to characterise the various physical processes that influence the structure of molecular clouds. Studies that use extinction maps or H2 column-density maps (N) that are derived from dust show that star-forming clouds can best be characterised by lognormal PDFs for the lower N range and a power-law tail for higher N, which is commonly attributed to turbulence and self-gravity and/or pressure, respectively. While PDFs from dust cover a large dynamic range (typically N ~ 1020-24 cm-2 or Av~ 0.1-1000), PDFs obtained from molecular lines - converted into H2 column density - potentially trace more selectively different regimes of (column) densities and temperatures. They also enable us to distinguish different clouds along the line of sight through using the velocity information. We report here on PDFs that were obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region, and make a comparison to a PDF that was derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av ~ 1-30, but is cut for higher Av because of optical depth effects. The PDFs of C18O and 13CO are mostly lognormal up to Av ~ 1-15, followed by excess up to Av ~ 40. Above that value, all CO PDFs drop, which is most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av ~ 15 and 400, respectively. The PDF from dust is lognormal for Av ~ 3-15 and has a power-law tail up to Av ~ 500. Absolute values for the molecular line column densities are, however, rather uncertain because of abundance and excitation temperature variations. If we take the dust PDF at face value, we "calibrate" the molecular line PDF of CS to that of the dust and determine an abundance [CS]/[H2] of 10-9. The slopes of the power-law tails of the CS, N2H+, and dust PDFs are -1.6, -1.4, and -2.3, respectively, and are thus consistent
Souza, V. M. C. E. S.; Vieira, L.; Alves, L. R.; Da Silva, L. A.; Koga, D.; Sibeck, D. G.; Walsh, B.; Kanekal, S. G.; Silveira, M. D.; Medeiros, C.; Mendes, O., Jr.; Marchezi, J.; Rockenbach, M.; Jauer, P. R.; Gonzalez, W.; Baker, D. N.
2015-12-01
A myriad of physical phenomena occur in the inner magnetosphere, in particular at the Earth's radiation belts, and they can result from a combination of both internal and external processes. However, the connection between physical processes occurring deep within the magnetosphere and external interplanetary drivers is not yet well understood. In this work we investigate whether a selected set of interplanetary structures affects the local time distribution of three different classes of high-energy electron pitch angle distributions (PADs), namely normal, isotropic, and butterfly. We split this work into two parts: initially we focus on the methodology, which employs a Self-Organized Feature Map (SOFM) neural network to identify different classes of electron PAD shapes in the Van Allen Probes' Relativistic Electron Proton Telescope (REPT) data. The algorithm can categorize the input data into an arbitrary number of classes, of which three appear most often: normal, isotropic, and butterfly. Other classes related to these three also emerge and deserve to be addressed in detail in future work. We also discuss the uncertainties of the algorithm. In the second part we describe in detail the criteria used for selecting the interplanetary events, and investigate the relation between key parameters characterizing such interplanetary structures and the local time distributions of electron PAD shapes.
Staniford-Chen, Stuart
1992-01-01
For a system near a second order phase transition, the probability distribution for the order parameter can be given a finite size scaling form. This fact is used to compare the finite temperature phase transition for the Wilson lines in d=3+1 SU(2) lattice gauge theory with the phase transition in d=3 phi^4 field theory. I exhibit the finite size scaled probability distributions in the form of a function of two variables (the reduced `temperature' and the magnetization) for both models. The ...
Virrueta, Alejandro; O'Hern, Corey; Regan, Lynne
Methionine (Met) is a versatile amino acid found frequently both in protein cores and at protein-protein interfaces. A complete description of the structure of Met is therefore essential for a fundamental understanding of protein structure and design. In previous work, we showed that our hard-sphere dipeptide model is able to recapitulate the side chain dihedral angle distributions observed in high-resolution protein crystal structures for the 8 amino acids we have studied to date: Val, Thr, Ser, Leu, Ile, Cys, Tyr, and Phe. Using the same approach, we can predict the observed Met side chain dihedral angle distributions P(χ1) and P(χ2), but not P(χ3). In this manuscript, we investigate the possible origins of the discrepancy and identify the minimal additions to the hard-sphere dipeptide model necessary to quantitatively predict P(χ3) of Met. We find that applying a Lennard-Jones potential with weak attraction between hydrogen atoms is sufficient to achieve predictions that match the observed χ3 side chain dihedral angle probability distributions for Met, Nle, and Mse without negatively affecting our results for the 8 previously studied amino acids. A. V. is supported by an NSF Graduate Research Fellowship and a Ford Foundation Fellowship.
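The interaction added to the hard-sphere model can be sketched as a standard 12-6 Lennard-Jones potential. The ε and σ values below are generic placeholders, not the hydrogen-hydrogen parameters calibrated in the study; a small ε encodes the "weak attraction" referred to above.

```python
def lennard_jones(r, epsilon=0.01, sigma=2.4):
    """12-6 Lennard-Jones potential: steep repulsion plus weak attraction.

    epsilon (well depth) and sigma (contact distance) are illustrative
    placeholders, not fitted hydrogen-hydrogen parameters.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

# The energy minimum sits at r = 2^(1/6) * sigma, with depth -epsilon;
# inside sigma the potential is repulsive, as in the hard-sphere model.
r_min = 2 ** (1.0 / 6.0) * 2.4
```

The design point is that the hard-sphere repulsion is kept (the r^-12 wall) while a shallow attractive well near contact is enough to shift the predicted χ3 distribution.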
Bel, J.; Branchini, E.; Di Porto, C.; Cucciati, O.; Granett, B. R.; Iovino, A.; de la Torre, S.; Marinoni, C.; Guzzo, L.; Moscardini, L.; Cappi, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bolzonella, M.; Bottini, D.; Coupon, J.; Davidzon, I.; De Lucia, G.; Fritz, A.; Franzetti, P.; Fumana, M.; Garilli, B.; Ilbert, O.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Scodeggio, M.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Burden, A.; Marchetti, A.; Mellier, Y.; Nichol, R. C.; Peacock, J. A.; Percival, W. J.; Phleps, S.; Wolk, M.
2016-04-01
We compare three methods to measure the count-in-cell probability density function of galaxies in a spectroscopic redshift survey. From this comparison we found that, when the sampling is low (the average number of object per cell is around unity), it is necessary to use a parametric method to model the galaxy distribution. We used a set of mock catalogues of VIPERS to verify if we were able to reconstruct the cell-count probability distribution once the observational strategy is applied. We find that, in the simulated catalogues, the probability distribution of galaxies is better represented by a Gamma expansion than a skewed log-normal distribution. Finally, we correct the cell-count probability distribution function from the angular selection effect of the VIMOS instrument and study the redshift and absolute magnitude dependency of the underlying galaxy density function in VIPERS from redshift 0.5 to 1.1. We found a very weak evolution of the probability density distribution function and that it is well approximated by a Gamma distribution, independently of the chosen tracers. Based on observations collected at the European Southern Observatory, Cerro Paranal, Chile, using the Very Large Telescope under programmes 182.A-0886 and partly 070.A-9007. Also based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at TERAPIX and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS. The VIPERS web site is http://www.vipers.inaf.it/
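A quick way to see why a Gamma form is convenient for counts-in-cells is a method-of-moments fit: for a Gamma distribution, mean = kθ and variance = kθ², so k = mean²/variance and θ = variance/mean. The sample below is synthetic, not VIPERS data, and the parameter values are illustrative.

```python
import random
import statistics

random.seed(6)

# Synthetic "counts-in-cells" densities drawn from a Gamma distribution
# with shape k = 2.0 and scale theta = 1.5 (illustrative values).
k_true, theta_true = 2.0, 1.5
cells = [random.gammavariate(k_true, theta_true) for _ in range(100000)]

# Method-of-moments fit:
#   mean = k * theta,  variance = k * theta^2
#   =>  k = mean^2 / variance,  theta = variance / mean
mean = statistics.fmean(cells)
var = statistics.variance(cells)
k_fit = mean * mean / var
theta_fit = var / mean
```

In practice a parametric fit like this is exactly what the abstract argues for at low sampling, where an empirical histogram of cell counts is too noisy to use directly.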
Kim, Gichul; HwangBo, Pil-Neo
2016-03-01
[Purpose] The purpose of this study was to compare the effect of Schroth and Pilates exercises on the Cobb angle and body weight distribution of patients with idiopathic scoliosis. [Subjects] Twenty-four scoliosis patients with a Cobb angle of ≥20° were divided into the Schroth exercise group (SEG, n = 12) and the Pilates exercise group (PEG, n = 12). [Methods] The SEG and PEG performed Schroth and Pilates exercises, respectively, three times a week for 12 weeks. The Cobb angle was measured in the standing position with a radiography apparatus, and weight load was measured with Gait View Pro 1.0. [Results] In the intragroup comparison, both groups showed significant changes in the Cobb angle. For weight distribution, the SEG showed significant differences in the total weight between the concave and convex sides, but the PEG did not show significant differences. Furthermore, in the intragroup comparison, the SEG showed significant differences in the changes in the Cobb angle and weight distribution compared with the PEG. [Conclusion] Both Schroth and Pilates exercises were effective in changing the Cobb angle and weight distribution of scoliosis patients; however, the intergroup comparison showed that the Schroth exercise was more effective than the Pilates exercise. PMID:27134403
Institute of Scientific and Technical Information of China (English)
ZHANG Yi-Xin; CANG Ji
2009-01-01
Effects of atmospheric turbulence tilt, defocus, astigmatism and coma aberrations on the orbital angular momentum measurement probability of photons propagating in the weak turbulence regime are modeled with the Rytov approximation. By considering the resulting wave as a superposition of angular momentum eigenstates, the orbital angular momentum measurement probabilities of the transmitted digit are presented. Our results show that the effect of turbulent tilt aberration on the orbital angular momentum measurement probabilities of photons is the largest among these four kinds of aberrations. As the aberration order increases, the effects of turbulence aberrations on the measurement probabilities of orbital angular momentum generally decrease, whereas the effect of turbulence defocus can be ignored. For tilt aberration, as the difference between the measured orbital angular momentum and the original orbital angular momentum increases, the orbital angular momentum measurement probability decreases.
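The redistribution of orbital angular momentum by a phase aberration can be sketched numerically: expand exp(iφ(θ)) in azimuthal harmonics and read off |c_n|² as the probability of measuring OAM shifted by n. The tilt-like phase φ(θ) = a·cos(θ) and its amplitude are illustrative assumptions, not the Rytov-based turbulence model of the paper.

```python
import cmath
import math

def oam_probabilities(phase_fn, l_range, n_theta=4096):
    """P(Delta l): probability that a beam launched with OAM l0 is measured
    with OAM l0 + Delta l after acquiring the azimuthal phase phase_fn(theta).
    Computed as |c_n|^2, with c_n the Fourier coefficients of exp(i*phase)."""
    probs = {}
    for n in l_range:
        c = 0.0 + 0.0j
        for k in range(n_theta):
            theta = 2.0 * math.pi * k / n_theta
            c += cmath.exp(1j * phase_fn(theta)) * cmath.exp(-1j * n * theta)
        probs[n] = abs(c / n_theta) ** 2
    return probs

# Tilt-like aberration with an illustrative amplitude of a = 1.0 rad.
a = 1.0
probs = oam_probabilities(lambda th: a * math.cos(th), range(-10, 11))
total = sum(probs.values())
```

The probabilities sum to one (unitarity of the phase screen), and the detection probability falls off as the measured OAM moves away from the launched value, which is the qualitative behavior the abstract reports for tilt.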
Turbulence-induced relative velocity of dust particles. III. The probability distribution
Energy Technology Data Exchange (ETDEWEB)
Pan, Liubin [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Padoan, Paolo [ICREA and ICC, University of Barcelona, Marti i Franquès 1, E-08028 Barcelona (Spain); Scalo, John, E-mail: lpan@cfa.harvard.edu, E-mail: ppadoan@icc.ub.edu, E-mail: parrot@astro.as.utexas.edu [Department of Astronomy, University of Texas, Austin, TX 78712 (United States)
2014-09-01
Motivated by its important role in the collisional growth of dust particles in protoplanetary disks, we investigate the probability distribution function (PDF) of the relative velocity of inertial particles suspended in turbulent flows. Using the simulation from our previous work, we compute the relative velocity PDF as a function of the friction timescales, τ_p1 and τ_p2, of two particles of arbitrary sizes. The friction time of the particles included in the simulation ranges from 0.1 τ_η to 54 T_L, where τ_η and T_L are the Kolmogorov time and the Lagrangian correlation time of the flow, respectively. The relative velocity PDF is generically non-Gaussian, exhibiting fat tails. For a fixed value of τ_p1, the PDF shape is the fattest for equal-size particles (τ_p2 = τ_p1), and becomes thinner at both τ_p2 < τ_p1 and τ_p2 > τ_p1. Defining f as the friction time ratio of the smaller particle to the larger one, we find that, at a given f in 1/2 ≲ f ≲ 1, the PDF fatness first increases with the friction time τ_p,h of the larger particle, peaks at τ_p,h ≅ τ_η, and then decreases as τ_p,h increases further. For 0 ≤ f ≲ 1/4, the PDF becomes continuously thinner with increasing τ_p,h. The PDF is nearly Gaussian only if τ_p,h is sufficiently large (≫ T_L). These features are successfully explained by the Pan and Padoan model. Using our simulation data and some simplifying assumptions, we estimated the fractions of collisions resulting in sticking, bouncing, and fragmentation as a function of the dust size in protoplanetary disks, and argued that accounting for non-Gaussianity of the collision velocity may help further alleviate the bouncing barrier problem.
Williams, Michael S; Ebel, Eric D
2014-11-18
The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the
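The core of a weighted bootstrap for unequal-probability samples can be sketched in a few lines: resample the observed values with replacement, with resampling weights proportional to the inverse of each observation's inclusion probability. The two-stratum population, selection probabilities, and values below are illustrative, not the paper's Most Probable Number data.

```python
import random
import statistics

random.seed(8)

# Illustrative risk-based sample: 90% of the population are "clean" lots
# (low contamination) and 10% "suspect" lots (high contamination), but the
# sample is 50/50 because suspect lots were sampled 9x as often.
values = ([random.gauss(1.0, 0.1) for _ in range(50)] +
          [random.gauss(10.0, 0.1) for _ in range(50)])
inclusion_prob = [1.0] * 50 + [9.0] * 50   # relative selection probabilities
weights = [1.0 / p for p in inclusion_prob]

true_mean = 0.9 * 1.0 + 0.1 * 10.0         # population mean = 1.9
naive_mean = statistics.fmean(values)       # ignores the sampling design

def weighted_bootstrap_means(values, weights, n_boot=2000):
    """Bootstrap distribution of the mean under inverse-probability weights."""
    n = len(values)
    return [statistics.fmean(random.choices(values, weights=weights, k=n))
            for _ in range(n_boot)]

boot = weighted_bootstrap_means(values, weights)
weighted_mean = statistics.fmean(boot)
```

The naive mean is badly biased toward the preferentially sampled stratum, while the inverse-probability weights recover an estimate near the population mean; the bootstrap replicates additionally give its sampling distribution.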
Directory of Open Access Journals (Sweden)
Jindal Shveta
2010-01-01
Full Text Available Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfields regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted kappa) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119-0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific (borderline results included as test positives) criteria. The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was a poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs.
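The likelihood ratios quoted above follow directly from sensitivity and specificity: LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. The helper below reproduces the figures from the reported proportions (the GPS "most specific" and MRA "least specific" pairs) as a sanity check; it uses the rounded percentages, not the raw counts.

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# GPS, "most specific" criterion: sensitivity 81.63%, specificity 73.47%.
gps_lr_pos, gps_lr_neg = likelihood_ratios(0.8163, 0.7347)

# MRA, "least specific" criterion: sensitivity 57.14%, specificity 98%.
mra_lr_pos, mra_lr_neg = likelihood_ratios(0.5714, 0.98)
```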
International Nuclear Information System (INIS)
Differential cross-sections of excitation and decay of the 7Li*(7.45 MeV) resonance into the 6Li + n channel in the three-particle reaction 7Li(alpha, alpha 6Li)n have been determined at an alpha-particle energy of 27.2 MeV in kinematically complete and incomplete experiments. The use of a position-sensitive detector made it possible to obtain data on the spatial distributions of decay events over the full range of possible angles and to determine the total probability of this process, whose value differs essentially from the data for binary reactions. This result agrees with that obtained previously [1] and confirms the theoretical calculations [2] of the decay branching ratio for short-lived near-threshold resonances in three-particle reactions
Yadav, C.; Thomas, R. G.; Mohanty, A. K.; Kapoor, S. S.
2015-07-01
The presence of various fissionlike reactions in heavy-ion induced reactions is a major hurdle on the path to laboratory synthesis of heavy and super-heavy nuclei. It is known that the cross section for forming a heavy evaporation residue in fusion reactions depends on three factors: the capture cross section, the probability of compound nucleus formation PCN, and the survival probability of the compound nucleus against fission. Because the probability of compound nucleus formation, PCN, is difficult to estimate theoretically owing to its complex dependence on several parameters, attempts have been made in the past to deduce it from fission fragment anisotropy data. In the present work, the fragment anisotropy data for a number of heavy-ion reactions are analyzed, and it is found that deducing PCN from the anisotropy data also requires knowledge of the ratio of the relaxation time of the K degree of freedom to the pre-equilibrium fission time.
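The three-factor structure of the evaporation-residue cross section can be written down directly; all numbers below are placeholders for illustration, not values measured or deduced in the paper.

```python
def evaporation_residue_cross_section(sigma_capture, p_cn, w_survival):
    """sigma_ER = sigma_capture * P_CN * W_sur: a heavy evaporation residue
    forms only if the colliding system is captured, then evolves into a
    compound nucleus, then survives fission during de-excitation."""
    return sigma_capture * p_cn * w_survival

# Placeholder values: 10 mb capture cross section, 10% compound-nucleus
# formation probability, 1e-4 survival probability against fission.
sigma_er = evaporation_residue_cross_section(10.0, 0.1, 1e-4)
```

The product structure is why an uncertain P_CN propagates directly into the predicted residue yield, motivating the attempts to pin it down from anisotropy data.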
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
Approximating Probability Levels for Testing Null Hypotheses with Noncentral F Distributions.
Fowler, Robert L.
1984-01-01
This study compared two approximations for normalizing noncentral F distributions: one based on the square root of the chi-square distribution (SRA), the other derived from a cube root of the chi-square distribution (CRA). The CRA was superior, and generally provided an excellent approximation for noncentral F. (Author/BW)
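The cube-root idea can be illustrated with the Wilson-Hilferty-style normalization of the noncentral chi-square, a standard building block of such noncentral F approximations. This is a sketch under that assumption, not Fowler's exact formula:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ncx2_cdf_cra(x, f, lam):
    """Cube-root (Wilson-Hilferty-style) normal approximation to the
    CDF of a noncentral chi-square with f dof and noncentrality lam."""
    h = f + lam              # mean of the distribution
    p = f + 2.0 * lam
    mu = 1.0 - 2.0 * p / (9.0 * h * h)
    sigma = math.sqrt(2.0 * p / (9.0 * h * h))
    return norm_cdf(((x / h) ** (1.0 / 3.0) - mu) / sigma)
```

For the central case (lam = 0) this reduces to the classical Wilson-Hilferty approximation, which is accurate to a few parts in ten thousand at moderate degrees of freedom.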
González-García, M Concepción; Smirnov, Yu A
2001-01-01
We have performed a detailed study of the zenith-angle dependence of the regeneration factor and distributions of events at SNO and SK for different solutions of the solar neutrino problem. In particular, we discuss the oscillatory behaviour and the synchronization effect in the distribution for the LMA solution, the parametric peak for the LOW solution, etc. A physical interpretation of the effects is given. We suggest a new binning of events which emphasizes the distinctive features of the zenith-angle distributions for the different solutions. We also find correlations between the integrated day-night asymmetry and the rates of events in different zenith-angle bins. Study of these correlations strengthens the identification power of the analysis.
Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A
2016-04-01
A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected, or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous words compared to unambiguous words, even when prior and contextual information biases one interpretation only. These results suggest a probability distribution whereby all outcomes and their associated probabilities of occurrence, even if very low, are represented and maintained. PMID:25523107
Directory of Open Access Journals (Sweden)
Fang Zheng
2013-04-01
Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using Fisher's linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area under the receiver operating characteristic curve (Az) of 0.9096, which were superior to the results obtained by either Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrate the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for the analysis of knee joint VAG signals.
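The kernel density estimate combined with the maximal posterior decision rule can be sketched as follows. This is a minimal 1-D illustration with invented feature values, not the paper's bivariate VAG features:

```python
import math

def gauss_kde(sample, h):
    """Return a Gaussian kernel density estimate built from `sample`
    with bandwidth h."""
    n = len(sample)
    c = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def pdf(x):
        return c * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample)
    return pdf

def map_classify(x, pdfs, priors):
    """Maximal posterior probability decision: pick the class k that
    maximizes prior_k * pdf_k(x)."""
    return max(range(len(pdfs)), key=lambda k: priors[k] * pdfs[k](x))

# Hypothetical 1-D feature values for a healthy and a pathological group
healthy  = [0.9, 1.0, 1.1, 1.2, 0.8]
abnormal = [2.0, 2.2, 1.9, 2.3, 2.1]
pdfs = [gauss_kde(healthy, 0.3), gauss_kde(abnormal, 0.3)]
label = map_classify(1.05, pdfs, priors=[0.5, 0.5])  # → 0 (healthy)
```

With equal priors the rule reduces to picking the class whose estimated density is largest at the test point.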
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
High-energy spectrum and zenith-angle distribution of atmospheric neutrinos
Sinegovsky, S I; Sinegovskaya, T S
2011-01-01
High-energy neutrinos, arising from decays of mesons produced through collisions of cosmic-ray particles with air nuclei, form the background in the astrophysical neutrino detection problem. An ambiguity in the high-energy behavior of pion and especially kaon production cross sections for nucleon-nucleus collisions may essentially affect the calculated neutrino flux. We present results of a calculation of the energy spectrum and zenith-angle distribution of muon and electron atmospheric neutrinos in the energy range 10 GeV to 10 PeV. The calculation was performed using known hadronic models (QGSJET-II-03, SIBYLL 2.1, Kimel & Mokhov) for two parametrizations of the primary spectrum, by Gaisser & Honda and by Zatsepin & Sokolskaya. Comparison of the calculated muon neutrino spectrum with the IceCube40 experiment data makes it clear that even at energies above 100 TeV the prompt neutrino contribution is not so apparent because of tangled uncertainties of the strange (kaons) and charm...
Counterion Distribution Around Protein-SNAs probed by Small-angle X-ray scattering
Krishnamoorthy, Kurinji; Bedzyk, Michael; Kewalramani, Sumit; Moreau, Liane; Mirkin, Chad
Protein-DNA conjugates couple the advanced cell transfection capabilities of the spherical DNA architecture and the biocompatible enzymatic activity of a protein core to potentially create therapeutic agents with dual functionality. An understanding of their stabilizing ionic environment is crucial to better understand and predict their properties. Here, we use small-angle X-ray scattering (SAXS) techniques to decipher the structure of the counterion cloud surrounding these DNA-coated nanoparticles. Through the use of anomalous scattering techniques we have mapped the local concentrations of Rb+ ions in the region around the Protein-DNA constructs. These results are further corroborated with simulations using a geometric model for the excess charge density as a function of radial distance from the protein core. Further, we investigate the influence of solution ionic strength on the structure of the DNA corona and demonstrate a reduction in the extension of the DNA corona with increasing concentration of NaCl in solution, for both single- and double-stranded DNA shells. Our work reveals the distribution of counterions in the vicinity of Protein-DNA conjugates and decouples the effect of solution ionic strength on the thickness of the DNA layer.
Huang, Chao-Chi; Chiu, Yang-Hung; Wen, Chih-Yu
2014-01-01
In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered as a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control method focuses on reducing the rate of group member change and avoiding the unnecessary information exchange. Two major phases are sequentially applied to choose the group members of each vehicle using hybrid angle/distance information. The operation of Phase I is based on the concept of the cone-based method, which can select the desired vehicles quickly. Afterwards, the proposed time-slot method is further applied to stabilize the network topology. Given the network structure in Phase I, a routing scheme is presented in Phase II. The network behaviors are explored through simulation and analysis in a variety of scenarios. The results show that the proposed mechanism is a scalable and effective control framework for VSNs. PMID:25350506
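The cone-based selection of Phase I can be sketched geometrically: partition the plane around a vehicle into angular cones and keep the nearest neighbor in each. This is a hypothetical minimal version using only the hybrid angle/distance criterion; the paper's full scheme also involves time slots and load balancing:

```python
import math

def cone_based_neighbors(node, others, alpha):
    """Cone-based selection sketch: split the plane around `node` into
    cones of angular width alpha and keep the nearest vehicle in each
    cone (hybrid angle/distance criterion)."""
    n_cones = int(round(2 * math.pi / alpha))
    best = {}
    for (x, y) in others:
        dx, dy = x - node[0], y - node[1]
        d = math.hypot(dx, dy)
        cone = int((math.atan2(dy, dx) % (2 * math.pi)) / alpha) % n_cones
        if cone not in best or d < best[cone][0]:
            best[cone] = (d, (x, y))
    return [p for _, p in best.values()]

# Toy usage: node at the origin, 90-degree cones
neighbors = cone_based_neighbors((0.0, 0.0),
                                 [(1, 0), (2, 0), (0, 1), (-1, 0)],
                                 math.pi / 2)
```

Keeping one nearby member per cone bounds the node degree while preserving connectivity in every direction, which is the intuition behind cone-based topology control.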
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
A two-dimensional model was established in rectangular coordinates to study the thermal stress in sapphire single crystals grown by the improved Kyropoulos method. In the simulation, the distribution and the maximum and minimum values of the thermal stress were calculated. In addition, a relationship between the thermal stress and the shouldering angle was obtained: for lower shouldering angles, the maximum thermal stress is lower and the minimum is higher. This indicates that the distribution of the thermal stress can be improved by decreasing the shouldering angle of the crystal during the growth process. To evaluate the model, an experiment was carried out, and the results are in good agreement with the calculation.
Taylor, J F; Abbitt, B.; Walter, J P; Davis, S. K.; Jaques, J. T.; Ochoa, R. F.
1993-01-01
β-Mannosidosis is a lethal lysosomal storage disease inherited as an autosomal recessive in man, cattle and goats. Laboratory assay data of plasma β-mannosidase activity represent a mixture of homozygous normal and carrier genotype distributions in a proportion determined by genotype frequency. A maximum likelihood approach employing data transformations for each genotype distribution and assuming a diallelic model of inheritance is described. Estimates of the transformation and genotype dist...
Seery, David; Hidalgo, J. Carlos
2006-01-01
We show how to obtain the probability density function for the amplitude of the curvature perturbation, R, produced during an epoch of slow-roll, single-field inflation, working directly from n-point correlation functions of R. These n-point functions are the usual output of quantum field theory calculations, and as a result we bypass approximate statistical arguments based on the central limit theorem. Our method can be extended to deal with arbitrary forms of non-Gaussianity, appearing at a...
A Probability Distribution of Surface Elevation for Wind Waves in Terms of the Gram-Charlier Series
Institute of Scientific and Technical Information of China (English)
黄传江; 戴德君; 王伟; 钱成春
2003-01-01
Laboratory experiments are conducted to study the probability distribution of surface elevation for wind waves, and the convergence of the Gram-Charlier series in describing the surface elevation distribution is discussed. Results show that the agreement between the Gram-Charlier series and the observed distribution becomes better and better as the truncation order of the series increases over a certain range, which is contrary to the phenomenon observed by Huang and Long (1980). It is also shown that the Gram-Charlier series is sensitive to anomalies in the data set, which will worsen the agreement if they are not preprocessed appropriately. Negative values of the probability distribution expressed by the Gram-Charlier series in some ranges of surface elevations are discussed, but the absolute values of these negative values, as well as the ranges of their occurrence, become smaller gradually as more terms are included. Therefore the negative values have no evident effect on the form of the whole surface elevation distribution when the series is truncated at higher orders. Furthermore, a simple recurrence formula is obtained to calculate the coefficients of the Gram-Charlier series in order to extend the series to high orders conveniently.
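The series is built on probabilists' Hermite polynomials, which satisfy the recurrence He_{n+1}(x) = x He_n(x) - n He_{n-1}(x). A sketch of the standard Type A form truncated at fourth order (not the paper's high-order coefficient recurrence) also shows the negative-density effect discussed above:

```python
import math

def hermite_he(n, x):
    """Probabilists' Hermite polynomial He_n(x) via the recurrence
    He_{n+1} = x*He_n - n*He_{n-1}."""
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def gram_charlier_pdf(x, skew, exkurt):
    """Gram-Charlier Type A density truncated after the 4th-order term,
    for a standardized variable with given skewness and excess kurtosis."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    series = (1.0
              + skew / 6.0 * hermite_he(3, x)
              + exkurt / 24.0 * hermite_he(4, x))
    return phi * series
```

For example, with skewness 1 the truncated series is negative near x = -3, illustrating why the low-order truncation can produce negative "probabilities" in the tails.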
Elizalde, E.; Gaztanaga, E.
1992-01-01
The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be determined quantitatively. An original, intuitive derivation of this model is presented.
Research on the behavior of fiber orientation probability distribution function in the planar flows
Institute of Scientific and Technical Information of China (English)
ZHOU Kun; LIN Jian-zhong
2005-01-01
The equation of the two-dimensional fiber direction vector was solved theoretically to give the fiber orientation distribution in simple shear flow, flow with two directions of shear, extensional flow, and arbitrary planar incompressible flow. The Fokker-Planck equation was solved numerically to validate the theoretical solutions. The stable orientation and orientation period of a fiber were obtained. The results showed that the fiber orientation distribution depends on the relative, not the absolute, magnitude of the rate of strain of the matrix flow. The effect of the fiber aspect ratio on the orientation distribution is insignificant in most conditions except the simple shear case. It was proved that the results for a planar flow can be generalized to the case of a 3-D fiber direction vector.
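The orientation period can be illustrated with Jeffery's classic result for a single rigid fiber in simple shear (a standard model used here as an assumed stand-in, not necessarily the authors' exact equations): the in-plane angle obeys dφ/dt = G (r² cos²φ + sin²φ)/(r² + 1) for aspect ratio r and shear rate G, with rotation period T = 2π(r + 1/r)/G. A forward-Euler sketch recovers the period:

```python
import math

def jeffery_period(aspect_ratio, shear_rate, dt=1e-3):
    """Integrate Jeffery's orbit equation for a fiber in simple shear,
    dphi/dt = G*(r^2 cos^2(phi) + sin^2(phi))/(r^2 + 1),
    and return the time needed for one full rotation."""
    r2 = aspect_ratio ** 2
    phi, t = 0.0, 0.0
    while phi < 2.0 * math.pi:
        dphi = shear_rate * (r2 * math.cos(phi) ** 2
                             + math.sin(phi) ** 2) / (r2 + 1.0)
        phi += dphi * dt
        t += dt
    return t
```

The fiber rotates quickly through orientations transverse to the flow and lingers near alignment, which is why the orientation distribution in simple shear is strongly peaked.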
DEFF Research Database (Denmark)
Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou
2010-01-01
Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple priors to be used. We demonstrate how sequential simulation can be seen as an application of the Gibbs sampler, and how such a Gibbs sampler assisted by sequential simulation can be used to perform a random walk generating realizations of a relatively complex random function. We propose to combine this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems.
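The Metropolis step at the core of such samplers can be sketched on a toy 1-D nonlinear inverse problem. This is a generic random-walk Metropolis sketch with an invented forward model (d = m² + noise), not the authors' sequential-simulation Gibbs algorithm:

```python
import math
import random

def metropolis(log_post, m0, step, n_steps, rng):
    """Random-walk Metropolis sampler: propose m' = m + step*N(0,1),
    accept with probability min(1, post(m')/post(m))."""
    chain, m, lp = [], m0, log_post(m0)
    for _ in range(n_steps):
        m_new = m + step * rng.gauss(0.0, 1.0)
        lp_new = log_post(m_new)
        if rng.random() < math.exp(min(0.0, lp_new - lp)):
            m, lp = m_new, lp_new
        chain.append(m)
    return chain

# Toy nonlinear inverse problem: datum d = m^2 + noise, wide Gaussian prior
d, sigma = 4.0, 0.5
log_post = lambda m: -0.5 * ((m * m - d) / sigma) ** 2 - 0.5 * (m / 10.0) ** 2
rng = random.Random(42)
chain = metropolis(log_post, m0=1.0, step=0.3, n_steps=20000, rng=rng)
```

Started near the positive mode, the chain settles around m ≈ 2; the bimodality of this posterior (±2) is exactly the kind of structure that makes efficient proposal mechanisms, such as the sequential-simulation-assisted Gibbs sampler above, attractive.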
The probability distributions of the first hitting times of Bessel processes
Hamana, Yuji; Matsumoto, Hiroyuki
2011-01-01
We consider the first hitting times of the Bessel processes. We give explicit expressions for the distribution functions and for the densities by means of the zeros of the Bessel functions. The results extend the classical ones and cover all the cases.
Ruin Probabilities and Aggregate Claims Distributions for Shot Noise Cox Processes
DEFF Research Database (Denmark)
Albrecher, H.; Asmussen, Søren
We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate...
Burkert, Andreas; Mac Low, Mordecai-Mark
2001-01-01
The one-point statistics of column density distributions of turbulent molecular cloud models are investigated and compared with observations. In agreement with the observations, the number N of pixels with surface density S is distributed exponentially N(S)=exp(-S/S0) in models of driven compressible supersonic turbulence. However, in contrast to the observations, the exponential slope defined by S0 is not universal but instead depends strongly on the adopted rms Mach number and on the smoothing of the data cube. We demonstrate that this problem can be solved if one restricts the analysis of the surface density distribution to subregions with sizes equal to the correlation length of the flow which turns out to be given by the driving scale. In this case, the column density distributions are universal with a slope that is in excellent agreement with the observations and independent of the Mach number or smoothing. The observed molecular clouds therefore are coherent structures with sizes of order their correla...
Smith, O. E.; Adelfang, S. I.
1981-01-01
A model of the largest gust amplitude and gust length is presented which uses the properties of the bivariate gamma distribution. The gust amplitude and length are strongly dependent on the filter function; the amplitude increases with altitude and is larger in winter than in summer.
Is extrapair mating random? On the probability distribution of extrapair young in avian broods
Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan
2007-01-01
A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process at the level of individual offspring (binomial, hypergeometric, or Poisson). A review
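The binomial null model mentioned above can be sketched as follows: if each offspring is independently extrapair with probability p, the expected number of broods with k EPY follows from summing binomial probabilities over the observed brood sizes (a minimal sketch; the paper's test also considers hypergeometric and Poisson nulls):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of k extrapair young in a brood of n."""
    return comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def expected_counts(brood_sizes, p):
    """Expected number of broods with k = 0..max(n) extrapair young,
    assuming each offspring is independently extrapair with prob. p."""
    kmax = max(brood_sizes)
    expected = [0.0] * (kmax + 1)
    for n in brood_sizes:
        for k in range(n + 1):
            expected[k] += binom_pmf(k, n, p)
    return expected
```

Comparing these expected counts with the observed counts (e.g. via a chi-square statistic) flags an excess of all-EPY and no-EPY broods, which is the signature of the behavioral dichotomy.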
Dezerega, Alejandro Bartolome
1994-01-01
Approved for public release; distribution unlimited. The post-Cold War decline in defense expenditures has promoted a reduction in the number of defense-related companies, creating potentially monopolistic economic scenarios for defense procurement. This thesis studies one methodology for dealing with these scenarios, based on the Baron-Myerson monopolist regulation mechanisms. The Baron-Myerson mechanism provides a tool to regulate monopolists ...
Tord Gomis, Marta De
2011-01-01
Efficiency and patient satisfaction are two of the most important factors for a hospital; to be competitive, these two factors have to be improved. Tactical admission plans focus on increasing efficiency, but in this paper we also try to associate patient satisfaction with the tactical plan. In this respect, we present a procedure to calculate exact waiting time distributions and another procedure to compute the exact level of resource usage. Then we explore two different metho...
Flanagan, Éanna É; Wasserman, Ira; Vanderveld, R Ali
2011-01-01
We study the fluctuations in luminosity distances due to gravitational lensing by large-scale (> 35 Mpc) structures, specifically voids and sheets. We use a simplified "Swiss cheese" model consisting of a ΛCDM Friedmann-Robertson-Walker background in which a number of randomly distributed non-overlapping spherical regions are replaced by mass-compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz & Wald (1998), which includes the effect of lensing shear. The standard deviation of this distribution is ~0.027 magnitudes and the mean is ~0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s = 1.0, and voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thic...
Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation
Demir, Uygar; Toker, Cenk; Çenet, Duygu
2016-07-01
Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained from these statistical parameters. In the literature, there are publications that try to fit the histogram of TEC estimates to some well-known pdfs such as the Gaussian, the Exponential, etc. However, constraining a histogram to fit a function with a fixed shape increases the estimation error, and all information extracted from such a pdf will continue to contain this error. Such techniques are likely to introduce artificial characteristics into the estimated pdf that are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained as compared to the techniques mentioned above. KDE is particularly good at representing tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from GNSS measurements from the TNPGN-Active (Turkish National Permanent
Directory of Open Access Journals (Sweden)
Denis Cousineau
2008-03-01
Full Text Available This article discusses how to characterize response time (RT) frequency distributions in terms of probability functions and how to implement the necessary analysis tools using MATLAB. The first part of the paper discusses the general principles of maximum likelihood estimation. A detailed implementation that allows fitting the popular ex-Gaussian function is then presented, followed by the results of a Monte Carlo study that shows the validity of the proposed approach. Although the main focus is the ex-Gaussian function, the general procedure described here can be used to estimate best-fitting parameters of various probability functions. The proposed computational tools, written in MATLAB source code, are available through the Internet.
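The ex-Gaussian density and the negative log-likelihood objective at the heart of the fitting procedure can be sketched (in Python rather than the article's MATLAB; the parameter values below are invented for illustration):

```python
import math

def exgauss_pdf(x, mu, sigma, tau):
    """Ex-Gaussian density: convolution of N(mu, sigma^2) with an
    exponential of mean tau, a standard model for RT distributions."""
    z = (x - mu) / sigma - sigma / tau
    phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return (1.0 / tau) * math.exp(sigma * sigma / (2.0 * tau * tau)
                                  - (x - mu) / tau) * phi

def neg_log_likelihood(params, data):
    """Objective minimized in maximum likelihood estimation of
    (mu, sigma, tau) from observed response times."""
    mu, sigma, tau = params
    return -sum(math.log(exgauss_pdf(x, mu, sigma, tau)) for x in data)
```

Minimizing `neg_log_likelihood` over (mu, sigma, tau), e.g. by a simplex search, gives the maximum likelihood estimates; the mean of the fitted distribution is mu + tau.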
Directory of Open Access Journals (Sweden)
D Johan Kotze
Full Text Available Temporal variation in the detectability of a species can bias estimates of relative abundance if not handled correctly. For example, when effort varies in space and/or time it becomes necessary to take variation in detectability into account when data are analyzed. We demonstrate the importance of incorporating seasonality into the analysis of data with unequal sample sizes due to lost traps at a particular density of a species. A case study of count data was simulated using a spring-active carabid beetle. Traps were 'lost' randomly during high beetle activity in high abundance sites and during low beetle activity in low abundance sites. Five different models were fitted to datasets with different levels of loss. If sample sizes were unequal and a seasonality variable was not included in models that assumed the number of individuals was log-normally distributed, the models severely under- or overestimated the true effect size. Results did not improve when seasonality and number of trapping days were included in these models as offset terms, but only performed well when the response variable was specified as following a negative binomial distribution. Finally, if seasonal variation of a species is unknown, which is often the case, seasonality can be added as a free factor, resulting in well-performing negative binomial models. Based on these results we recommend: (a) add sampling effort (number of trapping days in our example) to the models as an offset term; (b) if precise information is available on seasonal variation in detectability of a study object, add seasonality to the models as an offset term; (c) if information on seasonal variation in detectability is inadequate, add seasonality as a free factor; and (d) specify the response variable of count data as following a negative binomial or over-dispersed Poisson distribution.
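The negative binomial response with an effort offset can be sketched as follows. This is a generic gamma-Poisson parametrization, not the authors' fitted models; the rate and dispersion values are hypothetical:

```python
import math

def negbin_pmf(y, mu, k):
    """Negative binomial probability with mean mu and dispersion k
    (variance mu + mu^2/k), in the gamma-Poisson parametrization."""
    return math.exp(math.lgamma(y + k) - math.lgamma(k) - math.lgamma(y + 1)
                    + k * math.log(k / (k + mu)) + y * math.log(mu / (k + mu)))

def expected_count(trap_days, rate_per_day):
    """Offset idea: the model mean scales with sampling effort, so
    log(effort) enters the linear predictor with coefficient fixed at 1."""
    return trap_days * rate_per_day
```

Because the variance mu + mu²/k exceeds the Poisson variance mu, the negative binomial absorbs the overdispersion that made the log-normal models in the case study misestimate effect sizes.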
Directory of Open Access Journals (Sweden)
2007-01-01
Full Text Available To produce probability distributions for regional climate change in surface temperature and precipitation, a probability distribution for global mean temperature increase has been combined with the probability distributions for the appropriate scaling variables, i.e. the changes in regional temperature/precipitation per degree of global mean warming. Each scaling variable is assumed to be normally distributed. The uncertainty of the scaling relationship arises from systematic differences between the regional changes from global and regional climate model simulations and from natural variability. The contributions of these sources of uncertainty to the total variance of the scaling variable are estimated from simulated temperature and precipitation data in a suite of regional climate model experiments conducted within the framework of the EU-funded project PRUDENCE, using an Analysis Of Variance (ANOVA). For the area covered in the 2001–2004 EU-funded project SWURVE, five case study regions (CSRs) are considered: NW England, the Rhine basin, Iberia, the Jura lakes (Switzerland) and the Mauvoisin dam (Switzerland). The resulting regional climate changes for 2070–2099 vary quite significantly between CSRs, between seasons and between meteorological variables. For all CSRs, the expected warming in summer is higher than that expected for the other seasons. This summer warming is accompanied by a large decrease in precipitation. The uncertainty of the scaling ratios for temperature and precipitation is relatively large in summer because of the differences between regional climate models. Differences between the spatial climate-change patterns of global climate model simulations make significant contributions to the uncertainty of the scaling ratio for temperature. However, no meaningful contribution could be found for the scaling ratio for precipitation due to the small number of global climate models in the PRUDENCE project and natural variability, which is
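The combination step can be sketched by Monte Carlo: draw a global mean warming and a normally distributed scaling variable, and multiply. This is a simplified illustration with invented parameter values; the study's actual global-warming distribution is not assumed normal here for any particular reason other than brevity:

```python
import random

def regional_change_samples(global_mean, global_sd, scale_mean, scale_sd,
                            n=100000, seed=1):
    """Monte Carlo combination: regional change = (normally distributed
    scaling variable, in degrees per degree) x (global mean warming)."""
    rng = random.Random(seed)
    return [rng.gauss(global_mean, global_sd) * rng.gauss(scale_mean, scale_sd)
            for _ in range(n)]

# Hypothetical numbers: global warming 3.0 +/- 0.7 K, scaling 1.4 +/- 0.3
samples = regional_change_samples(3.0, 0.7, 1.4, 0.3)
```

The resulting product distribution has mean equal to the product of the means (here 4.2 K) but is wider and slightly skewed relative to either input, which is why the full distribution, not just the means, is propagated.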
Perrotta, A
2002-01-01
A Monte Carlo (MC) method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance, when different decay branching ratios, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example. (6 refs).
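A simplified version of the idea can be sketched for a Poisson counting experiment: average the likelihood over Monte Carlo draws of the uncertain background, then integrate the posterior (flat signal prior) up to the desired credibility. This is an illustrative sketch with invented numbers, not DELPHI's actual procedure:

```python
import math
import random

def upper_limit(n_obs, b_mean, b_sigma, cl=0.95, n_mc=500,
                s_max=30.0, ds=0.01, seed=7):
    """Bayesian upper limit on a Poisson signal s with flat prior.
    Background uncertainty is propagated by averaging the likelihood
    over b ~ N(b_mean, b_sigma), truncated at zero."""
    rng = random.Random(seed)
    bs = [max(0.0, rng.gauss(b_mean, b_sigma)) for _ in range(n_mc)]
    def like(s):
        # Poisson likelihood averaged over background systematics
        return sum(math.exp(-(s + b)) * (s + b) ** n_obs for b in bs) / n_mc
    grid = [like(i * ds) for i in range(int(s_max / ds) + 1)]
    total = sum(grid)
    acc = 0.0
    for i, val in enumerate(grid):
        acc += val
        if acc >= cl * total:
            return i * ds
    return s_max
```

With zero observed events and zero background the 95% limit reproduces the textbook value of about 3.0 events, and increasing the expected background tightens the limit for a fixed observation.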
Probability Distribution of Seed Spacing of Precision Drilling
Institute of Scientific and Technical Information of China (English)
王玉顺; 司慧萍; 郑德聪; 吴海平
2001-01-01
The placement formula was deduced from the seed placement coordinate, which was expressed as the sum of an expectation value and a normal stochastic error, based on observation of the seed spacing forming process in precision drilling. On this basis, a mathematical model was set up, from which the probability density function, probability distribution function, and first- and second-order moments of the seed spacing were obtained. The results show that the seed spacing is a stochastic variable conforming to a distribution derived from several normal populations with the same variance but different mean values; it can be regarded as the folded tails and superposition of normal distributions with different distribution parameters. Goodness-of-fit tests proved that the deduced seed spacing distribution fits various measured seed spacing samples well. The graph of the spacing probability density, whose shape depends on the distribution parameters, usually takes a multi-peak, non-symmetric form.
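The superposition of equal-variance normal components can be sketched as follows. This is a hedged simplification that ignores the folded-tail correction at zero; `miss_probs` is a hypothetical weight vector in which a miss of k-1 consecutive seeds yields a spacing near k times the nominal value:

```python
import math

def seed_spacing_pdf(x, nominal, sigma, miss_probs):
    """Density of seed spacing modeled as a mixture of normal components
    with common variance: component k (weight miss_probs[k-1]) is
    centered at k * nominal spacing."""
    total = 0.0
    for k, w in enumerate(miss_probs, start=1):
        z = (x - k * nominal) / sigma
        total += w * math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
    return total
```

With weights such as [0.8, 0.15, 0.05] the density shows the "multi-peak, non-symmetric" shape described above, with peaks at multiples of the nominal spacing.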
International Nuclear Information System (INIS)
The evolution of the scalar probability density function (pdf), the conditional scalar dissipation rate, and other statistics including transport properties are studied for passive temperature fluctuations in decaying grid-generated turbulence. The effect of filtering and differentiating the time series is also investigated. For a nonzero mean temperature gradient it is shown that the pdf of the temperature fluctuations has pronounced exponential tails for turbulence Reynolds numbers Re_λ greater than 70, but below this value the pdf is close to Gaussian. The scalar dissipation rate, conditioned on the fluctuations, shows that there is a high expectation of dissipation in the presence of the large, rare fluctuations that produce the exponential tails. Significant positive correlation between the mean square scalar fluctuations and the instantaneous scalar dissipation rate is found when exponential tails occur. The case of temperature fluctuations in the absence of a mean gradient is also studied. Here, the results are less definite because the generation of the fluctuations (by means of fine heated wires) causes an asymmetry in the pdf. The results show, however, that the pdf is close to Gaussian and that the correlation between the mean square temperature fluctuations and the instantaneous scalar dissipation rate is very weak. For the linear profile case, measurements over the range 60 ≤ Re_λ ≤ 1100 show that the dimensionless heat flux Nu is proportional to Re_λ^0.88 and that the transition from a Gaussian pdf to one with exponential tails occurs at Nu ~ 31, a value close to transitions observed in other recent mixing experiments conducted in entirely different turbulent flows
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novelty issues involve, among others, considering sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements. PMID:26901717
Codon information value and codon transition-probability distributions in short-term evolution
Jiménez-Montaño, M. A.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Ramos-Fernández, A.
2016-07-01
To understand the way the Genetic Code and the physical-chemical properties of coded amino acids affect accepted amino acid substitutions in short-term protein evolution, taking into account only overall amino acid conservation, we consider an underlying codon-level model. This model employs codon pair-substitution frequencies from an empirical matrix in the literature, modified for single-base mutations only. Ordering the degenerated codons according to their codon information value (Volkenstein, 1979), we found that three-fold and most of four-fold degenerated codons, which have low codon values, were best fitted to rank-frequency distributions with constant failure rate (exponentials). In contrast, almost all two-fold degenerated codons, which have high codon values, were best fitted to rank-frequency distributions with variable failure rate (inverse power-laws). Six-fold degenerated codons are considered to be doubly assigned. The exceptional behavior of some codons, including non-degenerate codons, is discussed.
Directory of Open Access Journals (Sweden)
Hideaki Tanoue
2013-07-01
Full Text Available The body tilt angle of a fish has a large effect on the acoustic target strength. For an accurate estimation of fish abundance using acoustic methods, it is necessary to measure body tilt angles in free-ranging fish. We measured diurnal body tilt angle distributions of threeline grunt (Parapristipoma trilineatum) while swimming in schools in a fish cage. Micro-acceleration data loggers were used to record (for 3 days) swaying and surging accelerations (at 16 Hz intervals) of 10 individuals among 20 forming a school in a fish cage. Time series analysis of 1-h mean body tilt angles revealed significant differences in body tilt angles between day (−7.9 ± 3.28°) and night (0.8 ± 5.89°), which must be taken into account when conducting acoustic surveys. These results will be useful for calculating the average dorsal aspect target strength (TS) of threeline grunt for accurate estimations of fish abundance.
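The geometric step behind such accelerometer-based tilt estimates can be sketched as follows (a minimal illustration of the principle, not the authors' processing pipeline): after low-pass filtering removes the dynamic swimming acceleration, the static tilt is the arcsine of the gravity component along the surge axis, with acceleration expressed in units of g.

```python
import math

def tilt_angle_deg(surge_in_g):
    """Static body tilt (deg) from the gravity component on the surge axis.
    Assumes dynamic (swimming) acceleration has been low-pass filtered out."""
    clipped = max(-1.0, min(1.0, surge_in_g))  # guard against sensor noise > 1 g
    return math.degrees(math.asin(clipped))

print(round(tilt_angle_deg(0.5), 1))  # 30.0: half of g projects onto the surge axis
```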
Probability distribution of the free energy of a directed polymer in a random medium.
Brunet, E; Derrida, B
2000-06-01
We calculate exactly the first cumulants of the free energy of a directed polymer in a random medium for the geometry of a cylinder. By using the fact that the nth moment of the partition function is given by the ground-state energy of a quantum problem of n interacting particles on a ring of length L, we write an integral equation allowing us to expand these moments in powers of the disorder strength γ or in powers of n. For n small and n ≈ (Lγ)^(−1/2), the moments take a scaling form which allows us to describe all the fluctuations of order 1/L of the free energy per unit length of the directed polymer. The distribution of these fluctuations is the same as the one found recently in the asymmetric exclusion process, indicating that it is characteristic of all the systems described by the Kardar-Parisi-Zhang equation in 1+1 dimensions. PMID:11088374
Perrotta, Andrea
A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated to the experimental sensitivity and to the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branchings, or luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate the systematics into the calculation of the cross-section upper limits. One of these searches will be described as an example.
Institute of Scientific and Technical Information of China (English)
QIAN Shang-Wu; GU Zhi-Yu
2001-01-01
Using Feynman's path integral with topological constraints arising from the presence of one singular line, we find the homotopic probability distribution PnL for the winding number n and the partition function PL of the entangled system around a ribbon segment chain. We find that when the width of the ribbon segment chain 2a increases, the partition function decreases exponentially, whereas the free energy increases by an amount proportional to the square of the width. When the width tends to zero we obtain the same results as those of a single chain with one singular point.
Jacobsen, J L; Saleur, H
2008-02-29
We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L > 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2/π²) ln L, disproving a recent conjecture that this should be related with the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.
Timonina, Anna; Hochrainer-Stigler, Stefan; Pflug, Georg; Jongman, Brenden; Rojas, Rodrigo
2015-11-01
Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest to estimate the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk-management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
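The coupling step can be sketched with a minimal example (a Clayton copula with assumed parameters and Pareto marginals; the paper's river-structure coupling and minimax pair selection are not reproduced here): tail-dependent coupling makes joint extremes far more likely than under independence, which is exactly the underestimation the abstract warns about.

```python
import numpy as np

rng = np.random.default_rng(2)
n, theta, alpha = 200_000, 3.0, 2.5        # copula and Pareto parameters (assumed)

u = rng.random(n)
v = rng.random(n)
# Clayton conditional sampling: (u, u2) have uniform marginals with tail dependence.
u2 = ((v ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)

# Pareto marginal losses; small u maps to large loss, so the Clayton lower-tail
# dependence couples the *extreme* losses of the two basins.
loss = lambda q: q ** (-1.0 / alpha) - 1.0
coupled = loss(u) + loss(u2)               # dependence included
indep = loss(u) + loss(rng.random(n))      # dependence ignored

q999 = lambda x: float(np.quantile(x, 0.999))
print(q999(coupled) > q999(indep))         # ignoring dependence underestimates the tail
```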
Schneider, N; Motte, F; Ossenkopf, V; Klessen, R S; Simon, R; Fechtenbaum, S; Herpin, F; Tremblin, P; Csengeri, T; Myers, P C; Hill, T; Cunningham, M; Federrath, C
2015-01-01
Column density (N) PDFs serve as a powerful tool to characterize the physical processes that influence the structure of molecular clouds. Star-forming clouds can best be characterized by lognormal PDFs for the lower N range and a power-law tail for higher N, commonly attributed to turbulence and self-gravity and/or pressure, respectively. We report here on PDFs obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region and compare to a PDF derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av~1-30, but is cut off at higher Av due to optical depth effects. The PDFs of C18O and 13CO are mostly lognormal for Av~1-15, followed by excess up to Av~40. Above that value, all CO PDFs drop, most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av~15 and 400, respectively. The PDF from dust is lognormal for Av~2-15 and has a power-law tail up to Av~500. Absolute values for the molecular lin...
Asgarani, Somayeh
2015-02-01
A method of finding entropic form for a given stationary probability distribution and specified potential field is discussed, using the steady-state Fokker-Planck equation. As examples, starting with the Boltzmann and Tsallis distribution and knowing the force field, we obtain the Boltzmann-Gibbs and Tsallis entropies. Also, the associated entropy for the gamma probability distribution is found, which seems to be in the form of the gamma function. Moreover, the related Fokker-Planck equations are given for the Boltzmann, Tsallis, and gamma probability distributions. PMID:25768455
Strange, P.
2012-01-01
In this paper we demonstrate a surprising aspect of quantum mechanics that is accessible to an undergraduate student. We discuss probability backflow for an electron in a constant magnetic field. It is shown that even for a wavepacket composed entirely of states with negative angular momentum the effective angular momentum can take on positive…
Ying, L H
2012-01-01
Nonlinear instability and refraction by ocean currents are both important mechanisms that go beyond the Rayleigh approximation and may be responsible for the formation of freak waves. In this paper, we quantitatively study nonlinear effects on the evolution of surface gravity waves on the ocean, to explore systematically the effects of various input parameters on the probability of freak wave formation. The fourth-order current-modified nonlinear Schrödinger equation (CNLS4) is employed to describe the wave evolution. By solving CNLS4 numerically, we are able to obtain quantitative predictions for the wave height distribution as a function of key environmental conditions such as average steepness, angular spread, and frequency spread of the local sea state. Additionally, we explore the spatial dependence of the wave height distribution, associated with the buildup of nonlinear development.
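For reference, the Rayleigh baseline that these nonlinear corrections modify is a one-liner: under linear theory the probability that a wave height exceeds h is exp(−2h²/Hs²), so the standard freak-wave criterion h > 2Hs has probability exp(−8).

```python
import math

def rayleigh_exceedance(h_over_hs):
    """P(H > h) under the linear (Rayleigh) approximation,
    with h expressed in units of the significant wave height Hs."""
    return math.exp(-2.0 * h_over_hs ** 2)

print(f"{rayleigh_exceedance(2.0):.2e}")  # exp(-8), about 3.4e-04
```

Nonlinear focusing raises the observed exceedance probabilities above this baseline, which is why the deviation is used as a freak-wave diagnostic.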
Belloni, M; Robinett, R W
2003-01-01
We calculate the Wigner quasi-probability distribution for position and momentum, P_W^(n)(x,p), for the energy eigenstates of the standard infinite well potential, using both x- and p-space stationary-state solutions, as well as visualizing the results. We then evaluate the time-dependent Wigner distribution, P_W(x,p;t), for Gaussian wave packet solutions of this system, illustrating both the short-term semi-classical time dependence, as well as longer-term revival and fractional revival behavior and the structure during the collapsed state. This tool provides an excellent way of demonstrating the patterns of highly correlated Schrödinger-cat-like `mini-packets' which appear at fractional multiples of the exact revival time.
Energy Technology Data Exchange (ETDEWEB)
West, H.I. Jr.
1978-12-11
An account is given of the observations of the pitch angle distributions of energetic particles in the near equatorial regions of the Earth's magnetosphere. The emphasis is on relating the observed distributions to the field configuration responsible for the observed effects. The observed effects relate to drift-shell splitting, to the breakdown of adiabatic guiding center motion in regions of sharp field curvature relative to particle gyroradii, to wave-particle interactions, and to moving field configurations. 39 references.
Drakos, Nicole E; Wahl, Lindi M
2015-12-01
Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome.
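The flavor of the extinction analysis can be illustrated with the classical special case (no diversification or HGT structure, which the paper's model adds on top): for a linear birth-death process started from a single element, the extinction probability is d/b when the birth rate b exceeds the death rate d, and 1 otherwise.

```python
def extinction_prob(birth, death):
    """Extinction probability of a linear birth-death process started from one
    element (standard result; diversification is deliberately ignored here)."""
    return 1.0 if birth <= death else death / birth

print(extinction_prob(1.0, 0.6))  # 0.6: supercritical, but extinction still likely
```

In the abstract's terms, diversification effectively raises the net growth of a family, pushing this probability down, while a high extinction probability for new families is what the classical formula already predicts near criticality.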
Azimuth angle distribution of thermal-infrared temperature over rice canopy with row orientation
International Nuclear Information System (INIS)
Using ground-based and airborne observation, as well as numerical simulation, we confirmed that the thermal-infrared temperature (TIT) of a rice canopy surface with row orientation changes with azimuth viewing angle. The TIT of the direction parallel to row orientation is 1-4 °C higher than that of the other directions. The TIT differences occur during the daytime, and for a leaf area index (LAI) around 0.5-3, because the field of view of an infrared thermometer viewing a direction parallel to the rows contains much more of the water surface under the rice canopy than the plant surface of the canopy. The temperature of the water surface between rows is much higher than that of the plant surface, because the intense incoming solar radiation near noon is not absorbed by the canopy and so warms the water efficiently. Matsushima and Kondo (1997) developed a radiation transfer model for TIT of a rice canopy surface, and confirmed a nadir viewing angle dependence of TIT according to leaf area index. Based on the above model, a model of a rice canopy with row orientation was developed to investigate the TIT variation with azimuth viewing angle. The model design employs the ratio of the apparent areas of the plant surface and the underground water surface, which change with the azimuth and nadir viewing angles, and reproduces the observation well. These results indicate that the main cause of the TIT difference is the ratio of the apparent areas of the plant surface and the water surface when the temperature of the water surface is much higher than that of the plant surface. The TIT in a westerly direction exceeds that of the other directions shortly after sunrise because the solar elevation is low and the azimuth of the sun is around east. This is because the plant surface temperature exceeds that of the water surface, which is the opposite of the near-noon cases. On the scale of a satellite grid, a simple numerical experiment demonstrated that the TIT difference of azimuth
Lum, Daniel J; Knarr, Samuel H; Howell, John C
2015-10-19
We demonstrate how to efficiently implement extremely high-dimensional compressive imaging of a bi-photon probability distribution. Our method uses fast-Hadamard-transform Kronecker-based compressive sensing to acquire the joint space distribution. We list, in detail, the operations necessary to enable fast-transform-based matrix-vector operations in the joint space to reconstruct a 16.8 million-dimensional image in less than 10 minutes. Within a subspace of that image exists a 3.2 million-dimensional bi-photon probability distribution. In addition, we demonstrate how the marginal distributions can aid in the accuracy of joint space distribution reconstructions.
Distribution of Sulfur in Carbon/Sulfur Nanocomposites Analyzed by Small-Angle X-ray Scattering.
Petzold, Albrecht; Juhl, Anika; Scholz, Jonas; Ufer, Boris; Goerigk, Günter; Fröba, Michael; Ballauff, Matthias; Mascotto, Simone
2016-03-22
The analysis of sulfur distribution in porous carbon/sulfur nanocomposites using small-angle X-ray scattering (SAXS) is presented. Ordered porous CMK-8 carbon was used as the host matrix and gradually filled with sulfur (20-50 wt %) via melt impregnation. Owing to the almost complete match between the electron densities of carbon and sulfur, the porous nanocomposites present in essence a two-phase system and the filling of the host material can be precisely followed by this method. The absolute scattering intensities normalized per unit of mass were corrected accounting for the scattering contribution of the turbostratic microstructure of carbon and amorphous sulfur. The analysis using the Porod parameter and the chord-length distribution (CLD) approach determined the specific surface areas and filling mechanism of the nanocomposite materials, respectively. Thus, SAXS provides comprehensive characterization of the sulfur distribution in porous carbon and valuable information for a deeper understanding of cathode materials of lithium-sulfur batteries.
Magnetization curves and probability angular distribution of the magnetization vector in Er2Fe14Si3
Sobh, Hala A.; Aly, Samy H.; Shabara, Reham M.; Yehia, Sherif
2016-01-01
Specific magnetic and magneto-thermal properties of Er2Fe14Si3, in the temperature range of 80-300 K, have been investigated using basic laws of classical statistical mechanics in a simple model. In this model, the constructed partition function was used to derive, and therefore calculate the temperature and/or field dependence of a host of physical properties. Examples of these properties are: the magnetization, magnetic heat capacity, magnetic susceptibility, probability angular distribution of the magnetization vector, and the associated angular dependence of energy. We highlight a correlation between the energy of the system, its magnetization behavior and the angular location of the magnetization vector. Our results show that Er2Fe14Si3 is an easy-axis system in the temperature range 80-114 K, but switches to an easy-plane system at T≥114 K. This transition is also supported by both of the temperature dependence of the magnetic heat capacity, which develops a peak at a temperature ~114 K, and the probability landscape which shows, in zero magnetic field, a prominent peak in the basal plane at T=113.5 K.
Institute of Scientific and Technical Information of China (English)
HAN Li-Bo; GONG Xiao-Long; CAO Li; WU Da-Jin
2007-01-01
An approximate Fokker-Planck equation for the logistic growth model which is driven by coloured correlated noises is derived by applying the Novikov theorem and the Fox approximation. The steady-state probability distribution (SPD) and the mean of the tumour cell number are analysed. It is found that the SPD is the single extremum configuration when the degree of correlation between the multiplicative and additive noises, λ, is in −1 < λ ≤ 0 and can be the double extrema in 0 < λ < 1. A configuration transition occurs because of the variation of noise parameters. A minimum appears in the curve of the mean of the steady-state tumour cell number, 〈x〉, versus λ. The position and the value of the minimum are controlled by the noise-correlated times.
Energy Technology Data Exchange (ETDEWEB)
Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Noterdaeme, J. M. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Max-Planck-Institut für Plasmaphysik, Garching D-85748 (Germany); Verdoolaege, G. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Laboratoire de Physique des Plasmas de l' ERM, Laboratorium voor Plasmafysica van de KMS (LPP-ERM/KMS), Ecole Royale Militaire, Koninklijke Militaire School, B-1000 Brussels (Belgium); Kardaun, O. J. W. F. [Max-Planck-Institut für Plasmaphysik, Garching D-85748 (Germany); Collaboration: JET-EFDA Team
2014-11-15
Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.
Rodionov, V. N.; Kravtsova, G. A.; Mandel', A. M.
2010-07-01
We study the effects of electromagnetic fields on nonrelativistic charged spinning particles bound by a short-range potential. We analyze the exact solution of the Pauli equation for an electron moving in the potential field determined by the three-dimensional δ-well in the presence of a strong magnetic field. We obtain asymptotic expressions for this solution for different values of the problem parameters. In addition, we consider electron probability currents and their dependence on the magnetic field. We show that including the spin in the framework of the nonrelativistic approach allows correctly taking the effect of the magnetic field on the electric current into account. The obtained dependences of the current distribution, which is an experimentally observable quantity, can be manifested directly in scattering processes, for example.
Directory of Open Access Journals (Sweden)
Chung-Ho Su
2010-12-01
Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
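The cumulative-probability granulation step can be sketched independently of the rough-set machinery (synthetic data and an assumed interval count, for illustration only): interval boundaries are cut at quantiles of the observed distribution, so each linguistic interval carries roughly equal probability mass, unlike equal-width partitions.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(100.0, 10.0, 1000)  # synthetic price-like series (illustrative)

# Cumulative-probability granulation: cut the universe of discourse at quantiles
# so that every interval holds an (approximately) equal share of observations.
k = 7
edges = np.quantile(data, np.linspace(0.0, 1.0, k + 1))
counts, _ = np.histogram(data, bins=edges)
print(counts)  # each bin holds close to 1000/7 observations
```

Equal-mass intervals give the downstream rule extraction balanced support per linguistic label, which is the motivation for this granulation over fixed-width bins.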
Energy Technology Data Exchange (ETDEWEB)
Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)
2012-07-06
Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on a given value, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life
International Nuclear Information System (INIS)
A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
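The Maximum Entropy Formalism step has a compact special case worth sketching (illustrative code, not the WIPP procedure itself): given only a range [a, b] and an expert-supplied mean, the maximum-entropy density is a truncated exponential p(x) ∝ exp(λx), with λ found by one-dimensional root-finding.

```python
import math

def maxent_lambda(a, b, mean, tol=1e-10):
    """Solve for lam in the max-entropy density p(x) ∝ exp(lam*x) on [a, b]
    with the given mean; lam = 0 recovers the uniform case, mean = (a+b)/2."""
    def m(lam):  # mean of the truncated exponential as a function of lam
        if abs(lam) < 1e-12:
            return (a + b) / 2.0
        ea, eb = math.exp(lam * a), math.exp(lam * b)
        return (b * eb - a * ea) / (eb - ea) - 1.0 / lam
    lo, hi = -50.0, 50.0             # m(lam) is increasing, so bisection works
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if m(mid) < mean:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = maxent_lambda(0.0, 1.0, 0.3)   # mean below the midpoint, so lam < 0
```

With more constraints (e.g. a variance or percentiles from the subject-matter experts), the same formalism yields richer exponential-family shapes, which is the general case the procedure above relies on.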
International Nuclear Information System (INIS)
In the present work contrast-matching USANS (ultra-small-angle neutron scattering) was employed in order to determine the spatial distribution of immiscible fluids confined within a macroporous α-Al2O3 membrane. Water-air as well as water-hydrocarbon and hydrocarbon-air systems were examined, and the analysis of the results, also on the basis of a complementary numerical study, provided significant information on the behaviour of the multiphase ensemble, as it has been demonstrated that the individual fluids occupy certain positions in the pore space, regardless of the actual values of the respective interfacial properties.
Dong, Wan Jae; Lo, Nhat-Truong; Jung, Gwan Ho; Ham, Juyoung; Lee, Jong-Lam
2016-03-01
A distributed Bragg reflector (DBR) is employed as a bottom reflector in see-through organic photovoltaics (OPVs) with an active layer of poly(3-hexylthiophene) and phenyl-C61-butyric acid methyl ester (P3HT:PCBM). The DBR consists of alternating layers of the high- and low-refractive index materials Ta2O5 (n = 2.16) and SiO2 (n = 1.46). The DBR selectively reflects the light within a specific wavelength region (490 nm-630 nm) where the absorbance of P3HT:PCBM is maximum. The see-through OPVs fabricated on the DBR exhibit an efficiency enhancement of 31% compared to the device without a DBR. Additionally, the angle-dependent transmittance of the DBR is analysed using optical simulation and verified by experimental results. As the incident angle of light increases, the peak of reflectance shifts to shorter wavelength and the bandwidth gets narrower. These unique angle-dependent optical properties of the DBR allow the facile color change of see-through OPVs.
Ait-Chaalal, Farid; Bartello, Peter
2011-01-01
We study an instantaneous bimolecular chemical reaction in a two-dimensional chaotic, incompressible and closed Navier-Stokes flow. Areas of well mixed reactants are initially separated by infinite gradients. We focus on the initial regime, characterized by a well-defined one-dimensional contact line between the reactants. The amount of reactant consumed is given by the diffusive flux along this line, and hence relates directly to its length and to the gradients along it. We show both theoretically and numerically that the probability distribution of the modulus of the gradient of the reactants along this contact line multiplied by κ does not depend on the diffusion κ and can be inferred, after a few turnover times, from the joint distribution of the finite time Lyapunov exponent λ and the frequency 1/τ. The equivalent time τ measures the stretching time scale of a Lagrangian parcel in the recent past, while λ measures it on the whole chaotic orbit. At smaller times, w...
IGM Constraints from the SDSS-III/BOSS DR9 Ly-alpha Forest Flux Probability Distribution Function
Lee, Khee-Gan; Spergel, David N; Weinberg, David H; Hogg, David W; Viel, Matteo; Bolton, James S; Bailey, Stephen; Pieri, Matthew M; Carithers, William; Schlegel, David J; Lundgren, Britt; Palanque-Delabrouille, Nathalie; Suzuki, Nao; Schneider, Donald P; Yeche, Christophe
2014-01-01
The Ly$\alpha$ forest flux probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the flux PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from SDSS Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS flux PDFs, measured at $\langle z \rangle = [2.3,2.6,3.0]$, are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, $\gamma$, and temperature at mean-density, $T_0$, where $T(\Delta) = T_0 \Delta^{\gamma-1}$. We find that a significant population of partial Lyman-limit systems with a column-density distribution slope of $\beta_\mathrm{pLLS} \sim -2$ are required to explain the data at the low-flux end of flux PDF, while uncertainties in the mean Ly$\alpha$ forest transmission affect the...
Remote Sensing of Spatial Distributions of Greenhouse Gases in the Los Angeles Basin
Fu, Dejian; Pongetti, Thomas J.; Sander, Stanley P.; Cheung, Ross; Stutz, Jochen; Park, Chang Hyoun; Li, Qinbin
2011-01-01
The Los Angeles air basin is a significant anthropogenic source of greenhouse gases and pollutants including CO2, CH4, N2O, and CO, contributing significantly to regional and global climate change. Recent legislation in California, the California Global Warming Solutions Act (AB32), established a statewide cap for greenhouse gas emissions for 2020 based on 1990 emissions. Verifying the effectiveness of regional greenhouse gas emissions controls requires high-precision, regional-scale measurement methods combined with models that capture the principal anthropogenic and biogenic sources and sinks. We present a novel approach for monitoring the spatial distributions of greenhouse gases in the Los Angeles basin using high resolution remote sensing spectroscopy. We participated in the CalNex 2010 campaign to provide greenhouse gas distributions for comparison between top-down and bottom-up emission estimates.
Chao-Chi Huang; Yang-Hung Chiu; Chih-Yu Wen
2014-01-01
In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered as a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control meth...
IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION
International Nuclear Information System (INIS)
The Lyα forest transmission probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T_0, where T(Δ) = T_0Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ∼ −2 is required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T_0 are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.
IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION
Energy Technology Data Exchange (ETDEWEB)
Lee, Khee-Gan; Hennawi, Joseph F. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Spergel, David N. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Weinberg, David H. [Department of Astronomy and Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, Meyer Hall of Physics, New York, NY 10003 (United States); Viel, Matteo [INAF, Osservatorio Astronomico di Trieste, Via G. B. Tiepolo 11, I-34131 Trieste (Italy); Bolton, James S. [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Bailey, Stephen; Carithers, William; Schlegel, David J. [E.O. Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Pieri, Matthew M. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth PO1 3FX (United Kingdom); Lundgren, Britt [Department of Astronomy, University of Wisconsin, Madison, WI 53706 (United States); Palanque-Delabrouille, Nathalie; Yèche, Christophe [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); Suzuki, Nao [Kavli Institute for the Physics and Mathematics of the Universe (IPMU), The University of Tokyo, Kashiwano-ha 5-1-5, Kashiwa-shi, Chiba (Japan); Schneider, Donald P., E-mail: lee@mpia.de [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)
2015-02-01
The Lyα forest transmission probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T_0, where T(Δ) = T_0Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ∼ −2 is required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T_0 are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.
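The temperature-density relation above, T(Δ) = T_0Δ^(γ−1) with the best-fit γ = 1.6, can be sketched directly. Note that T_0 = 10,000 K below is an illustrative assumption, since the abstract states that the constraints on T_0 carry systematic uncertainties:

```python
# IGM temperature-density relation T(Delta) = T0 * Delta**(gamma - 1).
# gamma = 1.6 is the best-fit slope quoted in the abstract; T0 = 10,000 K is
# an assumed illustrative value, not a measurement from the paper.

def igm_temperature(delta, t0=10_000.0, gamma=1.6):
    """Gas temperature (K) at overdensity delta = rho / rho_mean."""
    return t0 * delta ** (gamma - 1.0)

# At mean density (delta = 1) the relation returns T0 by construction;
# for gamma > 1, overdense gas is hotter than gas at mean density.
print(igm_temperature(1.0))                           # 10000.0
print(igm_temperature(4.0) > igm_temperature(1.0))    # True
```

An inverted relation (γ < 1) would flip the comparison: overdense gas would then be cooler than the mean, which is the regime the paper disfavors at over 4σ.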
Energy distributions of plume ions from silver at different angles ablated in vacuum
DEFF Research Database (Denmark)
Christensen, Bo Toftmann; Schou, Jørgen; Canulescu, Stela
A typical pulsed laser deposition (PLD) is carried out at a fluence between 0.5 and 2.5 J/cm2. The ablated particles are largely neutrals at the lowest fluence, but the fraction of ions increases strongly with fluence and accounts for more than 0.5 of the particles at 2.5 J/cm2 [1,2]. Since it may be comparatively difficult to measure the energy and angular distribution of neutrals, measurements of the ionic fraction will be valuable for any modeling of PLD. We have irradiated silver in a vacuum chamber (~10^-7 mbar) with a Nd:YAG laser at a wavelength of 355 nm and made detailed measurements of the time...
International Nuclear Information System (INIS)
The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." Calculations of the WIPP system's overall probability distribution are presented for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features.
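The combinatorics of the scenario construction above (four postulated events or features generating 2^4 = 16 mutually exclusive scenario classes) can be illustrated with a small sketch. The per-event probabilities below are invented placeholders, not values from the WIPP study, and events are assumed independent purely for illustration:

```python
import itertools

# Four postulated events/features combine into 2**4 = 16 mutually exclusive
# scenario classes. Probabilities are invented placeholders for illustration.
events = {
    "boreholes": 0.10,   # attempted boreholes over rooms and drifts
    "mining":    0.05,   # mining alters ground-water regime
    "wells":     0.08,   # water-withdrawal wells provide alternate pathways
    "brine":     0.20,   # brine pocket below room or drift
}

def class_probability(active):
    """Probability of one scenario class, assuming independent events."""
    p = 1.0
    for name, p_event in events.items():
        p *= p_event if name in active else (1.0 - p_event)
    return p

# Enumerate every subset of the event set: each subset is one scenario class.
classes = [frozenset(c)
           for r in range(len(events) + 1)
           for c in itertools.combinations(events, r)]
total = sum(class_probability(c) for c in classes)
print(len(classes), round(total, 12))   # 16 1.0
```

Because the classes are exhaustive and mutually exclusive, their probabilities sum to one, which is the sanity check any such scenario decomposition must pass.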
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; the center of the probability distribution of a random variable; the definition of the law of large numbers; the stability of the sample mean; and the method of moments.
Scheidt, Holger A; Pampel, André; Nissler, Ludwig; Gebhardt, Rolf; Huster, Daniel
2004-05-27
To investigate the structural basis for the antioxidative effects of plant flavonoids on the lipid molecules of cellular membranes, we have studied the location and distribution of five different flavonoid molecules (flavone, chrysin, luteolin, myricetin, and luteolin-7-glucoside) with varying polarity in monounsaturated model membranes. The investigated molecules differed in the number of hydroxyl groups attached to the polyphenolic benzo-gamma-pyrone compounds. To investigate the relation between hydrophobicity and membrane localization/orientation, we have applied (1)H magic angle spinning NMR techniques measuring ring current induced chemical shift changes, nuclear Overhauser enhancement cross-relaxation rates, and lateral diffusion coefficients. All investigated flavonoids show a broad distribution along the membrane normal with a maximum in the lipid/water interface. With increasing number of hydroxyl groups, the maximum of this distribution is biased towards the lipid headgroups. These results are confirmed by pulsed field gradient NMR measurements of the lateral diffusion coefficients of phospholipids and flavonoids, respectively. From the localization of different flavonoid protons in the membrane, a model for the orientation of the molecules in a lipid bilayer can be deduced. This orientation depends on the position of the polar center of the flavonoid molecule. PMID:15157612
Nikolaev, Pavel
2009-01-01
Many applications of single wall carbon nanotubes (SWCNT), especially in microelectronics, will benefit from use of certain (n,m) nanotube types (metallic, small-gap semiconductor, etc.). Especially fascinating is the possibility of quantum conductors that require metallic armchair nanotubes. However, as-produced SWCNT samples are polydisperse, with many (n,m) types present and a typical approx. 1:2 metal/semiconductor ratio. Nanotube nucleation models predict that armchair nuclei are energetically preferential due to formation of partial triple bonds along the armchair edge. However, nuclei cannot reach any meaningful thermal equilibrium in a rapidly expanding and cooling plume of carbon clusters, leading to polydispersity. In the present work, SWCNTs were produced by a pulsed laser vaporization (PLV) technique. The carbon vapor plume cooling rate was either increased by a change in the oven temperature (expansion into colder gas) or decreased via "warm-up" with a laser pulse at the moment of nucleation. The effect of oven temperature and "warm-up" on nanotube type population was studied via photoluminescence, UV-Vis-NIR absorption, and Raman spectroscopy. It was found that reduced temperatures lead to smaller average diameters, progressively narrower diameter distributions, and some preference toward armchair structures. "Warm-up" shifts the nanotube population toward armchair structures as well, but the effect is small. Possible improvement of the "warm-up" approach to produce armchair SWCNTs will be discussed. These results demonstrate that the PLV production technique can provide at least partial control over the nanotube (n,m) population. In addition, these results have implications for understanding the nanotube nucleation mechanism in the laser oven.
International Nuclear Information System (INIS)
A visual study is conducted to determine the effect of geometrical parameters of a two-fluid atomizer on its spray cone angle. The liquid (water) jets exit from six peripheral inclined orifices and are introduced to a high speed gas (air) stream in the gravitational direction. Using a high speed imaging system, the spray cone angle has been determined under constant operational conditions, i.e., Reynolds and Weber numbers, for different nozzle geometries. Also, the droplet sizes (Sauter mean diameter) and their distributions have been determined using a Malvern Master Sizer X. The investigated geometrical parameters are the liquid jet diameter, liquid port angle, and the length of the gas-liquid mixing chamber. The results show that among these parameters, the liquid jet diameter has a significant effect on spray cone angle. In addition, an empirical correlation has been obtained to predict the spray cone angle of the present two-fluid atomizer in terms of nozzle geometries.
Hewson, Alex C.; Bauer, Johannes
2010-01-01
We show that information on the probability density of local fluctuations can be obtained from a numerical renormalisation group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density ρ(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation...
Baiamonte, Giorgio; Singh, Vijay P.
2016-04-01
extended to the case of pervious hillslopes, accounting for infiltration. In particular, an analytical solution for the time of concentration for overland flow on a rectangular plane surface was derived using the kinematic wave equation under Green-Ampt infiltration (Baiamonte and Singh, 2015). The objective of this work is to apply the latter solution to determine the probability distribution of hillslope peak discharge by combining it with the familiar rainfall duration-intensity-frequency approach. References Agnese, C., Baiamonte, G., and Corrao, C. (2001). "A simple model of hillslope response for overland flow generation". Hydrol. Process., 15, 3225-3238, ISSN: 0885-6087, doi: 10.1002/hyp.182. Baiamonte, G., and Agnese, C. (2010). "An analytical solution of kinematic wave equations for overland flow under Green-Ampt infiltration". J. Agr. Eng., vol. 1, p. 41-49, ISSN: 1974-7071. Baiamonte, G., and Singh, V.P. (2015). "Analytical solution of kinematic wave time of concentration for overland flow under Green-Ampt infiltration." J. Hydrol. Eng., ASCE, DOI: 10.1061/(ASCE)HE.1943-5584.0001266. Robinson, J.S., and Sivapalan, M. (1996). "Instantaneous response functions of overland flow and subsurface stormflow for catchment models". Hydrol. Process., 10, 845-862. Singh, V.P. (1976). "Derivation of time of concentration". J. of Hydrol., 30, 147-165. Singh, V.P. (1996). Kinematic-Wave Modeling in Water Resources: Surface-Water Hydrology. John Wiley & Sons, Inc., New York, 1399 pp.
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
Hewson, Alex C; Bauer, Johannes
2010-03-24
We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density ρ(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Gamma probability distribution calculation using Orthogonal Polynomials
Institute of Scientific and Technical Information of China (English)
高钰婧; 宋松柏
2013-01-01
A numerical method for calculating Gamma distribution values is presented. Taking the two-parameter Gamma distribution common in hydrological analysis as an example, and exploiting the high-precision computation available in Mathematica, the paper calculates the recurrence-relation coefficients of the non-classical orthogonal polynomials of the weight function and then computes the hydrological probability distribution. The results show that the method has high accuracy, is a general numerical-integration algorithm for arbitrary weight functions and intervals, and can provide a practical route for hydrological probability calculations.
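As a minimal stand-in for the orthogonal-polynomial moment scheme described above, the two-parameter Gamma CDF can be evaluated by plain numeric quadrature. This sketch uses a trapezoid rule and assumes shape k ≥ 1; the exponential special case k = 1 serves as a built-in check:

```python
import math

def gamma_cdf(x, shape, scale=1.0, n=20000):
    """Two-parameter Gamma CDF via trapezoid quadrature (requires shape >= 1).
    A plain-quadrature stand-in for the orthogonal-polynomial moment method
    of the paper, which targets higher accuracy."""
    if x <= 0.0:
        return 0.0
    norm = math.gamma(shape) * scale ** shape
    f = lambda t: (t ** (shape - 1.0)) * math.exp(-t / scale) / norm
    h = x / n
    s = 0.5 * (f(0.0) + f(x)) + sum(f(i * h) for i in range(1, n))
    return s * h

# Check against the exponential special case: shape = 1 gives CDF(x) = 1 - exp(-x).
print(abs(gamma_cdf(1.0, 1.0) - (1.0 - math.exp(-1.0))) < 1e-6)   # True
```

The quadrature converges like O(h^2), which is adequate for table-style accuracy; the paper's orthogonal-polynomial approach is what buys the much higher precision it reports.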
Heller, William; Qian, Shuo
2012-02-01
Cellular membranes are complex mixtures of lipids, proteins and other small molecules that provide functional, dynamic barriers between the cell and its environment, as well as between environments within the cell. The lipid composition of the membrane is highly specific and controlled in terms of both content and lipid localization. Here, small-angle neutron scattering and selective deuterium labeling were used to probe the impact of the membrane-active peptides melittin and alamethicin on the structure of lipid bilayers composed of a mixture of the lipids dimyristoyl phosphatidylglycerol (DMPG) and chain-perdeuterated dimyristoyl phosphatidylcholine (DMPC). We found that both peptides enriched the outer leaflet of the bilayer with the negatively charged DMPG, creating an asymmetric distribution of lipids. The level of enrichment is peptide concentration-dependent and is stronger for melittin than alamethicin. The enrichment between the inner and outer bilayer leaflets occurs at very low peptide concentrations, and increases with peptide concentration, including when the peptide adopts a membrane-spanning, pore-forming state.
Goerigk, G.; Schweins, R.; Huber, K.; Ballauff, M.
2004-05-01
The distribution of Sr counterions around negatively charged sodium polyacrylate chains (NaPA) in aqueous solution was studied by anomalous small-angle X-ray scattering. Different ratios of the concentrations of SrCl2/[NaPA] reveal dramatic changes in the scattering curves. At the lower ratio the scattering curves indicate a coil-like behavior, while at the higher ratio the scattering curves are contracted to smaller q-values, caused by the collapse of the NaPA coil. The form factor of the scattering contribution of the counterions was separated and analyzed. For the scattering curves of the collapsed chains, this analysis agrees with the model of a pearl necklace, consisting of collapsed sphere-like subdomains which are connected by stretched chain segments. An average pearl radius of 19 nm and a distance between neighbouring pearls close to 60 nm could be established for the collapsed state of the NaPA chains.
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is split into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
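Step (v) above reduces to a per-pixel maximum over raster layers. The sketch below illustrates that combination; the small random arrays are stand-ins for the release, impact, and zonal probability rasters the paper derives from inventory statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 4)                                   # tiny stand-in raster
p_release = rng.uniform(0.0, 0.3, size=shape)    # step (ii): release probability
p_impact  = rng.uniform(0.0, 1.0, size=shape)    # step (iii): angle-of-reach impact
p_zonal   = rng.uniform(0.0, 0.6, size=shape)    # step (iv): zonal release probability

# Step (v): integrated spatial landslide probability, pixel by pixel, as the
# maximum of the release probability and impact times zonal release probability.
p_integrated = np.maximum(p_release, p_impact * p_zonal)

print(p_integrated.shape)                        # (4, 4)
print(bool((p_integrated >= p_release).all()))   # True
```

Taking the maximum guarantees the integrated probability never drops below the pure release probability, matching the definition in the abstract.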
Takemura, Akihiro; Togawa, Kumiko; Yokoi, Tomohiro; Ueda, Shinichi; Noto, Kimiya; Kojima, Hironori; Isomura, Naoki; Kumano, Tomoyasu
2016-07-01
In volumetric modulated arc therapy (VMAT) for prostate cancer, a positional and rotational error correction is performed according to the position and angle of the prostate. The correction often involves body leaning, and there is concern regarding variation in the dose distribution. Our purpose in this study was to evaluate the impact of body pitch rotation on the dose distribution in VMAT. Treatment plans were obtained retrospectively from eight patients with prostate cancer. The body in the computed tomography images for the original VMAT plan was shifted to create VMAT plans with virtual pitch angle errors of ±1.5° and ±3°. Dose distributions for the tilted plans were recalculated with use of the same beam arrangement as that used for the original VMAT plan. The mean value of the maximum dose differences in the dose distributions between the original VMAT plan and the tilted plans was 2.98 ± 0.96 %. The value of the homogeneity index for the planning target volume (PTV) had an increasing trend according to the pitch angle error, and the values of D_95 for the PTV and D_2ml, V_50, V_60, and V_70 for the rectum had decreasing trends. The pitch angle error caused by body leaning had little effect on the dose distribution; in contrast, the pitch angle correction reduced the effects of organ displacement and improved these indexes. Thus, the pitch angle setup error in VMAT for prostate cancer should be corrected. PMID:26873139
Energy Technology Data Exchange (ETDEWEB)
Ezure, Hideo
1988-09-01
Effective combination of measured data with theoretical analysis has permitted deriving a method for more accurately estimating the power distribution in BWRs. A least-squares method is used to combine the relationship between the power distribution and the measured values with the model used in FLARE or in the three-dimensional two-group diffusion code. Trial application of the new method to estimating the power distribution in JPDR-1 has proved the method to provide reliable results.
Directory of Open Access Journals (Sweden)
Luis Vicente Chamorro Marcillo
2013-06-01
Full Text Available Engineering, within its academic and applied forms, as well as any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Managing those tables generally poses physical problems (wasteful transport and consultation) and operational ones (incomplete lists and limited accuracy). The study, "Probability distribution function values in mobile phones", permitted determining, through a needs survey applied to students involved in statistics studies at Universidad de Nariño, that the best-known and most-used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. Similarly, it showed users' interest in having the values in question within an alternative medium to correct, at least in part, the problems presented by "the famous tables". To contribute to the solution, we built software for mobile phones that immediately and dynamically provides the values of the most commonly used probability distribution functions.
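The table-replacement idea is straightforward to sketch: the standard normal CDF follows exactly from the error function, and a Student's t CDF can be computed by quadrature using the density's symmetry about zero. This is a generic sketch of on-device evaluation, not the software built in the study:

```python
import math

def normal_cdf(z):
    """Standard normal CDF, exact via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def t_cdf(x, df, n=20000):
    """Student's t CDF via symmetry: 0.5 + integral of the density from 0 to x."""
    c = math.gamma((df + 1) / 2.0) / (math.sqrt(df * math.pi) * math.gamma(df / 2.0))
    f = lambda t: c * (1.0 + t * t / df) ** (-(df + 1) / 2.0)
    sign, b = (1.0, x) if x >= 0 else (-1.0, -x)
    h = b / n
    s = 0.5 * (f(0.0) + f(b)) + sum(f(i * h) for i in range(1, n))
    return 0.5 + sign * s * h

# df = 1 is the Cauchy distribution, whose CDF is 0.5 + atan(x)/pi, so
# t_cdf(1.0, 1) should be 0.75; normal_cdf(0) is exactly 0.5.
print(abs(t_cdf(1.0, 1) - 0.75) < 1e-6)   # True
print(normal_cdf(0.0))                    # 0.5
```

A few lines like these replace the Chi-Square/Binomial/t/Normal tables with arbitrary-argument evaluation, which is precisely the advantage over printed tables that the study's needs survey identified.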
Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Al Samarai, I.; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Batista, R. Alves; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Baeuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Bluemer, H.; Bohacova, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceicao, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Diaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Hasankiadeh, Q. Dorosti; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Luis, P. Facal San; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipcic, A.; Fox, B. D.; Fratu, O.; Freire, M. 
M.; Froehlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; Garcia, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gomez Berisso, M.; Gomez Vitale, P. F.; Goncalves, P.; Gonzalez, J. G.; Gonzalez, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Horandel, J. R.; Horvath, P.; Hrabovsky, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kaeaepae, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kegl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Kroemer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leao, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopez, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Maris, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martinez Bravo, O.; Martraire, D.; Masias Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Micanovic, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Ragaigne, D. 
Monnier; Montanet, F.; Morello, C.; Mostafa, M.; Moura, C. A.; Muller, M. A.; Mueller, G.; Mueller, S.; Muenchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nozka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pekala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Fernandez, G. Rodriguez; Rodriguez Rojo, J.; Rodriguez-Frias, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Greus, F. Salesa; Salina, G.; Sanchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovanek, P.; Schroeder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Smialkowski, A.; Smida, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijaervi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. 
J.; Toma, G.; Tomankova, L.; Tome, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdes Galicia, J. F.; Valino, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cardenas, B.; Varner, G.; Vazquez, J. R.; Vazquez, R. A.; Veberic, D.; Verzi, V.; Vicha, J.; Videla, M.; Villasenor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczynska, B.; Wilczynski, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.
2015-01-01
We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angle between 60 degrees and 80 degrees. We perform two Rayleigh analyses.
Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, V. M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D’Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Freire, M. 
M.; Fröhlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; García, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. 
A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Müller, S.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pȩkala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. 
A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.
2015-01-01
We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in
Samejima, Masaki; Negoro, Keisuke; Mitsukuni, Koshichiro; Akiyoshi, Masanori
We propose a method for finding business risk factors through qualitative and quantitative hybrid simulation in time series. Because the effect ratios of qualitative arcs in the hybrid simulation vary the output values of the simulation, we define the effect ratios that cause risk as business risk factors. Searching for business risk factors over the entire range of effect ratios is time-consuming. Since the probability distributions of effect ratios in the present time step are similar to those in the previous time step, the distributions in the present time step can be estimated, and our method searches only the estimated ranges. Experimental results show a precision rate and a recall rate of 86%, and a search time reduced by at least 20%.
Leske, R. A.; Cummings, A. C.; Cohen, C. M.; Mewaldt, R. A.; Labrador, A. W.; Stone, E. C.; Wiedenbeck, M. E.; Christian, E. R.; von Rosenvinge, T. T.
2015-12-01
Solar energetic particle (SEP) pitch angle distributions arise from the competing effects of magnetic focusing and scattering as the particles travel through the interplanetary medium, and can therefore indicate interplanetary conditions far from the observer. The STEREO Low Energy Telescopes measure SEP pitch angle distributions for protons, helium, and heavier ions with energies of about 2-12 MeV/nucleon. A wide variety of particle anisotropies was observed in the extreme SEP event of 23 July 2012. At the STEREO-Ahead spacecraft, the solar source of the activity was near central meridian and the pitch angle distribution was initially an outward-flowing beam. High time resolution (1-minute) observations revealed peculiar oscillations in beam width on a timescale of several minutes; such behavior does not seem to have been previously reported in other events. Particle flow became bidirectional while inside a magnetic cloud following a tremendous shock. Particle intensities at the Behind spacecraft, for which the event occurred over the east limb of the Sun, were about 1000 times lower than at Ahead. Pitch angle distributions during the peak of the event show inward-flowing particles that underwent partial mirroring closer to the Sun and formed a distinctive loss-cone distribution (indicating that the magnetic field strength at the mirror point was too small to turn around particles with the smallest pitch angles). We present the observations of this rich variety of anisotropies within a single event, compare with observations in other events, and discuss the implications for SEP transport in the inner heliosphere.
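The loss-cone geometry mentioned in the abstract follows from conservation of the magnetic moment during adiabatic motion; a minimal sketch with hypothetical field values (not taken from the paper):

```python
import math

def loss_cone_angle_deg(b_local, b_max):
    """Half-angle of the loss cone seen at a point where the field is
    b_local, given a maximum field b_max along the field line.
    Conservation of sin^2(alpha)/B means a particle with pitch angle
    alpha mirrors where B = b_local / sin^2(alpha); pitch angles below
    the loss-cone angle never meet that condition and are not turned
    around."""
    return math.degrees(math.asin(math.sqrt(b_local / b_max)))

# Hypothetical values: mirror-point field four times the local field
alpha_lc = loss_cone_angle_deg(1.0, 4.0)   # sin^2(alpha_lc) = 1/4
```

Particles with pitch angles below `alpha_lc` escape rather than mirror, carving out the empty cone in the observed distribution.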
Institute of Scientific and Technical Information of China (English)
ZHENG Guizhen; JIANG Xiulan; HAN Shuzong
2004-01-01
The joint distribution of the wave heights and periods of individual waves is usually approximated by the joint distribution of apparent wave heights and periods, but there is a difference between them. This difference is addressed, and the theoretical joint distributions of apparent wave heights and periods due to Longuet-Higgins and Sun are modified to give more reasonable representations of the joint distribution of the wave heights and periods of individual waves. The modification overcomes an inherent drawback of these joint PDFs, namely that the mean wave period is infinite. A comparison with the field data of Goda shows that the new formulae agree with the measurements better than their original counterparts.
Institute of Scientific and Technical Information of China (English)
Xiao Bingjia; Shinichiro Kado; Shin Kajita; Daisuge Yamasaki; Satoru Tanaka
2005-01-01
A novel fitting procedure is proposed for a better determination of the H2 rovibrational distribution from Fulcher-α band spectroscopy. We have recalculated the transition probabilities, and the results show that they deviate from the Franck-Condon approximation, especially for the non-diagonal transitions. We also calculated complete sets of vibrationally resolved cross-sections for the electron-impact d3Πu-X3Σg transition based on the semi-classical Gryzinski theory. An experimental example confirms that the current approach provides a tool for better diagnostics of the H2 rovibrational distribution in the electronic ground state.
Grunbaum, B W; Selvin, S; Pace, N; Black, D M
1978-07-01
Fresh blood samples were obtained from 6004 whites, 1025 blacks, 1596 Chicano/Amerindians, and 3053 Asians of California and Hawaii. The samples were typed for ABO and Rh groups and were analyzed electrophoretically for ten genetically determined protein variant systems. The effects of race, age, and sex on phenotypic frequencies within each of the twelve genetic systems were investigated. Large frequency differences were found between races but not between different age and sex subgroups within races. It was also demonstrated that the twelve genetic systems behaved statistically independently. Discrimination probabilities were computed for each of the four ethnic groups. These serve as a measure of the effectiveness of the twelve genetic systems examined in individualizing blood samples. The method is discussed for computing the probability that a randomly chosen individual of a given ethnic group possesses the same blood phenotypes as found in a predetermined sample of blood. The results presented here should prove useful in the investigation of civil and criminal cases involving blood samples.
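The discrimination probability described here follows directly from the independence result: under independence, the probability that two random individuals match in every system is the product, over systems, of the sum of squared phenotype frequencies. A minimal sketch with made-up frequencies (the paper's actual frequency tables are not reproduced here):

```python
def match_probability(system_freqs):
    """Probability that two random individuals share phenotypes in every
    system, given per-system phenotype frequencies and statistical
    independence between systems (as found for the twelve systems)."""
    prob = 1.0
    for freqs in system_freqs:
        prob *= sum(p * p for p in freqs)   # per-system match probability
    return prob

# Made-up phenotype frequencies for three hypothetical systems
systems = [
    [0.45, 0.40, 0.11, 0.04],
    [0.85, 0.15],
    [0.50, 0.35, 0.15],
]
p_match = match_probability(systems)
discrimination = 1.0 - p_match   # discrimination probability
```

Each additional independent system multiplies the match probability by a factor below one, which is why a battery of twelve systems individualizes a sample far better than any single system.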
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
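For the multinomial logit special case, the CPGF is the familiar log-sum-exp of the utilities, and its gradient recovers the choice probabilities; this stated gradient property can be checked numerically (a sketch of the special case, not the paper's general construction):

```python
import math

def cpgf(u):
    """Multinomial-logit CPGF: the log-sum-exp of the utilities."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs(u):
    """Softmax: the MNL choice probabilities."""
    m = max(u)
    e = [math.exp(x - m) for x in u]
    s = sum(e)
    return [x / s for x in e]

u = [1.0, 0.5, -0.2]
p = choice_probs(u)

# The gradient of the CPGF should equal the choice probabilities;
# check by central finite differences.
h = 1e-6
grad = []
for i in range(len(u)):
    up, um = u[:], u[:]
    up[i] += h
    um[i] -= h
    grad.append((cpgf(up) - cpgf(um)) / (2 * h))
```

The finite-difference gradient agrees with the softmax probabilities to numerical precision, illustrating the CPGF characterization for the logit ARUM.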
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne
Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.
2014-01-01
Crayfishes and other freshwater aquatic fauna are particularly at risk globally due to anthropogenic demand, manipulation and exploitation of freshwater resources and yet are often understudied. The Ozark faunal region of Missouri and Arkansas harbours a high level of aquatic biological diversity, especially in regard to endemic crayfishes. Three such endemics, Orconectes eupunctus, Orconectes marchandi and Cambarus hubbsi, are threatened by limited natural distribution and the invasions of Orconectes neglectus.
Turban, L
2016-01-01
The probability distribution of the number $s$ of distinct sites visited up to time $t$ by a random walk on the fully-connected lattice with $N$ sites is first obtained by solving the eigenvalue problem associated with the discrete master equation. Then, using generating function techniques, we compute the joint probability distribution of $s$ and $r$, where $r$ is the number of sites visited only once up to time $t$. Mean values, variances and covariance are deduced from the generating functions and their finite-size-scaling behaviour is studied. Introducing properly centered and scaled variables $u$ and $v$ for $r$ and $s$ and working in the scaling limit ($t\\to\\infty$, $N\\to\\infty$ with $w=t/N$ fixed) the joint probability density of $u$ and $v$ is shown to be a bivariate Gaussian density. It follows that the fluctuations of $r$ and $s$ around their mean values in a finite-size system are Gaussian in the scaling limit. The same type of finite-size scaling is expected to hold on periodic lattices above the ...
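A quick Monte Carlo check of the mean number of distinct sites visited: under the simplifying assumption that each step lands on a uniformly random site, E[s] = N(1-(1-1/N)^t), which simulation reproduces (a sketch; the paper's master-equation and generating-function treatment is more general):

```python
import random

random.seed(42)

def simulate_distinct_sites(N, t, trials):
    """Monte Carlo mean of s, the number of distinct sites hit by t
    uniform jumps on a fully connected lattice of N sites (simplified
    model: each step lands on a uniformly random site)."""
    total = 0
    for _ in range(trials):
        visited = set()
        for _ in range(t):
            visited.add(random.randrange(N))
        total += len(visited)
    return total / trials

N, t = 50, 100
est = simulate_distinct_sites(N, t, 4000)
exact = N * (1.0 - (1.0 - 1.0 / N) ** t)   # E[s] for uniform jumps
```

With w = t/N fixed and N large, this mean scales as N(1 - e^(-w)), consistent with the scaling limit discussed in the abstract.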
Chaouki Ben Issaid
2016-06-01
The Gamma-Gamma distribution has recently emerged in a number of applications, ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter, essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This is the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a desirable property characterizing the robustness of importance sampling schemes, for both the independent identically distributed and the independent non-identically distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via selected numerical simulations.
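The left-tail difficulty and the importance-sampling remedy can be illustrated with a simplified scale-change sampler (not the authors' exact mean-shift scheme; all parameters below are illustrative):

```python
import math, random

random.seed(7)

def gg_left_tail_is(L, a, b, gamma, s, n):
    """Importance-sampling estimate of P(sum of L Gamma-Gamma variates
    < gamma).  Each variate is A*B with A ~ Gamma(a, scale 1/a) and
    B ~ Gamma(b, scale 1/b), so E[A*B] = 1.  The proposal shrinks both
    scales by s < 1, pushing mass into the left tail; for a scale
    change theta -> s*theta, the likelihood ratio of one Gamma(k) draw
    x is s**k * exp(x * (1/(s*theta) - 1/theta))."""
    acc = 0.0
    for _ in range(n):
        total, logw = 0.0, 0.0
        for _ in range(L):
            A = random.gammavariate(a, s / a)    # proposal scale s/a
            B = random.gammavariate(b, s / b)
            logw += a * math.log(s) + A * (a / s - a)
            logw += b * math.log(s) + B * (b / s - b)
            total += A * B
        if total < gamma:
            acc += math.exp(logw)
    return acc / n

# Illustrative parameters: 2-branch combining, shape (2, 2) fading
p_hat = gg_left_tail_is(L=2, a=2.0, b=2.0, gamma=0.05, s=0.1, n=20000)
```

Naive Monte Carlo would need on the order of 1/p samples to see even one event in the target region; here the proposal makes the rare event frequent and the likelihood-ratio weights restore unbiasedness. The paper's mean-shift construction additionally guarantees bounded relative error, which this simplified sketch does not claim.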
The Derivation of the Probability Density Function of the t Distribution%t 分布概率密度的分析
Institute of Scientific and Technical Information of China (English)
彭定忠; 张映辉; 刘朝才
2012-01-01
The t distribution is one of the three important distributions most widely used in mathematical statistics. Most textbooks either omit the derivation of its probability density function or derive it only by the direct method. In this paper, the transform method is used for the derivation, which simplifies the operations and reduces the computational difficulty.
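The transform-method derivation the abstract alludes to can be sketched as follows (a standard change-of-variables argument, stated for reference rather than quoted from the paper): with $Z\sim N(0,1)$ and $V\sim\chi^2_n$ independent, set $T=Z/\sqrt{V/n}$ and $W=V$, then integrate out $w$:

```latex
\begin{align*}
f_T(t) &= \int_0^\infty f_Z\!\left(t\sqrt{w/n}\right) f_V(w)\,\sqrt{w/n}\,\mathrm{d}w
  && \text{(Jacobian: } \partial z/\partial t = \sqrt{w/n}\text{)}\\
&= \int_0^\infty \frac{1}{\sqrt{2\pi}}\, e^{-t^2 w/(2n)}\,
   \frac{w^{n/2-1} e^{-w/2}}{2^{n/2}\,\Gamma(n/2)}\,\sqrt{\frac{w}{n}}\,\mathrm{d}w\\
&= \frac{\Gamma\!\left(\tfrac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\!\left(\tfrac{n}{2}\right)}
   \left(1+\frac{t^2}{n}\right)^{-\frac{n+1}{2}},
\end{align*}
```

where the last step uses the Gamma integral $\int_0^\infty w^{(n+1)/2-1} e^{-\lambda w}\,\mathrm{d}w = \Gamma\!\left(\tfrac{n+1}{2}\right)\lambda^{-(n+1)/2}$ with $\lambda=\tfrac12\left(1+t^2/n\right)$.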
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Directory of Open Access Journals (Sweden)
Bora Tas
2016-01-01
To investigate the dose-volume variations of planning target volume (PTV) and organs at risk (OARs) in eleven prostate cancer patients planned with single and double arc volumetric modulated arc therapy (VMAT) when varying collimator angle. Single and double arc VMAT treatment plans were created using Monaco5.0® with the collimator angle set to 0°. All plans were normalized to deliver 7600 cGy to 95% of the clinical target volume (CTV). The single arc VMAT plans were reoptimized with different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, and 90°), and the double arc VMAT plans with collimator angle pairs (0°-0°, 15°-345°, 30°-330°, 45°-315°, 60°-300°, 75°-285°, 90°-270°), using the same optimization parameters. For the comparison, the heterogeneity index (HI), dose-volume histogram and minimum dose to 95% of the PTV volume (D95 PTV) were calculated and analyzed. The best plans were verified using the two-dimensional ion chamber array IBA Matrixx® and the three-dimensional IBA Compass® program. The comparison between calculation and measurement was made by γ-index (3%/3 mm) analysis. A higher D95 (PTV) was found for single arc VMAT with a 15° collimator angle and, for double arc VMAT, with 60°-300° and 75°-285° collimator angles. However, lower rectum doses were obtained for 75°-285° collimator angles. There was no significant dose difference for the other OARs, the bladder and femoral heads. When we compared single and double arc VMAT D95 (PTV), we determined 2.44% higher coverage and lower HI with double arc VMAT. All plans passed the γ-index (3%/3 mm) analysis with more than 97% of the points, and we had an average γ-index of 0.36 for CTV and 0.32 for PTV with double arc VMAT. These results were statistically significant by the Wilcoxon signed rank test. The results show that dose coverage of the target and the OAR doses also depend significantly on the collimator angles due to the geometry of the target and OARs. Based on the results we have decided to plan prostate
Tas, Bora; Bilge, Hatice; Ozturk, Sibel Tokdemir
2016-01-01
To investigate the dose-volume variations of planning target volume (PTV) and organs at risk (OARs) in eleven prostate cancer patients planned with single and double arc volumetric modulated arc therapy (VMAT) when varying collimator angle. Single and double arc VMAT treatment plans were created using Monaco5.0® with the collimator angle set to 0°. All plans were normalized to deliver 7600 cGy to 95% of the clinical target volume (CTV). The single arc VMAT plans were reoptimized with different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, and 90°), and the double arc VMAT plans with collimator angle pairs (0°-0°, 15°-345°, 30°-330°, 45°-315°, 60°-300°, 75°-285°, 90°-270°), using the same optimization parameters. For the comparison, the heterogeneity index (HI), dose-volume histogram and minimum dose to 95% of the PTV volume (D95 PTV) were calculated and analyzed. The best plans were verified using the two-dimensional ion chamber array IBA Matrixx® and the three-dimensional IBA Compass® program. The comparison between calculation and measurement was made by γ-index (3%/3 mm) analysis. A higher D95 (PTV) was found for single arc VMAT with a 15° collimator angle and, for double arc VMAT, with 60°-300° and 75°-285° collimator angles. However, lower rectum doses were obtained for 75°-285° collimator angles. There was no significant dose difference for the other OARs, the bladder and femoral heads. When we compared single and double arc VMAT D95 (PTV), we determined 2.44% higher coverage and lower HI with double arc VMAT. All plans passed the γ-index (3%/3 mm) analysis with more than 97% of the points, and we had an average γ-index of 0.36 for CTV and 0.32 for PTV with double arc VMAT. These results were statistically significant by the Wilcoxon signed rank test. The results show that dose coverage of the target and the OAR doses also depend significantly on the collimator angles due to the geometry of the target and OARs. Based on the results we have decided to plan prostate cancer patients in our
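The γ-index criterion used for plan verification combines a dose tolerance with a distance-to-agreement; a minimal one-dimensional global implementation (a sketch of the concept, not the IBA Compass® algorithm) is:

```python
def gamma_index(ref, evl, dx, dose_tol=0.03, dist_tol=3.0):
    """Per-point global gamma values for two dose profiles sampled on a
    common grid with spacing dx (mm); dose differences are normalised
    to the reference maximum (3%/3 mm by default)."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dd = (de - dr) / (dose_tol * d_max)     # dose axis
            dist = (j - i) * dx / dist_tol          # distance axis
            best = min(best, (dd * dd + dist * dist) ** 0.5)
        gammas.append(best)
    return gammas

# Identical profiles should give gamma = 0 everywhere (100% pass rate)
profile = [1.0, 2.0, 3.0, 2.0, 1.0]
g = gamma_index(profile, profile, dx=1.0)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)
```

A point passes when its γ value is at most 1; the "more than 97% of the points" figure in the abstract is the fraction of points meeting that criterion.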
DEFF Research Database (Denmark)
Dimitrov, Nikolay Krasimirov
2016-01-01
We have tested the performance of statistical extrapolation methods in predicting the extreme response of a multi-megawatt wind turbine generator. We have applied the peaks-over-threshold, block maxima and average conditional exceedance rates (ACER) methods for peaks extraction, combined with four...... levels, based on the assumption that the response tail is asymptotically Gumbel distributed. Example analyses were carried out, aimed at comparing the different methods, analysing the statistical uncertainties and identifying the factors, which are critical to the accuracy and reliability...
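A Gumbel-based extrapolation of block maxima can be sketched as follows (a method-of-moments fit on synthetic Gaussian response data; the study's actual maximum-likelihood fits and the ACER method are more elaborate):

```python
import math, random

random.seed(1)
EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(maxima):
    """Method-of-moments Gumbel fit: scale from the sample standard
    deviation, location from the mean (a simple stand-in for the
    maximum-likelihood fits used in practice)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, p_exceed):
    """Level exceeded by a block maximum with probability p_exceed."""
    return mu - beta * math.log(-math.log(1.0 - p_exceed))

# Synthetic "response": maxima of 500 blocks of 100 Gaussian samples
maxima = [max(random.gauss(0.0, 1.0) for _ in range(100)) for _ in range(500)]
mu, beta = fit_gumbel_moments(maxima)
x50 = return_level(mu, beta, 0.02)   # 1-in-50-blocks extreme level
```

The extrapolated level `x50` lies well beyond the fitted location parameter, which is the essence of predicting extreme response levels from a limited set of simulated maxima.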
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...
Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.
2016-01-01
Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to greater than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.
Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.
2016-02-01
Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.
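The risk metric can be illustrated with a toy calculation: model annual water availability per region with a Gamma distribution, take the probability that availability falls below a demand threshold, and weight by population. All numbers and the shape parameter below are hypothetical, not the study's:

```python
import math

def gamma_cdf(x, k, theta):
    """Regularised lower incomplete gamma via its series expansion
    (adequate for modest shape k; avoids external dependencies)."""
    z = x / theta
    if z <= 0.0:
        return 0.0
    term = 1.0 / k          # n = 0 term of sum_n z^n / (k (k+1) ... (k+n))
    s = term
    for n in range(1, 500):
        term *= z / (k + n)
        s += term
        if term < 1e-15 * s:
            break
    return s * math.exp(-z + k * math.log(z) - math.lgamma(k))

# Hypothetical regions: (population, mean annual availability, demand threshold)
regions = [(5e6, 1200.0, 500.0), (2e7, 800.0, 500.0), (1e6, 2000.0, 500.0)]
k = 4.0                      # assumed Gamma shape for availability
expected_exposed = 0.0
for pop, mean_avail, demand in regions:
    theta = mean_avail / k   # scale chosen so the Gamma mean matches
    expected_exposed += pop * gamma_cdf(demand, k, theta)
```

Because exposure is a probability-weighted expectation rather than a hard cutoff, the result varies smoothly with the threshold, which is why the risk-based metric is less sensitive to the choice of a fixed scarcity threshold.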
Energy Technology Data Exchange (ETDEWEB)
Agosteo, S. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy); Colautti, P., E-mail: paolo.colautti@lnl.infn.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Esposito, J., E-mail: juan.esposito@tin.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Fazzi, A.; Introini, M.V.; Pola, A. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy)
2011-12-15
Neutron energy spectra at different emission angles, between 0° and 120°, from the Be(p,xn) reaction generated by a beryllium thick target bombarded with 5 MeV protons have been measured at the Legnaro Laboratories (LNL) of the Italian National Institute for Nuclear Physics research (INFN). A new and quite compact recoil-proton spectrometer, based on a monolithic silicon telescope coupled to a polyethylene converter, was used as an efficient alternative to the traditional Time-of-Flight (TOF) technique. The measured recoil-proton distributions were processed through an iterative unfolding algorithm in order to determine the neutron energy spectra at all the angles considered. The neutron energy spectrum measured at 0° was found to be in good agreement with the only one so far available at this energy, measured years ago with the TOF technique. Moreover, the results obtained at the other emission angles were consistent with detailed past measurements performed with 4 MeV protons at the same angles by TOF techniques.
Fractal dimension and unscreened angles measured for radial viscous fingering.
Praud, Olivier; Swinney, Harry L
2005-07-01
We have examined fractal patterns formed by the injection of air into oil in a thin (0.127 mm) layer contained between two cylindrical glass plates of 288 mm diameter (a Hele-Shaw cell), for pressure differences in the range 0.25 DLA) clusters. We have also measured the probability distribution of unscreened angles. At late times, the distribution approaches a universal (i.e., forcing- and size-independent) asymptotic form with mean 145° and standard deviation 36°. These results indicate that the distribution function for the unscreened angle is an invariant property of the growth process. PMID:16089960
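The fractal dimension of such patterns is typically estimated by box counting; a minimal sketch of the estimator, verified on a filled square (which should give dimension close to 2):

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2-D point set as the
    least-squares slope of log N(eps) against log(1/eps), where N(eps)
    is the number of eps-sized boxes containing at least one point."""
    samples = []
    for eps in sizes:
        boxes = {(int(x / eps), int(y / eps)) for x, y in points}
        samples.append((math.log(1.0 / eps), math.log(len(boxes))))
    n = len(samples)
    mx = sum(a for a, _ in samples) / n
    my = sum(b for _, b in samples) / n
    num = sum((a - mx) * (b - my) for a, b in samples)
    den = sum((a - mx) ** 2 for a, _ in samples)
    return num / den

# Sanity check on a filled unit square: dimension should be close to 2
pts = [(i / 100.0, j / 100.0) for i in range(100) for j in range(100)]
dim = box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625])
```

Applied to digitized fingering patterns (rather than this synthetic square), the same slope estimate yields the fractal dimension reported for DLA-like growth.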
Institute of Scientific and Technical Information of China (English)
SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan
2011-01-01
To address the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. First, the probability distribution of the von Mises stress of a thin-walled structure under random loadings was studied; the analysis suggests that the probability density function of the von Mises stress process accords approximately with a two-parameter Weibull distribution, and formulas for calculating the Weibull parameters are given. Based on Miner's linear theory, a method for predicting the random sonic fatigue life based on the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated using the constructed model; the calculated results were then modified to account for the influence of the wide frequency band. Comparative analysis shows that the sonic fatigue estimates for the combustor liner structure obtained using the Weibull distribution of the von Mises stress are somewhat more conservative than those using the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
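The life-prediction recipe (a Weibull density for stress peaks combined with Miner's rule and a Basquin S-N curve) can be sketched as follows; all parameter values are illustrative, not the paper's:

```python
import math

def sonic_fatigue_life(nu0, beta, eta, C, m, ds=0.5):
    """Expected fatigue life (s) from Miner's rule:
    damage rate = nu0 * integral p(S)/N(S) dS, with a two-parameter
    Weibull density p(S) for stress peaks and a Basquin S-N curve
    N(S) = C * S**(-m).  nu0 is the stress-peak rate (1/s)."""
    damage_rate = 0.0
    s = ds / 2.0
    while s < 10.0 * eta:            # integrate well into the tail
        pdf = (beta / eta) * (s / eta) ** (beta - 1.0) \
              * math.exp(-((s / eta) ** beta))
        cycles_to_failure = C * s ** (-m)
        damage_rate += nu0 * pdf / cycles_to_failure * ds
        s += ds
    return 1.0 / damage_rate

# Illustrative values (not from the paper): 100 peaks/s,
# Weibull(beta=2, eta=40 MPa), N(S) = 1e12 * S^-3 with S in MPa
life_seconds = sonic_fatigue_life(100.0, 2.0, 40.0, 1e12, 3.0)
```

Because the Basquin exponent weights the tail of the stress density heavily, a Weibull fit with a fatter tail than the Dirlik approximation produces the more conservative life estimates noted in the comparison.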
Energy Technology Data Exchange (ETDEWEB)
Hodel, Jerome [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Hopital Henri Mondor, Creteil (France); Silvera, Jonathan [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Bekaert, Olivier; Decq, Philippe [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neurosurgery, Creteil (France); Rahmouni, Alain [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Radiology, Creteil (France); Bastuji-Garin, Sylvie [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Public Health, Creteil (France); Vignaud, Alexandre [Siemens Healthcare, Saint Denis (France); Petit, Eric; Durning, Bruno [Laboratoire Images Signaux et Systemes Intelligents, UPEC, Creteil (France)
2011-02-15
To assess the three-dimensional turbo spin echo with variable flip-angle distribution magnetic resonance sequence (SPACE: Sampling Perfection with Application optimised Contrast using different flip-angle Evolution) for the imaging of intracranial cerebrospinal fluid (CSF) spaces. We prospectively investigated 18 healthy volunteers and 25 patients, 20 with communicating hydrocephalus (CH) and five with non-communicating hydrocephalus (NCH), using the SPACE sequence at 1.5T. Volume rendering views of both intracranial and ventricular CSF were obtained for all patients and volunteers. The subarachnoid CSF distribution was qualitatively evaluated on volume rendering views using a four-point scale. The CSF volumes within total, ventricular and subarachnoid spaces were calculated, as well as the ratio between ventricular and subarachnoid CSF volumes. Three different patterns of subarachnoid CSF distribution were observed. In healthy volunteers we found narrowed CSF spaces within the occipital area. A diffuse narrowing of the subarachnoid CSF spaces was observed in patients with NCH, whereas patients with CH exhibited narrowed CSF spaces within the high midline convexity. The ratios between ventricular and subarachnoid CSF volumes were significantly different among the volunteers, patients with CH and patients with NCH. The assessment of CSF space volume and distribution may help to characterise hydrocephalus. (orig.)
Zou, Z.; Ni, B.; Gu, X.; Zhao, Z.; Zhou, C.
2015-12-01
Fifteen months of pitch-angle-resolved Van Allen Probes Relativistic Electron-Proton Telescope (REPT) measurements of differential electron flux are analyzed to investigate the characteristics of the pitch angle distribution of radiation belt ultrarelativistic (>2 MeV) electrons during storm conditions and during the long post-storm decay. By modeling the ultrarelativistic electron pitch angle distribution as sin^n(α_eq), where α_eq is the equatorial pitch angle, we examine the spatiotemporal variations of the n value. The results show that, in general, n values increase with the level of geomagnetic activity. In principle the ultrarelativistic electrons respond to geomagnetic storms by becoming peaked at 90° pitch angle, with n values of 2-3, a supportive signature of chorus acceleration outside the plasmasphere. High n values also exist inside the plasmasphere, localized adjacent to the plasmapause and energy dependent, which suggests a significant contribution from electromagnetic ion cyclotron (EMIC) wave scattering. During quiet periods, n values generally evolve to become small, i.e., 0-1. The slow, long-term decays of the ultrarelativistic electrons after geomagnetic storms, while prominent, produce energy- and L-shell-dependent decay time scales in association with solar and geomagnetic activity and wave-particle interaction processes. At lower L shells inside the plasmasphere, the decay time scales for electrons at REPT energies are generally larger, varying from tens of days to hundreds of days, which can be mainly attributed to the combined effect of hiss-induced pitch angle scattering and inward radial diffusion. As L shell increases to L ~ 3.5, a narrow region exists (with a width of ~0.5 L) where the observed ultrarelativistic electrons decay fastest, possibly resulting from efficient EMIC wave scattering. As L shell continues to increase, the decay time scale generally becomes larger again, indicating an overall slower loss process by waves at high L shells. Our investigation based
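The sin^n pitch angle model above can be fitted by log-linear least squares, since log j = log j0 + n·log sin(α). The function below is a minimal sketch, with synthetic data standing in for REPT measurements:

```python
import math

def fit_pad_exponent(alphas_deg, fluxes):
    """Least-squares fit of the exponent n in j(a) = j0 * sin(a)**n,
    done in log space: log j = log j0 + n * log(sin a)."""
    xs = [math.log(math.sin(math.radians(a))) for a in alphas_deg]
    ys = [math.log(j) for j in fluxes]
    n_pts = len(xs)
    mx = sum(xs) / n_pts
    my = sum(ys) / n_pts
    # slope of the log-log regression line = the PAD exponent n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Applied to flux values generated from sin^2.5(α), the fit recovers n = 2.5; on real data the scatter about the regression line indicates how well the power-law form describes the PAD.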
Li, Xin; Hong, Kunlun; Liu, Yun; Shew, Chwen-Yang; Liu, Emily; Herwig, Kenneth W.; Smith, Gregory S.; Zhao, Junpeng; Zhang, Guangzhao; Pispas, Stergios; Chen, Wei-Ren
2010-10-01
We develop an experimental approach to analyze the water distribution around a core-shell micelle formed by polystyrene-block-poly[styrene-g-poly(ethylene oxide) (PEO)] block copolymers in aqueous media at a fixed polymer concentration of 10 mg/ml, through a contrast-variation small angle neutron scattering (SANS) study. By varying the D2O/H2O ratio, the scattering contributions from the water molecules and the micellar constituent components can be determined. Based on the commonly used core-shell model, a theoretical coherent scattering cross section incorporating the effect of water penetration is developed and used to analyze the SANS intensity I(Q). We have successfully quantified the intramicellar water distribution and found that the overall micellar hydration level increases with the molecular weight of the hydrophilic PEO side chains. Our work presents a practical experimental means for evaluating the intramacromolecular solvent distributions of general soft matter systems.
Kim, C S; Lü, C D; Morozumi, T; Kim, Yeong Gyun; Lu, Cai-Dian; Morozumi, Takuya
2000-01-01
We present the angular distribution of the rare B decay $B \to K^* (\to K \pi) \ell^+ \ell^-$. In the low invariant mass region of the dileptons, we can probe new physics effects efficiently. In particular, this distribution is found to be quite sensitive to the ratio of the contributions from two independent magnetic moment operators, which also contribute to $B \to K^* \gamma$. Therefore, our method can be very useful when new physics is introduced without changing the total decay rate of $b \to s \gamma$. The angular distributions are compared with the predictions of the standard model, and are shown for the cases when the aforementioned ratio differs from the standard model prediction.
Tremblin, P; Minier, V; Didelon, P; Hill, T; Anderson, L D; Motte, F; Zavagno, A; André, Ph; Arzoumanian, D; Audit, E; Benedettini, M; Bontemps, S; Csengeri, T; Di Francesco, J; Giannini, T; Hennemann, M; Luong, Q Nguyen; Marston, A P; Peretto, N; Rivera-Ingraham, A; Russeil, D; Rygl, K L J; Spinoglio, L; White, G J
2014-01-01
Ionization feedback should impact the probability distribution function (PDF) of the column density around the ionized gas. We aim to quantify this effect and discuss its potential link to the Core and Initial Mass Function (CMF/IMF). We used, in a systematic way, Herschel column density maps of several regions observed within the HOBYS key program: M16, the Rosette and Vela C molecular clouds, and the RCW 120 H ii region. We fitted the column density PDFs of all clouds with two lognormal distributions, since they present a double-peaked or enlarged shape in the PDF. Our interpretation is that the lower part of the column density distribution describes the turbulent molecular gas, while the second peak corresponds to a compression zone induced by the expansion of the ionized gas into the turbulent molecular cloud. The condensations at the edge of the ionized gas have a steep compressed radial profile, sometimes recognizable in the flattening of the power-law tail. This could lead to an unambiguous criterion able t...
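The double-peaked fitting described above can be sketched with a two-component lognormal mixture, which is bimodal when the components (turbulent gas and compressed zone) are well separated. The parameter values below are illustrative, not fitted to Herschel data:

```python
import math

def lognormal_pdf(x, mu, sigma):
    """Lognormal density, e.g. in the normalized column density."""
    return (math.exp(-((math.log(x) - mu) ** 2) / (2.0 * sigma ** 2))
            / (x * sigma * math.sqrt(2.0 * math.pi)))

def two_lognormal_pdf(x, w, mu1, s1, mu2, s2):
    """Mixture of two lognormals: turbulent gas plus a compressed component."""
    return w * lognormal_pdf(x, mu1, s1) + (1.0 - w) * lognormal_pdf(x, mu2, s2)

def count_local_maxima(f, xs):
    """Count interior local maxima of f sampled on the grid xs,
    i.e. detect whether a PDF is single- or double-peaked."""
    vals = [f(x) for x in xs]
    return sum(1 for i in range(1, len(vals) - 1)
               if vals[i] > vals[i - 1] and vals[i] > vals[i + 1])
```

On a logarithmic grid, a mixture with well-separated means shows two peaks, while a single lognormal shows one, mirroring the "double-peak or enlarged shape" diagnostic.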
Hughes, Annie; Schinnerer, Eva; Colombo, Dario; Pety, Jerome; Leroy, Adam K; Dobbs, Clare L; Garcia-Burillo, Santiago; Thompson, Todd A; Dumas, Gaelle; Schuster, Karl F; Kramer, Carsten
2013-01-01
We analyse the distribution of CO brightness temperature and integrated intensity in M51 at ~40 pc resolution using new CO data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field, which covers the inner 11 x 7 kpc of M51. We find variations in the shape of CO PDFs within different M51 environments, and between M51 and M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover 1 to 2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities. However, the CO PDFs for different dynamical environments within the PAWS field depart from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO...
RANDOM VARIABLE WITH FUZZY PROBABILITY
Institute of Scientific and Technical Information of China (English)
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy number set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), and the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem with the closure operation of fuzzy probability was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All of the mathematical descriptions of the RVFP have the closure property for fuzzy probability; as a result, the foundation is laid for perfecting fuzzy probability operation methods.
Energy Technology Data Exchange (ETDEWEB)
Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br
2009-07-01
This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. We simulated short circuits to ground in 62 lines at 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage was monitored at the 380 V bus of an industrial consumer sensitive to such sags. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A stochastic Monte Carlo simulation (SMC) study was performed to obtain, at each level of DG, the sag probability curves and the probability density per voltage class. From these curves, the average number of sags per class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in Java. Data processing was done using MATLAB.
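The Monte Carlo estimation of annual sag counts per voltage class can be sketched as below. The residual-voltage model, class boundaries and annual fault rate are invented placeholders for illustration, not values from the Goiania study:

```python
import random

def simulate_sag_classes(n_trials, annual_faults, seed=1):
    """Monte Carlo sketch: draw a residual voltage (p.u.) for a random
    fault and count sags per magnitude class, scaled to events/year."""
    rng = random.Random(seed)
    classes = {"severe (<0.4 pu)": 0, "moderate (0.4-0.7 pu)": 0,
               "shallow (0.7-0.9 pu)": 0}
    for _ in range(n_trials):
        # hypothetical residual-voltage model: sag depth depends on a
        # random electrical distance between the fault and the monitored bus
        distance = rng.random()            # 0 = at the bus, 1 = remote
        v_residual = 0.2 + 0.8 * distance  # crude linear attenuation
        if v_residual < 0.9:               # below 0.9 p.u. counts as a sag
            if v_residual < 0.4:
                classes["severe (<0.4 pu)"] += 1
            elif v_residual < 0.7:
                classes["moderate (0.4-0.7 pu)"] += 1
            else:
                classes["shallow (0.7-0.9 pu)"] += 1
    # scale trial counts to expected sags per year
    return {k: annual_faults * v / n_trials for k, v in classes.items()}
```

In a real study the residual voltage would come from the fault simulator (here, ANAFAS) rather than a toy formula, but the binning and annual scaling work the same way.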
Energy Technology Data Exchange (ETDEWEB)
Mantsyzov, Alexey B. [M.V. Lomonosov Moscow State University, Faculty of Fundamental Medicine (Russian Federation); Shen, Yang; Lee, Jung Ho [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Hummer, Gerhard [Max Planck Institute of Biophysics (Germany); Bax, Ad, E-mail: bax@nih.gov [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)
2015-09-15
MERA (Maximum Entropy Ramachandran map Analysis from NMR data) is a new webserver that generates residue-by-residue Ramachandran map distributions for disordered proteins or disordered regions in proteins on the basis of experimental NMR parameters. As input data, the program currently utilizes up to 12 different parameters. These include three different types of short-range NOEs, three types of backbone chemical shifts ({sup 15}N, {sup 13}C{sup α}, and {sup 13}C′), six types of J couplings ({sup 3}J{sub HNHα}, {sup 3}J{sub C′C′}, {sup 3}J{sub C′Hα}, {sup 1}J{sub HαCα}, {sup 2}J{sub CαN} and {sup 1}J{sub CαN}), as well as the {sup 15}N-relaxation derived J(0) spectral density. The Ramachandran map distributions are reported in terms of populations of their 15° × 15° voxels, and an adjustable maximum entropy weight factor is available to ensure that the obtained distributions will not deviate more from a newly derived coil library distribution than required to account for the experimental data. MERA output includes the agreement between each input parameter and its distribution-derived value. As an application, we demonstrate performance of the program for several residues in the intrinsically disordered protein α-synuclein, as well as for several static and dynamic residues in the folded protein GB3.
Energy Technology Data Exchange (ETDEWEB)
Cavallaro, M., E-mail: manuela.cavallaro@lns.infn.it [INFN, Laboratori Nazionali del Sud, Via S. Sofia 62, I-95125 Catania (Italy); Cappuzzello, F.; Carbone, D.; Cunsolo, A. [INFN, Laboratori Nazionali del Sud, Via S. Sofia 62, I-95125 Catania (Italy); Dipartimento di Fisica e Astronomia, Universita di Catania, Via S. Sofia 64, I-95125 Catania (Italy); Foti, A. [Dipartimento di Fisica e Astronomia, Universita di Catania, Via S. Sofia 64, I-95125 Catania (Italy); INFN, Sezione di Catania, Via S. Sofia 64, I-95125 Catania (Italy); Linares, R. [Instituto de Fisica, Universidade Federal Fluminense, Litoranea s/n, Gragoata, Niteroi, Rio de Janeiro 24210-340 (Brazil); Pereira, D.; Oliveira, J.R.B.; Gomes, P.R.S.; Lubian, J. [Universidade de Sao Paulo, Departamento de Fisica Nuclear, Instituto de Fisica da Universidade de Sao Paulo, Caixa Postal 66318, 05315-970 Sao Paulo, SP (Brazil); Chen, R. [Institute of Modern Physics, CAS, Lanzhou (China)
2011-08-21
The {sup 16}O+{sup 27}Al elastic and inelastic angular distributions have been measured in a broad angular range (13{sup o}<{theta}{sub lab}<52{sup o}) at about 100 MeV incident energy. The use of the MAGNEX large acceptance magnetic spectrometer and of the ray-reconstruction analysis technique has been crucial in order to provide, in the same experiment, high-resolution energy spectra and cross-section measurements distributed over more than seven orders of magnitude down to hundreds of nb/sr.
Institute of Scientific and Technical Information of China (English)
Qian Zhong-Hua; Hu Jing-Guo; Feng Guo-Lin; Cao Yong-Zhong
2012-01-01
Based on the skewed function, the most probable temperature is defined, and the spatiotemporal distributions of the frequencies and strengths of extreme temperature events in different climate states over China are investigated, where the climate states are referred to as State Ⅰ, State Ⅱ and State Ⅲ, i.e., the daily minimum temperature records of 1961-1990, 1971-2000, and 1981-2009. The results show that, in space, the frequency of high temperature events in summer clearly decreases in the lower and middle reaches of the Yellow River in State Ⅰ, and that low temperature events decrease in northern China in State Ⅱ. In the present state, the frequency of high temperature events increases significantly in most areas over China except the northeast, while the frequency of low temperature events decreases mainly in north China and the regions between the Yangtze River and the Yellow River. The distributions of the frequencies and strengths of extreme temperature events are consistent in space. The analysis of the time evolution of extreme events shows that the occurrence of high temperature events becomes more frequent with the change in state, while that of low temperature events decreases. High temperature events are also becoming stronger and deserve special attention.
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints), many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Directory of Open Access Journals (Sweden)
J. J. Lee
2012-11-01
Electron microburst energy spectra in the range of 170 keV to 360 keV have been measured using two solid-state detectors onboard the low-altitude (680 km), polar-orbiting Korean STSAT-1 (Science and Technology SATellite-1). Applying a unique capability of the spacecraft attitude control system, microburst energy spectra have been accurately resolved into two components: perpendicular and parallel to the geomagnetic field direction. The former measures trapped electrons, and the latter those electrons with pitch angles in the loss cone that precipitate into the atmosphere. It is found that the perpendicular-component energy spectra are harder than those of the parallel component, and that the loss cone is not completely filled by electrons in the energy range of 170 keV to 360 keV. These results have been modeled assuming a wave-particle cyclotron resonance mechanism, where higher-energy electrons travelling within a magnetic flux tube interact with whistler-mode waves at higher latitudes (lower altitudes). Our results suggest that because higher-energy (relativistic) microbursts do not fill the loss cone completely, only a small portion of the electrons is able to reach the low-altitude (~100 km) atmosphere. Thus, assuming that low-energy microbursts and relativistic microbursts are created by cyclotron resonance with chorus elements (but at different locations), the low-energy portion of the microburst spectrum will dominate at low altitudes. This explains why relativistic microbursts have not been observed by balloon experiments, which typically float at altitudes of ~30 km and measure only the X-ray flux produced by collisions between neutral atmospheric particles and precipitating electrons.
International Nuclear Information System (INIS)
The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σd; whilst the quantities d and σd depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and the tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular, this error decreases with increasing biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of 10^8 histories from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the tcp to
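The reported downward bias of the Poisson tcp under dose noise can be reproduced in a toy model. The radiosensitivity alpha and the clonogen number n0_per_voxel below are illustrative values, and only the linear exponential cell-kill term is kept:

```python
import math
import random

def poisson_tcp(doses, alpha=0.3, n0_per_voxel=1.0e7):
    """Poisson tumour control probability for a voxelised dose array:
    tcp = prod_i exp(-N0 * exp(-alpha * d_i))."""
    log_tcp = -n0_per_voxel * sum(math.exp(-alpha * d) for d in doses)
    return math.exp(log_tcp)

def mean_noisy_tcp(doses, sigma, n_mc=2000, seed=7):
    """Average tcp over Monte Carlo realisations of zero-mean Gaussian
    noise on the voxel doses, mimicking MC statistical uncertainty."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_mc):
        total += poisson_tcp([d + rng.gauss(0.0, sigma) for d in doses])
    return total / n_mc
```

Because exp(-alpha*d) is convex in d, zero-mean noise inflates the expected surviving-clonogen term, so the noisy tcp falls below the noise-free value, which is the systematic underestimation the abstract describes.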
Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L
2013-02-01
The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique based on non-parametric probability distributions, the randomness of the model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status, with the membership of water quality in the various output fuzzy sets or classes provided as percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. PMID:23266912
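The kernel smoothing plus Monte Carlo step can be sketched with a smoothed bootstrap: sample an observation, add Gaussian noise at the kernel bandwidth, and propagate through a scoring function. The two-variable index below is a toy stand-in for the paper's fuzzy inference system, and all bandwidths and coefficients are invented:

```python
import random
import statistics

def kde_sample(data, bandwidth, rng):
    """Draw one sample from a Gaussian kernel density estimate:
    pick an observation, then add Gaussian noise (smoothed bootstrap)."""
    return rng.choice(data) + rng.gauss(0.0, bandwidth)

def monte_carlo_index(do_data, ph_data, n=20000, seed=3):
    """Propagate KDE-sampled inputs through a toy 0-100 quality score
    and return the mean and spread of the resulting index distribution."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        do = kde_sample(do_data, 0.3, rng)   # dissolved oxygen, mg/L
        ph = kde_sample(ph_data, 0.1, rng)
        # illustrative scoring: reward oxygen, penalise pH deviation from 7
        score = max(0.0, min(100.0, 12.5 * do - 25.0 * abs(ph - 7.0)))
        scores.append(score)
    return statistics.mean(scores), statistics.stdev(scores)
```

The output is a distribution of index values rather than a single number, which is what lets the method report percentiles and histograms of water status instead of a crisp score.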
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable
Energy Technology Data Exchange (ETDEWEB)
Hughes, Annie; Meidt, Sharon E.; Schinnerer, Eva; Colombo, Dario [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Pety, Jerôme; Dumas, Gaëlle; Schuster, Karl F. [Institut de Radioastronomie Millimétrique, 300 Rue de la Piscine, F-38406 Saint Martin d' Hères (France); Leroy, Adam K. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Dobbs, Clare L. [School of Physics and Astronomy, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); García-Burillo, Santiago [Observatorio Astronómico Nacional, Observatorio de Madrid, Alfonso XII, 3, E-28014 Madrid (Spain); Thompson, Todd A. [Department of Astronomy, The Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Kramer, Carsten [Instituto Radioastronomía Milimétrica, Avenida Divina Pastora 7, Nucleo Central, E-18012 Granada (Spain)
2013-12-10
We analyze the distribution of CO brightness temperature and integrated intensity in M51 at ∼40 pc resolution using new {sup 12}CO(J = 1 → 0) data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field of view, which covers the inner ∼11 × 7 kpc of M51. We find clear variations in the shape of CO PDFs both within different M51 environments, defined according to dynamical criteria, and among M51 and two nearby low-mass galaxies, M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover ∼1-2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities, consistent with their lower gas surface densities. However, the CO PDFs for different dynamical environments within the PAWS field depart significantly from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO emission. The observed environmental dependence on the shape of the CO PDFs is qualitatively consistent with changes that would be expected if molecular gas in the spiral arms is characterized by a larger range of average densities, gas temperatures, and velocity fluctuations, although further work is required to disentangle the relative importance of large-scale dynamical effects versus star formation feedback in regulating these properties. We show that the shape of the CO PDFs for different M51 environments is only weakly related to global properties of the CO emission, e.g., the total CO luminosity, but is strongly correlated with properties of the local giant molecular cloud (GMC) and young stellar cluster populations, including the shape of their mass distributions. For
Institute of Scientific and Technical Information of China (English)
郑智捷
2011-01-01
Under the conditional probability model, intrinsic wave-like statistical distributions are observed under both normal conditions and interfered conditions in their respective spatial statistical distributions.
Zhao, H.; Li, X.; Blake, J. B.; Fennell, J.; Claudepierre, S. G.; Baker, D. N.; Jaynes, A. N.; Malaspina, D.
2014-12-01
The pitch angle distribution (PAD) of energetic electrons in the slot region and inner radiation belt received little attention in the past decades due to the lack of quality measurements. Using the state-of-the-art pitch-angle-resolved data from the Magnetic Electron Ion Spectrometer (MagEIS) instrument onboard the Van Allen Probes, a detailed analysis of 100s keV electron PADs below L = 4 is performed, in which the PADs are categorized into three types: normal (flux peaking at 90°), cap (flux peaking exceedingly narrowly around 90°) and 90°-minimum (lower flux at 90°) PADs. By examining the characteristics of the PADs of 460 keV electrons for over a year, we find that the 90°-minimum PADs are generally present in the inner belt (L < 2) and relatively constant there, but change significantly in the slot region (2 < L < 3); the proposed mechanism can hardly explain the formation of 90°-minimum PADs at the center of the inner belt. These new and compelling observations, made possible by the high-quality measurements of MagEIS, present a challenge for wave modelers, and future work is still needed to fully understand them.
Directory of Open Access Journals (Sweden)
Enrico R Crema
Full Text Available Recent advances in the use of summed probability distribution (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes.
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
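The hyper-ensemble idea, an optimal local linear combination of several model forecasts with bias removal, can be sketched as an ordinary least-squares fit against past drifter observations. All data below are synthetic placeholders, not from the Adriatic study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical past drifter velocities and co-located predictions from three
# models (e.g., ocean, atmosphere-driven, and wave-driven surface currents).
true_v = rng.normal(size=50)
models = np.stack([true_v + rng.normal(0.3, 0.2, 50),    # skillful but biased
                   0.5 * true_v + rng.normal(0, 0.3, 50),  # damped
                   rng.normal(0, 1.0, 50)], axis=1)        # uninformative

# Hyper-ensemble: least-squares weights plus an intercept for bias removal.
A = np.column_stack([models, np.ones(len(true_v))])
w, *_ = np.linalg.lstsq(A, true_v, rcond=None)
forecast = A @ w
```

In-sample, the fitted combination can never do worse than any single member, since each member (with or without a bias correction) is itself one admissible weight vector; the real skill question, addressed in the paper, is out-of-sample performance.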
Crema, Enrico R; Habu, Junko; Kobayashi, Kenichi; Madella, Marco
2016-01-01
Recent advances in the use of summed probability distribution (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes. PMID:27128032
Directory of Open Access Journals (Sweden)
Sean S Downey
Full Text Available Analysis of the proportion of immature skeletons recovered from European prehistoric cemeteries has shown that the transition to agriculture after 9000 BP triggered a long-term increase in human fertility. Here we compare the largest analysis of European cemeteries to date with an independent line of evidence, the summed calibrated date probability distribution (SCDPD) of radiocarbon dates from archaeological sites. Our cemetery reanalysis confirms increased growth rates after the introduction of agriculture; the radiocarbon analysis also shows this pattern, and a significant correlation between both lines of evidence confirms the demographic validity of SCDPDs. We analyze the areal extent of Neolithic enclosures and demographic data from ethnographically known farming and foraging societies and we estimate differences in population levels at individual sites. We find little effect on the overall shape and precision of the SCDPD and we observe a small increase in the correlation with the cemetery trends. The SCDPD analysis supports the hypothesis that the transition to agriculture dramatically increased demographic growth, but it was followed within centuries by a general pattern of collapse even after accounting for higher settlement densities during the Neolithic. The study supports the unique contribution of SCDPDs as a valid proxy for the demographic patterns associated with early agriculture.
Allevato, V; Finoguenov, A; Marchesi, S; Zamorani, G; Hasinger, G; Salvato, M; Miyaji, T; Gilli, R; Cappelluti, N; Brusa, M; Suh, H; Lanzuisi, G; Trakhtenbrot, B; Griffiths, R; Vignali, C; Schawinski, K; Karim, A
2016-01-01
We present the measurement of the projected and redshift space 2-point correlation function (2pcf) of the new catalog of Chandra COSMOS-Legacy AGN at 2.9$\\leq$z$\\leq$5.5 ($\\langle L_{bol} \\rangle \\sim$10$^{46}$ erg/s) using the generalized clustering estimator based on phot-z probability distribution functions (Pdfs) in addition to any available spec-z. We model the projected 2pcf estimated using $\\pi_{max}$ = 200 h$^{-1}$ Mpc with the 2-halo term and we derive a bias at z$\\sim$3.4 equal to b = 6.6$^{+0.60}_{-0.55}$, which corresponds to a typical mass of the hosting halos of log M$_h$ = 12.83$^{+0.12}_{-0.11}$ h$^{-1}$ M$_{\\odot}$. A similar bias is derived using the redshift-space 2pcf, modelled including the typical phot-z error $\\sigma_z$ = 0.052 of our sample at z$\\geq$2.9. Once we integrate the projected 2pcf up to $\\pi_{max}$ = 200 h$^{-1}$ Mpc, the bias of XMM and \\textit{Chandra} COSMOS at z=2.8 used in Allevato et al. (2014) is consistent with our results at higher redshift. The results suggest only...
Falandysz, Jerzy; Zhang, Ji; Wang, Yuanzhong; Krasińska, Grażyna; Kojta, Anna; Saba, Martyna; Shen, Tao; Li, Tao; Liu, Honggao
2015-12-15
This study investigated the accumulation and distribution of mercury (Hg) in mushrooms of the genus Leccinum that emerged on soils of totally different geochemical bedrock composition. Hg was measured in 6 species from geographically diverse regions of the mercuriferous belt areas in Yunnan of SW China, and in 8 species from the non-mercuriferous regions of Poland in Europe. Also assessed was the probable dietary intake of Hg from consumption of Leccinum spp., which are traditional organic food items in SW China and Poland. The results showed that L. chromapes, L. extremiorientale, L. griseum and L. rugosicepes are good accumulators of Hg, and the sequestered Hg in caps was up to 4.8, 3.5, 3.6 and 4.7 mg Hg kg(-1) dry matter, respectively. Leccinum mushrooms from Poland also efficiently accumulated Hg, with their average Hg content being an order of magnitude lower due to the low concentrations of Hg in the forest topsoil of Poland compared to the elevated contents in Yunnan. Consumption of Leccinum mushrooms with elevated Hg contents in Yunnan at rates of up to 300 g fresh product per week during the foraging season would not result in Hg intake that exceeds the provisional weekly tolerance limit of 0.004 mg kg(-1) body mass, assuming no Hg ingestion from other foods. PMID:26322595
Slater, Paul B
2010-01-01
The nonnegativity of the determinant of the partial transpose of a two-qubit (4 x 4) density matrix is both a necessary and sufficient condition for its separability. While the determinant is restricted to the interval [0,1/256], the determinant of the partial transpose can range over [-1/16,1/256], with negative values corresponding to entangled states. We report here the exact values of the first nine moments of the probability distribution of the partial transpose over this interval, with respect to the Hilbert-Schmidt (metric volume element) measure on the nine-dimensional convex set of real two-qubit density matrices. Rational functions C_{2 j}(m), yielding the coefficients of the 2j-th power of even polynomials occurring at intermediate steps in our derivation of the m-th moment, emerge. These functions possess poles at finite series of consecutive half-integers (m=-3/2,-1/2,...,(2j-1)/2), and certain (trivial) roots at finite series of consecutive natural numbers (m=0, 1,...). Additionally, the (nontri...
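The setting can be explored numerically. Below is a minimal Monte Carlo sketch, not the paper's symbolic derivation: it samples random real two-qubit density matrices via the standard Ginibre construction (rho = G Gᵀ / tr(G Gᵀ), commonly used to induce the Hilbert-Schmidt measure) and checks that the determinant of the partial transpose stays within the interval [-1/16, 1/256] quoted above:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_real_density_matrix():
    """Random real 4x4 density matrix via the Ginibre construction."""
    g = rng.normal(size=(4, 4))
    rho = g @ g.T
    return rho / np.trace(rho)

def det_partial_transpose(rho):
    """Determinant of the partial transpose over the second qubit.

    In index form: (rho^PT)_{(a,b),(c,d)} = rho_{(a,d),(c,b)}.
    """
    pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return np.linalg.det(pt)

dets = np.array([det_partial_transpose(random_real_density_matrix())
                 for _ in range(2000)])
```

Negative values of `dets` correspond to entangled states; sample averages of powers of `dets` give crude estimates of the moments the paper computes exactly.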
Energy Technology Data Exchange (ETDEWEB)
Alkhazov, G.D. E-mail: alkhazov@pcfarm.pnpi.spb.ru; Dobrovolsky, A.V.; Egelhof, P.; Geissel, H.; Irnich, H.; Khanzadeev, A.V.; Korolev, G.A.; Lobodenko, A.A.; Muenzenberg, G.; Mutterer, M.; Neumaier, S.R.; Schwab, W.; Seliverstov, D.M.; Suzuki, T.; Vorobyov, A.A
2002-12-30
A Glauber based analysis of the experimental cross sections for small-angle elastic p {sup 6,8}He scattering at 0.7 GeV has been performed. The radii and radial shape of the {sup 6}He and {sup 8}He nuclei have been determined using phenomenological nuclear density distributions with two free parameters. The deduced shapes of the {sup 6}He and {sup 8}He nuclear matter radial distributions conform with the concept that both nuclei consist of an {alpha}-particle core and a significant neutron halo. The accuracy of the theoretical analysis of the elastic-scattering cross-section data is discussed, and possible sources of systematic uncertainty related to some basic limitations in the applied method are outlined. The experimental p {sup 6,8}He elastic-scattering cross sections have also been utilized for probing the matter density distributions resulting from various nuclear microscopic models. Besides, the sensitivity of the total p {sup 6,8}He reaction cross sections to the size of the {sup 6}He and {sup 8}He nuclei has been considered.
Energy Technology Data Exchange (ETDEWEB)
Kato, S., E-mail: eun1302@mail4.doshsha.ac.jp [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Tanaka, N. [Institute of Laser Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); Sasao, M. [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Kisaki, M.; Tsumori, K. [National Institute for Fusion Science, Toki, Gifu 509-5292 (Japan); Nishiura, M. [University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Matsumoto, Y. [Tokushima Bunri University, Yamashiro, Tokushima 770-8514 (Japan); Kenmotsu, T.; Wada, M. [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Yamaoka, H. [RIKEN SPring-8 Center, Sayo, Hyogo 679-5148 (Japan)
2015-08-15
Hydrogen ion reflection properties have been investigated following the injection of H{sup +}, H{sub 2}{sup +} and H{sub 3}{sup +} ions onto a polycrystalline W surface. Angle- and energy-resolved intensity distributions of both scattered H{sup +} and H{sup −} ions are measured by a magnetic momentum analyzer. We have detected atomic hydrogen ions reflected from the surface, while molecular hydrogen ions are unobserved within our detection limit. The reflected hydrogen ion energy is approximately less than one-third of the incident beam energy for H{sub 3}{sup +} ion injection and less than a half of that for H{sub 2}{sup +} ion injection. Other reflection properties are very similar to those of monoatomic H{sup +} ion injection. Experimental results are compared to the classical trajectory simulations using the ACAT code based on the binary collision approximation.
Kato, S.; Tanaka, N.; Sasao, M.; Kisaki, M.; Tsumori, K.; Nishiura, M.; Matsumoto, Y.; Kenmotsu, T.; Wada, M.; Yamaoka, H.
2015-08-01
Hydrogen ion reflection properties have been investigated following the injection of H+, H2+ and H3+ ions onto a polycrystalline W surface. Angle- and energy-resolved intensity distributions of both scattered H+ and H- ions are measured by a magnetic momentum analyzer. We have detected atomic hydrogen ions reflected from the surface, while molecular hydrogen ions are unobserved within our detection limit. The reflected hydrogen ion energy is approximately less than one-third of the incident beam energy for H3+ ion injection and less than a half of that for H2+ ion injection. Other reflection properties are very similar to those of monoatomic H+ ion injection. Experimental results are compared to the classical trajectory simulations using the ACAT code based on the binary collision approximation.
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving......
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
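For the multinomial logit special case, the correspondence between a CPGF and choice probabilities can be illustrated directly: the generating function is the log-sum-exp of the utilities, and its gradient recovers the familiar logit (softmax) probabilities. A minimal numerical sketch, not taken from the paper:

```python
import numpy as np

def logit_cpgf(v):
    """Log-sum-exp generating function of the multinomial logit model."""
    return np.log(np.sum(np.exp(v)))

def choice_probabilities(v, eps=1e-6):
    """Central-difference gradient of the CPGF gives the choice probabilities."""
    grad = np.empty_like(v, dtype=float)
    for i in range(len(v)):
        e = np.zeros_like(v, dtype=float)
        e[i] = eps
        grad[i] = (logit_cpgf(v + e) - logit_cpgf(v - e)) / (2 * eps)
    return grad

v = np.array([1.0, 0.0, -1.0])       # systematic utilities of three alternatives
p = choice_probabilities(v)
softmax = np.exp(v) / np.exp(v).sum()  # closed-form logit probabilities
```

The same gradient construction applies to richer CPGFs (e.g., nested or cross-nested logit), which is what makes the generating-function representation useful in practice.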
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Review on Gene Expression Models: Probability Distribution
Institute of Scientific and Technical Information of China (English)
Zhou, Tianshou (周天寿)
2012-01-01
Quantifying gene expression (including mathematical modeling and qualitative and quantitative analysis) is not only an important step toward understanding intracellular processes but also a core concern of current systems biology. Gene expression models have developed from the initial simple single-state models into complicated multi-state models that account for detailed biological processes and numerous biological factors. Based on the central dogma of biology, this paper briefly reviews progress in the study of gene expression models, focusing on refinements of the mathematical models and on the probability distributions of mRNA and protein copy numbers. Some general laws of gene expression are summarized, issues deserving further study are discussed, and potential research directions are pointed out.
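The simplest single-state model in this literature can be simulated directly with the Gillespie algorithm. The sketch below is a textbook illustration, not from the review itself; the rates are hypothetical. For constitutive expression (mRNA produced at constant rate k, degraded at rate gamma per molecule) the stationary copy-number distribution is Poisson with mean k/gamma:

```python
import numpy as np

rng = np.random.default_rng(7)

def stationary_mean_mrna(k=10.0, gamma=1.0, t_end=2000.0, burn=100.0):
    """Gillespie simulation of the birth-death mRNA model.

    Returns the time-weighted average copy number after a burn-in period,
    which should approach the stationary mean k/gamma.
    """
    t, m = 0.0, 0
    weighted_sum, total_time = 0.0, 0.0
    while t < t_end:
        total = k + gamma * m               # total propensity
        dt = rng.exponential(1.0 / total)   # waiting time to next event
        if t > burn:                        # time-weighted averaging
            weighted_sum += m * dt
            total_time += dt
        if rng.random() < k / total:        # birth vs death
            m += 1
        else:
            m -= 1
        t += dt
    return weighted_sum / total_time

m_hat = stationary_mean_mrna()
```

Multi-state models of the kind the review surveys add promoter switching on top of this birth-death core, which broadens the Poisson into, e.g., negative-binomial-like distributions.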
Multivariate joint probability distribution of droughts in the Wei River basin
Institute of Scientific and Technical Information of China (English)
Ma, Mingwei; Song, Songbai; Yu, Yi; Zhang, Yu; Li, Yang
2012-01-01
This study models the dependence structures of multivariate drought variables using elliptical copulas. Bivariate dependence was measured with Pearson's classical correlation coefficient γ_n, Spearman's ρ_n and Kendall's τ_n, together with rank scatter plots, Chi-plots and K-plots, while the parameters of trivariate copulas were estimated with the maximum likelihood method. Goodness of fit was assessed with the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the RMS error (RMSE), and a bootstrap version of Rosenblatt's transformation was used to test the goodness of fit of the Gaussian and Student t copulas. In application to the Wei River basin for determining the spatial distribution of drought return periods, the Gaussian copula was selected to model the multivariate joint probability distribution of drought duration, drought severity and severity peak. The results show that both the Gaussian and Student t copulas are applicable, but the Gaussian copula fits better. In the basin, prolonged droughts have broken out frequently with rather short return periods, so more emphasis should be placed on drought forecasting and management in the future.
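The dependence-measurement step can be sketched for the bivariate Gaussian-copula case: Kendall's τ is invariant under monotone transforms of the margins, and the identity τ = (2/π) arcsin(ρ) inverts to a moment estimate of the copula parameter. A small illustration with synthetic data; the "duration" and "severity" values are hypothetical, not from the Wei River study:

```python
import numpy as np

rng = np.random.default_rng(42)

def kendall_tau(x, y):
    """Sample Kendall's tau via pairwise concordance (O(n^2); fine for small n)."""
    dx = np.sign(x[:, None] - x[None, :])
    dy = np.sign(y[:, None] - y[None, :])
    iu = np.triu_indices(len(x), k=1)
    return (dx[iu] * dy[iu]).sum() / len(iu[0])

# Synthetic drought pairs: Gaussian dependence (true copula parameter 0.8)
# pushed through monotone (lognormal) margins.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=1000)
duration, severity = np.exp(z[:, 0]), np.exp(z[:, 1])

tau = kendall_tau(duration, severity)
rho_hat = np.sin(np.pi * tau / 2)  # invert tau = (2/pi) arcsin(rho)
```

This rank-based route estimates the Gaussian-copula parameter without touching the margins, which is why τ_n and the rank plots are natural companions to the likelihood fits in the study.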
Kahn, R. A.; Gaitley, B. J.; Nelson, D. L.; Garay, M. J.; Misr Team
2010-12-01
Although volcanic eruptions occur about once per week globally, on average, relatively few of them affect the daily lives of millions of people. Significant exceptions were two eruptions of the Eyjafjallajökull volcano in southern Iceland, which produced ash clouds lasting several weeks during each of April and May 2010. During the first eruption, air traffic over most of Europe was halted, severely affecting international transportation, trade, and economics. For the second ash cloud, space-based and suborbital observations, together with aerosol transport modeling, were used to predict ash plume distribution, making it possible to selectively close only the limited airspace in which there was actual risk of significant ash exposure. These events highlight the immense value of aerosol measurement and modeling capabilities when integrated and applied in emergency response situations. Geosynchronous satellite and continuous, ground-based observations played the most immediate roles in constraining model ash-cloud-extent predictions. However, the rich information content of large-scale though less frequent observations from instruments such as the NASA Earth Observing System’s Multi-angle Imaging SpectroRadiometer (MISR) are key to improving the underlying representations of processes upon which the plume transport models rely. MISR contributes to this pool of information by providing maps of plume height derived from stereo imaging that are independent of knowledge of the temperature structure of the atmosphere or assumptions that the ash cloud is in thermal equilibrium with the environment. Such maps are obtained primarily near-source, where features of the ash cloud can be observed and co-registered in the multi-angle views. A distribution of heights is produced, making it possible to report all-important layer extent rather than just a characteristic plume elevation. Results are derived at 1.1 km horizontal and about 0.5 km vertical resolution. In addition
Energy Technology Data Exchange (ETDEWEB)
Kozier, K.S. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, Ontario (Canada)
2008-07-01
Different evaluated (n,d) energy-angle elastic scattering distributions produce k-effective differences in MCNP5 simulations of critical experiments involving heavy water (D{sub 2}O) of sufficient magnitude to suggest a need for new (n,d) scattering measurements and/or distributions derived from modern theoretical nuclear models, especially at neutron energies below a few MeV. The present work focuses on the small reactivity change of < 1 mk that is observed in the MCNP5 D{sub 2}O coolant-void-reactivity calculation bias for simulations of two pairs of critical experiments performed in the ZED-2 reactor at the Chalk River Laboratories when different nuclear data libraries are used for deuterium. The deuterium data libraries tested include ENDF/B-VII.0, ENDF/B-VI.4, JENDL-3.3 and a new evaluation, labelled Bonn-B, which is based on recent theoretical nuclear-model calculations. Comparison calculations were also performed for a simplified, two-region, spherical model having an inner, 250-cm radius, homogeneous sphere of UO{sub 2}, without and with deuterium, and an outer 20-cm-thick deuterium reflector. A notable observation from this work is the reduction of about 0.4 mk in the MCNP5 ZED-2 CVR calculation bias that is obtained when the O-in-UO{sub 2} thermal scattering data comes from ENDF/B-VII.0. (author)
Energy Technology Data Exchange (ETDEWEB)
Mehta, Virat; Ikeda, Yoshihiro; Takano, Ken; Terris, Bruce D.; Hellwig, Olav [San Jose Research Center, HGST a Western Digital company, 3403 Yerba Buena Rd., San Jose, California 95135 (United States); Wang, Tianhan [Department of Materials Science and Engineering, Stanford University, Stanford, California 94035 (United States); Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Wu, Benny; Graves, Catherine [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Department of Applied Physics, Stanford University, Stanford, California 94035 (United States); Dürr, Hermann A.; Scherz, Andreas; Stöhr, Jo [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States)
2015-05-18
We analyze the magnetic cluster size (MCS) and magnetic cluster size distribution (MCSD) in a variety of perpendicular magnetic recording (PMR) media designs using resonant small angle x-ray scattering at the Co L{sub 3} absorption edge. The different PMR media flavors considered here vary in grain size between 7.5 and 9.5 nm as well as in lateral inter-granular exchange strength, which is controlled via the segregant amount. While for high inter-granular exchange, the MCS increases rapidly for grain sizes below 8.5 nm, we show that for increased amount of segregant with less exchange the MCS remains relatively small, even for grain sizes of 7.5 and 8 nm. However, the MCSD still increases sharply when shrinking grains from 8 to 7.5 nm. We show evidence that recording performance such as signal-to-noise-ratio on the spin stand correlates well with the product of magnetic cluster size and magnetic cluster size distribution.
Directory of Open Access Journals (Sweden)
R. M. Thorne
2012-04-01
Full Text Available We present a detailed numerical study on the effects of a non-dipole magnetic field on the Earth's plasma sheet electron distribution and its implication for diffuse auroral precipitation. Use of the modified bounce-averaged Fokker-Planck equation developed in the companion paper by Ni et al. (2012) for 2-D non-dipole magnetic fields suggests that we can adopt a numerical scheme similar to that used for a dipole field, but should evaluate bounce-averaged diffusion coefficients and bounce period related terms in non-dipole magnetic fields. Focusing on nightside whistler-mode chorus waves at L = 6, and using various Dungey magnetic models, we calculate and compare the bounce-averaged diffusion coefficients in each case. Using the Alternating Direction Implicit (ADI) scheme to numerically solve the 2-D Fokker-Planck diffusion equation, we demonstrate that chorus-driven resonant scattering causes plasma sheet electrons to be scattered into the loss cone much faster in a non-dipole field than in a dipole field. The electrons subject to such scattering extend to lower energies and higher equatorial pitch angles when the southward interplanetary magnetic field (IMF) increases in the Dungey magnetic model. Furthermore, we find that changes in the diffusion coefficients are the dominant factor responsible for variations in the modeled temporal evolution of the plasma sheet electron distribution. Our study demonstrates that the effects of realistic ambient magnetic fields need to be incorporated into both the evaluation of resonant diffusion coefficients and the calculation of the Fokker-Planck diffusion equation to understand quantitatively the evolution of the plasma sheet electron distribution and the occurrence of diffuse aurora, in particular at L > 5 during geomagnetically disturbed periods when the ambient magnetic field deviates considerably from a magnetic dipole.
International Nuclear Information System (INIS)
Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using the log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of the tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
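The reproducibility metric Rn described above is a KL divergence between discretized motion PDFs. A minimal sketch with hypothetical histograms; the bin values are illustrative, not patient data:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discretized motion PDFs.

    A small eps is added before normalizing so that empty bins do not
    produce infinities.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical tumor-position histograms (one bin per mm of SI displacement)
# for fraction 1 and a later fraction n.
pdf1 = np.array([0.05, 0.20, 0.50, 0.20, 0.05])
pdfn = np.array([0.10, 0.25, 0.40, 0.20, 0.05])
r_n = kl_divergence(pdf1, pdfn)
```

By construction the divergence is zero only when the two fractional PDFs coincide, so larger Rn values signal poorer reproducibility of the motion distribution.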
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Novel Bounds on Marginal Probabilities
Mooij, Joris M.; Kappen, Hilbert J
2008-01-01
We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...
Directory of Open Access Journals (Sweden)
Shinichi Kinugasa
2012-01-01
Full Text Available Accurate determination of the intensity-average diameter of polystyrene latex (PS-latex by dynamic light scattering (DLS was carried out through extrapolation of both the concentration of PS-latex and the observed scattering angle. Intensity-average diameter and size distribution were reliably determined by asymmetric flow field flow fractionation (AFFFF using multi-angle light scattering (MALS with consideration of band broadening in AFFFF separation. The intensity-average diameter determined by DLS and AFFFF-MALS agreed well within the estimated uncertainties, although the size distribution of PS-latex determined by DLS was less reliable in comparison with that determined by AFFFF-MALS.
Directory of Open Access Journals (Sweden)
Monir Sharifi
2012-05-01
Full Text Available Periodic mesoporous materials of the type (R′O)3Si-R-Si(OR′)3 with benzene as an organic bridge and a crystal-like periodicity within the pore walls were functionalized with SO3H or SO3− groups and investigated by small-angle neutron scattering (SANS) with in situ nitrogen adsorption at 77 K. If N2 is adsorbed in the pores, the SANS measurements show a complete matching of all of the diffraction signals that are caused by the long-range ordering of the mesopores in the benzene-PMO, due to the fact that the benzene-PMO walls possess a neutron scattering length density (SLD) similar to that of nitrogen in the condensed state. However, signals at higher q-values (>1 Å−1) are not affected with respect to their SANS intensity, even after complete pore filling, confirming the assumption of a crystal-like periodicity within the PMO material walls due to π–π interactions between the organic bridges. The SLD of pristine benzene-PMO was altered by functionalizing the surface with different amounts of SO3H groups, using the grafting method. For a low degree of functionalization (0.81 mmol SO3H·g−1) and/or an inhomogeneous distribution of the SO3H groups, the SLD changes only negligibly, and thus, complete contrast matching is still found. However, for higher amounts of SO3H groups (1.65 mmol SO3H·g−1) being present in the mesopores, complete matching of the neutron diffraction signals is no longer observed, proving that homogeneously distributed SO3H groups on the inner pore walls of the benzene-PMO alter the SLD in a way that it no longer fits the SLD of the condensed N2.
Institute of Scientific and Technical Information of China (English)
王淳; 高元海
2014-01-01
Taking the minimum expected active network loss as the optimization objective, and the constraint that the qualified probability of each nodal voltage exceeds a given threshold, a reactive power optimization model of a distribution network is established that simultaneously accounts for the output fluctuations of distributed wind and photovoltaic (PV) generation and the random fluctuation of load. The probabilistic power flows appearing in the objective function and constraints are solved by a computing method based on the total probability formula combined with the traditional analytical method. The established model is solved by chemical reaction optimization (CRO) and verified on the IEEE 33-bus and PG&E 69-bus systems, to each of which distributed wind and PV generation are added, yielding an optimal scheme with probabilistic-statistical meaning. Comparisons with several intelligent algorithms, including the genetic algorithm (GA), Stud GA, biogeography-based optimization (BBO), and particle swarm optimization (PSO), show that the constructed CRO algorithm is more stable in solving this reactive power optimization model.
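The chance constraint on nodal voltage described above can be illustrated with a minimal Monte Carlo sketch. The paper itself uses an analytical total-probability method; the voltage response, distribution choices, and parameter values below are hypothetical stand-ins:

```python
import random

random.seed(2)

def voltage(wind, load):
    """A made-up linearized per-unit voltage response to a distributed
    generation injection and a load draw (hypothetical coefficients)."""
    return 1.0 + 0.05 * wind - 0.08 * load

# Random DG output and load, standing in for wind/PV and load fluctuation.
trials = 50_000
qualified = sum(
    0.95 <= voltage(random.betavariate(2, 5), random.gauss(0.6, 0.1)) <= 1.05
    for _ in range(trials)
)
p_qualified = qualified / trials
print(f"P(voltage within limits) = {p_qualified:.3f}")
# The chance constraint requires this probability to exceed a threshold,
# e.g. p_qualified >= 0.95, for the operating point to be feasible.
```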
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
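The power-law structure of, for example, the Pareto class can be checked numerically. The following sketch (sample size and threshold are arbitrary choices) draws from a Pareto law by inverse transform and compares the empirical tail with the theoretical survival function:

```python
import random

def sample_pareto(alpha, x_min=1.0):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    x_min * U**(-1/alpha) follows a Pareto(alpha) law."""
    u = random.random()
    return x_min * u ** (-1.0 / alpha)

random.seed(1)
alpha = 2.0
samples = [sample_pareto(alpha) for _ in range(100_000)]

# Survival function of Pareto: P(X > x) = (x_min / x)**alpha.
# Check the empirical tail against the theoretical one at x = 4.
empirical = sum(s > 4.0 for s in samples) / len(samples)
theoretical = 4.0 ** (-alpha)
print(f"P(X > 4): empirical {empirical:.4f}, theoretical {theoretical:.4f}")
```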
Matsuda, Hiroyuki; Daimon, Hiroshi; Tóth, László; Matsui, Fumihiko
2007-04-01
This paper provides a way of focusing wide-angle charged-particle beams in multiple-lens systems. In previous papers [H. Matsuda, Phys. Rev. E 71, 066503 (2005); 74, 036501 (2006)], it was shown that an ellipsoidal mesh, combined with electrostatic lenses, enables correction of spherical aberration over wide acceptance angles of up to ±60°. In this paper, practical situations in which ordinary electron lenses are arranged behind the wide-angle electrostatic lenses are taken into account using ray-tracing calculations. For practical realization of the wide-angle lens systems, the acceptance angle is set to ±50°. We note that the output beams of the wide-angle electrostatic lenses have somewhat large divergence angles, which cause non-negligible spherical aberration in the additional lenses. A solution to this problem is presented, showing that lens combinations that cancel spherical aberration are available, whereby wide-angle charged-particle beams can be finely focused with considerably reduced divergence angles of less than ±5°.
Energetic Electron Pitch Angle Diffusion due to Whistler Wave during Terrestrial Storms
Institute of Scientific and Technical Information of China (English)
XIAO Fu-Liang; HE Hui-Yong
2006-01-01
A concise and elegant expression for cyclotron harmonic resonant quasi-pure pitch-angle diffusion is constructed for parallel-propagating whistler mode waves, and the quasi-linear diffusion coefficient is prescribed in terms of the whistler mode wave spectral intensity. Numerical computations are performed for the specific case of energetic electrons interacting with a band of whistler mode turbulence at L ≈ 3. It is found that the quasi-pure pitch-angle diffusion driven by the whistler mode scatters energetic electrons from larger pitch angles into the loss cone and causes the pitch-angle distribution to evolve from pancake-shaped before the terrestrial storm to flat-topped during the main phase. This probably accounts for the quasi-isotropic pitch-angle distribution observed by the Combined Release and Radiation Effects Satellite (CRRES) spacecraft at L ≈ 3.
The Inductive Applications of Probability Calculus
Directory of Open Access Journals (Sweden)
Corrado Gini
2015-06-01
Full Text Available The Author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The Author highlights some misunderstandings related to inverting deductions drawn from probability distributions when investigating the causes of events.
Institute of Scientific and Technical Information of China (English)
陈刚; 王梦婕
2014-01-01
By standardizing the argument of the χ² probability density function with n degrees of freedom, the density can be expanded as √(2n)·χ²(x; n) = [1 + r1(t)/√n + r2(t)/n + r3(t)/(n√n) + r4(t)/n²]·φ(t) + o(1/n²), where φ(t) is the density function of the standard normal distribution and each ri(t) (1 ≤ i ≤ 4) is a polynomial in t of degree 3i. An approximate formula for the χ² density follows from this expansion. Integral recurrence relations for the power coefficients of φ(t) are then established, yielding an asymptotic expansion of the χ² distribution function. Finally, numerical calculations verify the effectiveness of these results in practical applications.
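The leading term of such an expansion is the familiar normal approximation to the χ² density. A small numerical sketch (the degrees of freedom are chosen arbitrarily) compares the exact density with this leading term near the mean:

```python
import math

def chi2_pdf(x, n):
    """Exact chi-square density with n degrees of freedom."""
    return x ** (n / 2 - 1) * math.exp(-x / 2) / (2 ** (n / 2) * math.gamma(n / 2))

def normal_leading_term(x, n):
    """Leading term of the expansion: sqrt(2n)*chi2_pdf(x; n) ~ phi(t),
    with t = (x - n)/sqrt(2n) the standardized variable."""
    t = (x - n) / math.sqrt(2 * n)
    phi = math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    return phi / math.sqrt(2 * n)

n = 100
x = n  # near the mean, where the approximation is best
exact = chi2_pdf(x, n)
approx = normal_leading_term(x, n)
print(exact, approx)  # the two densities agree to within a fraction of a percent
```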
Probably Almost Bayes Decisions
DEFF Research Database (Denmark)
Anoulova, S.; Fischer, Paul; Poelt, S.;
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Mueller, J.J.; Hansen, S; Lukowski, G.; Gast, K.
1995-01-01
Acrylic acid copolymers are potential carriers for drug delivery. The surface, surface rugosity and the absolute dimension of the particles are parameters that determine the binding of drugs or detergents, diffusion phenomena at the surface and the distribution of the carrier within the human body. The particle-size distribution and surface rugosity of the particles have been investigated by small-angle X-ray scattering and dynamic light scattering. Direct Fourier transform as well as a new s...
Ruggier, C. J.
1992-01-01
The probability of exceeding interference power levels and the duration of interference at the Deep Space Network (DSN) antenna is calculated parametrically when the state vector of an Earth-orbiting satellite over the DSN station view area is not known. A conditional probability distribution function is derived, transformed, and then convolved with the interference signal uncertainties to yield the probability distribution of interference at any given instant during the orbiter's mission period. The analysis is applicable to orbiting satellites having circular orbits with known altitude and inclination angle.
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Contributions to quantum probability
International Nuclear Information System (INIS)
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
Probability representations of fuzzy systems
Institute of Scientific and Technical Information of China (English)
LI Hongxing
2006-01-01
In this paper, the probabilistic significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution, and the Lukasiewicz distribution, are given; these distributions act as "inner kernels" of fuzzy systems. Furthermore, using properties of the probability distributions of fuzzy systems, it is demonstrated that the CRI method proposed by Zadeh for constructing fuzzy systems is basically reasonable and effective. The special role of uniform probability distributions in fuzzy systems is also characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of constructing fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimized reasoning, it promises broad application.
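The COG defuzzification discussed above is simply the membership-weighted mean, i.e. the expectation of the probability density obtained by normalizing the membership function (the "inner kernel" view). A minimal sketch with a hypothetical triangular membership function:

```python
def cog_defuzzify(xs, mu):
    """Center-of-gravity defuzzification: the membership-weighted mean
    of the grid points, equivalently the mean of the probability density
    obtained by normalizing the membership function mu."""
    total = sum(mu)
    return sum(x * m for x, m in zip(xs, mu)) / total

# A triangular membership function on [0, 2] peaking at 1 (hypothetical).
xs = [i / 100 for i in range(201)]
mu = [1 - abs(x - 1) for x in xs]
print(cog_defuzzify(xs, mu))  # symmetric membership, so the COG is 1.0
```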
Survival probability and ruin probability of a risk model
Institute of Scientific and Technical Information of China (English)
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the claim-occurrence process is a p-thinning process. Integral representations of the survival probability are obtained, and an explicit formula for the survival probability on the infinite interval is derived in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.
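For the classical Cramér-Lundberg model with exponential claims (a standard special case, not the p-thinning model of this paper), the ruin probability has a closed form that a finite-horizon Monte Carlo simulation can verify; the parameter values below are arbitrary:

```python
import math
import random

def ruin_probability_exact(u, theta, mu):
    """Classical model with exponential(mean mu) claims and safety
    loading theta: psi(u) = exp(-theta*u / ((1+theta)*mu)) / (1+theta)."""
    return math.exp(-theta * u / ((1 + theta) * mu)) / (1 + theta)

def ruin_probability_mc(u, theta, mu, lam=1.0, horizon=400.0, trials=5000):
    """Finite-horizon Monte Carlo approximation: simulate the surplus at
    claim epochs until ruin or until the horizon is reached."""
    c = (1 + theta) * lam * mu  # premium rate
    ruined = 0
    for _ in range(trials):
        surplus, t = u, 0.0
        while t < horizon:
            dt = random.expovariate(lam)                    # inter-claim time
            t += dt
            surplus += c * dt - random.expovariate(1 / mu)  # premium - claim
            if surplus < 0:
                ruined += 1
                break
    return ruined / trials

random.seed(4)
u, theta, mu = 5.0, 0.2, 1.0
exact = ruin_probability_exact(u, theta, mu)
est = ruin_probability_mc(u, theta, mu)
print(exact, est)  # the estimate should be close to the closed form
```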
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors.
Cosmological dynamics in tomographic probability representation
Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.
2004-01-01
The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...
Institute of Scientific and Technical Information of China (English)
梁莉; 赵琳娜; 巩远发; 包红军; 王成鑫; 王志
2011-01-01
Based on daily precipitation data from 158 stations over the Huaihe River basin for the summers of 1980-2007, the probability distributions of summer precipitation on first rainy days (no rain on the preceding day) and on consecutive rainy days (rain on the preceding day) were analyzed with the gamma distribution function for five sub-catchments: the upper Huaihe, the middle-upper Huaihe, the middle-lower Huaihe, the reach below Hongze Lake, and the Yishu River watershed. Comparison of the gamma probability density with the sample frequency at five representative stations (Xixian, Fuyang, Shangqiu, Huaian, and Lianyungang), together with Kolmogorov-Smirnov (K-S) tests, shows that the gamma distribution fits the conditional probability distributions of summer rainy days over the Huaihe basin well, and the probability distributions of the maximum daily precipitation within 1, 10, and 20 days derived recursively from the fitted distribution are regular and reasonable. Among the five sub-catchments, the upper Huaihe, the middle-lower Huaihe, and the Yishu River watershed are the most likely to see maximum daily precipitation of at least 10 mm, 25 mm, and 50 mm within 10 and 20 days.
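A gamma fit of the kind used above can be sketched with standard-library tools. The rainfall values here are synthetic stand-ins (the station records are not reproduced in the abstract), the fit is by the method of moments rather than the paper's procedure, and the exceedance probabilities are estimated by resampling from the fitted distribution:

```python
import random
import statistics

random.seed(0)
# Synthetic wet-day rainfall amounts (mm), standing in for station records.
rain = [random.gammavariate(0.8, 12.0) for _ in range(500)]

# Method-of-moments gamma fit: shape k = mean^2/var, scale theta = var/mean.
m = statistics.fmean(rain)
v = statistics.variance(rain)
k, theta = m * m / v, v / m

# Exceedance probabilities P(X >= x) for the paper's thresholds,
# estimated by resampling from the fitted gamma distribution.
draws = [random.gammavariate(k, theta) for _ in range(100_000)]
ps = {x: sum(d >= x for d in draws) / len(draws) for x in (10, 25, 50)}
for x, p in ps.items():
    print(f"P(daily rainfall >= {x} mm) = {p:.3f}")
```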
Energy Technology Data Exchange (ETDEWEB)
Schmieden, Kristof
2013-04-15
The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp → Z/γ* + X → μ+μ- + X. In total 4.7 fb-1 of proton-proton collisions at √(s) = 7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system and are compared to several simulations as well as recent results obtained in p anti p collisions. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.
International Nuclear Information System (INIS)
The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp → Z/γ* + X → μ+μ- + X. In total 4.7 fb-1 of proton-proton collisions at √(s) = 7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system. The comparisons to several simulations as well as recent results obtained in p anti p collisions are presented. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.
International Nuclear Information System (INIS)
A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D2CO→D2+CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states
Subjective probability models for lifetimes
Spizzichino, Fabio
2001-01-01
Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
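One classic instance is the probability that a random permutation has no fixed point (a derangement), which tends to 1/e as the number of items grows. A quick simulation (sizes arbitrary):

```python
import math
import random

def no_fixed_point_probability(n, trials=100_000):
    """Estimate the probability that a random permutation of n items
    has no fixed point; it tends to 1/e as n grows."""
    count = 0
    items = list(range(n))
    for _ in range(trials):
        p = items[:]
        random.shuffle(p)
        if all(p[i] != i for i in range(n)):
            count += 1
    return count / trials

random.seed(42)
est = no_fixed_point_probability(10)
print(est, 1 / math.e)  # both close to 0.368
```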
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to sparse systems, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
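The postulate that all configurations are equally probable amounts to uniform sampling over "stars and bars" compositions. A sketch (system sizes are arbitrary) confirms that in the dense regime the empty box is the single most likely occupancy, not the average:

```python
import random
from collections import Counter

def uniform_configuration(P, L):
    """One configuration of P indistinguishable balls in L distinguishable
    boxes, uniform over all C(P+L-1, L-1) configurations: choose L-1
    divider positions among P+L-1 slots ("stars and bars")."""
    dividers = sorted(random.sample(range(P + L - 1), L - 1))
    counts, prev = [], -1
    for d in dividers:
        counts.append(d - prev - 1)
        prev = d
    counts.append(P + L - 2 - prev)
    return counts

random.seed(7)
P, L = 100, 10  # dense system: far more balls than boxes
occupancy = Counter(uniform_configuration(P, L)[0] for _ in range(200_000))
mode = occupancy.most_common(1)[0][0]
print("most probable occupancy of a box:", mode)  # empty (0), not P/L = 10
```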
Energy Technology Data Exchange (ETDEWEB)
Serov, A. V., E-mail: serov@x4u.lebedev.ru [Russian Academy of Sciences, Lebedev Physical Institute (Russian Federation); Mamonov, I. A. [National Research Nuclear University MEPhI (Russian Federation); Kol’tsov, A. V., E-mail: koltsov@x4u.lebedev.ru [Russian Academy of Sciences, Lebedev Physical Institute (Russian Federation)
2015-10-15
The scattering of electrons by aluminum, copper, and lead foils, as well as by bimetallic aluminum-lead and aluminum-copper foils, has been studied experimentally. A microtron with an energy of particles of 7.4 MeV has been used as a source of electrons. The beam of particles incident on a target at small angles is split into particles reflected from the foil, which constitute a reflected beam, and particles crossing the foil, which constitute a refracted beam. The effect of the material and thickness of the foil, as well as the angle between the initial trajectory of the beam and the plane of the target, on the direction of motion and the angular divergence of the beam crossing the foil and the beam reflected from the foil has been analyzed. Furthermore, the effect of the sequence of metal layers in bimetallic films on the angles of refraction and reflection of the beam has been examined.
Rao, Zhi-Tao; Yuan, Feng; Li, Bing; Ma, Ning
2014-01-01
Objectives: This study aimed to explore the surface stress at the proximal ends of the ulna and radius at different elbow flexion angles using the resistance strain method. Methods: Eight fresh adult cadaveric elbows were tested. The forearms were fixed in a neutral position. Axial load increment experiments were conducted at four different elbow flexion angles (0°, 15°, 30°, and 45°). Surface strain was measured at six sites (tip, middle, and base of the coronoid process; back ulnar notch; olec...
Bratchenko, M I
2001-01-01
A novel method of Monte Carlo simulation of small-angle reflection of charged particles from solid surfaces has been developed. Instead of atomic-scale simulation of particle-surface collisions, the method treats the reflection macroscopically as a 'condensed history' event. Statistical parameters of the reflection are sampled from theoretical distributions over energy and angles. An efficient sampling algorithm based on a combination of the inverse probability distribution function method and the rejection method has been proposed and tested. As an example of application, the results of statistical modeling of particle flux enhancement near the bottom of a vertical Wehner cone are presented and compared with a simple geometrical model of specular reflection.
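The sampling combination mentioned above, inverse CDF where the distribution function is invertible and rejection otherwise, can be sketched as follows. The densities are hypothetical stand-ins, not the paper's reflection distributions:

```python
import math
import random

def sample_inverse_cdf():
    """Inverse-CDF sampling from p(x) = 2x on [0, 1]:
    F(x) = x^2, so x = sqrt(U) for U ~ Uniform(0, 1)."""
    return math.sqrt(random.random())

def sample_rejection(pdf, x_max, pdf_max):
    """Rejection sampling from a density on [0, x_max] bounded by pdf_max:
    accept a uniform candidate with probability pdf(x)/pdf_max."""
    while True:
        x = random.uniform(0.0, x_max)
        if random.random() * pdf_max <= pdf(x):
            return x

random.seed(3)
xs = [sample_inverse_cdf() for _ in range(100_000)]
angles = [sample_rejection(math.cos, math.pi / 2, 1.0) for _ in range(100_000)]
mean_x = sum(xs) / len(xs)          # E[X] = 2/3 for p(x) = 2x
mean_a = sum(angles) / len(angles)  # E[A] = pi/2 - 1 for p(a) = cos(a)
print(mean_x, mean_a)
```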
Institute of Scientific and Technical Information of China (English)
张永栋; 林俊; 朱天宝; 张海青; 朱智勇
2016-01-01
Background: TRISO (tristructural isotropic) coated particles embedded in spherical fuel elements are used in the solid-fuel molten salt reactor. The temperature distribution during operation can affect the failure probability of TRISO particles embedded in different parts of a fuel element. Purpose: This study aims to investigate the effect of the temperature distribution on the failure probability of coated fuel particles. Methods: A micro-volume element analysis of the temperature distribution effect on the failure probability of coated particles was carried out for the first time, and the impact of the spherical fuel element size on the average failure probability of TRISO particles was also evaluated. Results: At a given power density, the calculated failure probability of TRISO particles deviates by an order of magnitude depending on whether the core temperature or the average temperature of the fuel element is used to compute the average failure probability. At the same power density and burnup, the average failure probability of coated particles could be lowered by two orders of magnitude by reducing the diameter of the fuel element by 1 cm. Conclusion: It is necessary to take the temperature distribution into account when calculating the failure probability of coated fuel particles. In addition, the average failure probability of coated fuel particles can be lowered by reducing the size of the fuel element, which may be a proper way to keep fuel elements working at high power densities.
Work on probability distribution of breakdown of NC press
Institute of Scientific and Technical Information of China (English)
张强; 马立强; 贾亚洲
2001-01-01
Special software for the reliability analysis of NC machine tools has been developed, together with an operation and maintenance database for NC presses. The failure data verify that the breakdown distribution of the NC press follows an exponential distribution, and a method for calculating the reliability characteristics of the NC press is given.
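Under the exponential model verified above, the reliability characteristics reduce to closed forms. A minimal sketch with hypothetical time-between-failure data (the paper's database is not public):

```python
import math
import statistics

# Hypothetical time-between-failure records (hours) for an NC press.
tbf = [120.0, 340.0, 95.0, 210.0, 480.0, 60.0, 150.0, 275.0]

# For an exponential failure model, the MLE of the failure rate is
# lambda = 1/MTBF, and reliability over a mission time t is e^(-lambda*t).
mtbf = statistics.fmean(tbf)
lam = 1.0 / mtbf

def reliability(t):
    """Probability of surviving a mission of duration t hours."""
    return math.exp(-lam * t)

print(f"MTBF = {mtbf:.1f} h, failure rate = {lam:.5f} 1/h")
print(f"Reliability over a 100 h shift: {reliability(100):.3f}")
```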
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for the vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are then derived such that the probability of exceeding the vibration criteria VC-E and VC-D is less than 0.04.
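For a zero-mean Gaussian displacement, the probability of staying within a vibration criterion has a closed form via the error function. A sketch with hypothetical numbers (the actual VC-D/VC-E criteria are velocity limits defined in cleanroom-vibration guidelines and are not reproduced here):

```python
import math

def prob_within_criterion(sigma, criterion):
    """For a zero-mean Gaussian displacement with standard deviation sigma,
    P(|x| < criterion) = erf(criterion / (sigma * sqrt(2)))."""
    return math.erf(criterion / (sigma * math.sqrt(2.0)))

# Hypothetical numbers: RMS displacement 0.05 um against a 0.125 um limit.
p = prob_within_criterion(0.05, 0.125)
print(f"P(|displacement| < criterion) = {p:.4f}")
print(f"Probability of exceeding: {1 - p:.4f}")
```

As a sanity check, the familiar two-sided normal quantile is recovered: P(|Z| < 1.96) ≈ 0.95 for a unit-variance displacement.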
A lattice determination of gA and ⟨x⟩ from overlap fermions
International Nuclear Information System (INIS)
We present results for the nucleon's axial charge gA and the first moment ⟨x⟩ of the unpolarized parton distribution function from a simulation of quenched overlap fermions. (orig.)
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
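Two standard strictly proper scoring rules of the kind the paper builds on can be computed directly; the forecast numbers below are hypothetical:

```python
import math

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and the 0/1
    outcomes; a strictly proper scoring rule (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def log_score(forecasts, outcomes):
    """Negative mean log-likelihood of the outcomes; also strictly proper."""
    return -sum(o * math.log(p) + (1 - o) * math.log(1 - p)
                for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecasts: a sharp forecaster vs. a hedging one.
outcomes = [1, 0, 1, 1, 0]
sharp = [0.9, 0.1, 0.8, 0.7, 0.2]
vague = [0.6, 0.4, 0.6, 0.6, 0.4]
print(brier_score(sharp, outcomes), brier_score(vague, outcomes))
print(log_score(sharp, outcomes), log_score(vague, outcomes))
```

Both rules reward the sharper, well-calibrated forecaster with a lower score.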
Benci, Vieri; Wenmackers, Sylvia
2011-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
Institute of Scientific and Technical Information of China (English)
苏布达; Marco Gemmer; 姜彤
2008-01-01
Based on the daily observational precipitation data of 147 stations in the Yangtze River basin for 1960-2005, and the projected daily data of 79 grids from ECHAM5/MPI-OM for the 20th century, time series of precipitation extremes, namely the annual maximum (AM) and the Munger index (MI), were constructed. The distribution features of precipitation extremes were analyzed based on the two index series. The results show that (1) the intensity and probability of extreme heavy precipitation are higher in the middle Mintuo River sub-catchment, the Dongting Lake area, the mid-lower main stream section of the Yangtze River, and the southeastern Poyang Lake sub-catchment, whereas the intensity and probability of drought events are higher in the mid-lower Jinsha River sub-catchment and the Jialing River sub-catchment; (2) compared with the observational data, the averaged value of AM is higher but the deviation coefficient is lower in the projected data, and the center of precipitation extremes moves northwards; (3) in spite of certain differences in the spatial distributions of observed and projected precipitation extremes, applying the Generalized Extreme Value (GEV) and Wakeby (WAK) models with the method of L-moment estimation (LME) to the precipitation extremes shows that the WAK model simulates the probability distribution of precipitation extremes calculated from both observed and projected data quite well. The WAK distribution could therefore be a useful tool for estimating precipitation extremes in the Yangtze River basin under future climate scenarios.
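The L-moment approach to fitting a GEV distribution can be sketched with Hosking's classical rational approximation for the shape parameter. This is a generic textbook recipe under Hosking's sign convention for the shape k, not necessarily the paper's exact procedure, and the annual-maximum series is synthetic:

```python
import math
import random

def sample_l_moments(data):
    """First two sample L-moments and the L-skewness, via unbiased
    probability-weighted moments (data need not be pre-sorted)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2  # mean, L-scale, L-skewness

def fit_gev(data):
    """Approximate GEV parameters (location mu, scale sigma, shape k in
    Hosking's convention) from L-moments via Hosking's approximation."""
    l1, l2, t3 = sample_l_moments(data)
    c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c
    sigma = l2 * k / ((1 - 2.0 ** (-k)) * math.gamma(1 + k))
    mu = l1 - sigma * (1 - math.gamma(1 + k)) / k
    return mu, sigma, k

# Synthetic annual-maximum series from a Gumbel law (GEV with k = 0),
# standing in for the paper's AM precipitation series.
random.seed(5)
am = [10 - 3 * math.log(-math.log(random.random())) for _ in range(2000)]
mu, sigma, k = fit_gev(am)
print(mu, sigma, k)  # should roughly recover mu=10, sigma=3, k near 0
```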
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an...
Gu, G.-F.; Chen, W.; Zhou, W.-X.
2007-05-01
The statistical properties of the bid-ask spread of a frequently traded Chinese stock listed on the Shenzhen Stock Exchange are investigated using the limit-order book data. Three different definitions of spread are considered, based on the time right before transactions, the times at which the highest buying price or the lowest selling price changes, and a fixed time interval. The results are qualitatively similar whether linear or logarithmic prices are used. The average spread exhibits evident intraday patterns, consisting of a big L-shape in the morning session and a small L-shape in the afternoon. The distributions of the spread under the different definitions decay as power laws. The tail exponents of spreads at the transaction level are well within the interval (2,3), and those of the average spreads are well in line with the inverse cubic law for different time intervals. Based on detrended fluctuation analysis, we find evidence of long memory in the bid-ask spread time series for all three definitions, even after removal of the intraday pattern. Using the classical box-counting approach for multifractal analysis, we show that the time series of the bid-ask spread does not possess a multifractal nature.
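Tail exponents like those reported in this abstract are commonly obtained with estimators such as Hill's. Below is a minimal sketch (not the authors' code, and ignoring the delicate choice of the cutoff k) of the Hill estimator for the power-law exponent of a positive-valued sample:

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the power-law tail exponent alpha,
    computed from the k largest observations of the sample."""
    x = sorted(data, reverse=True)
    # average log-excess of the top-k order statistics over the (k+1)-th
    avg_log = sum(math.log(x[i] / x[k]) for i in range(k)) / k
    return 1.0 / avg_log
```

For a sample with an exact Pareto tail of exponent alpha, the log-excesses are approximately exponential with rate alpha, so the estimator is consistent with standard error roughly alpha/sqrt(k).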
Gu, G F; Zhou, W X; Chen, Wei; Gu, Gao-Feng; Zhou, Wei-Xing
2006-01-01
The statistical properties of the bid-ask spread of a frequently traded Chinese stock listed on the Shenzhen Stock Exchange are investigated using the limit-order book data. Three different definitions of spread are considered, based on the time right before transactions, the times at which the highest buying price or the lowest selling price changes, and a fixed time interval. The results are qualitatively similar whether linear or logarithmic prices are used. The average spread exhibits evident intraday patterns, consisting of a big L-shape in the morning and a small L-shape in the afternoon. The distributions of the spread under the different definitions decay as power laws. The tail exponents of spreads at the transaction level are well within the interval $(2,3)$, and those of the average spreads are well in line with the inverse cubic law for different time intervals. Based on detrended fluctuation analysis, we find evidence of long memory in the bid-ask spread time series for all three definitions, even aft...
International Nuclear Information System (INIS)
The relevance of concepts brought to mind by stimulus terms concerning atomic energy and radiation utilization was investigated to learn how people understand the present status of nuclear technology. The relevance of concepts was defined as the frequency distribution of words that came to mind immediately after seeing selected terms needed for present-day life as well as for nuclear engineering. The analysis of the knowledge structure shows that the concept of atomic energy is closely related to that of electric power generation; that an understanding of nuclear power utilization may be promoted in relation to an understanding of energy and environmental problems, because the concepts of energy, atomic energy, electric power generation, and the natural environment are closely interrelated; and that the concept of radiation has various associations with harmful radiological health effects, but little relation to industrial, agricultural, and other beneficial uses apart from nuclear power generation and medical applications. The investigation also made clear that studies on natural radiation may be important for promoting an understanding of radiation utilization, because the concept of the natural environment is not yet related to that of natural radiation. (author)
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The user, however, has to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several graphical tests based on the normal probability plot and of the most popular non-graphical tests, Anderson-Darling and Shapiro-Wilk, are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in which circumstances. An example is provided to illustrate the methods.
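Simultaneous intervals of the kind described here can be approximated by Monte Carlo calibration. The sketch below is an illustration under simplifying assumptions, not the authors' method: it calibrates a constant-width band around the normal order-statistic medians for a standard normal sample of size n (in practice the sample must first be standardized, and the paper's exact intervals allow the width to vary across positions).

```python
import random
from statistics import NormalDist

def simultaneous_band(n, alpha=0.05, sims=2000, seed=0):
    """Constant-width simultaneous band for a normal probability plot
    of a standard normal sample of size n, calibrated by simulation."""
    rng = random.Random(seed)
    nd = NormalDist()
    # approximate medians of the normal order statistics (Filliben's formula)
    med = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    # Monte Carlo distribution of the largest deviation across all positions
    max_dev = []
    for _ in range(sims):
        z = sorted(rng.gauss(0.0, 1.0) for _ in range(n))
        max_dev.append(max(abs(z[i] - med[i]) for i in range(n)))
    max_dev.sort()
    c = max_dev[int((1 - alpha) * sims)]
    return [(m - c, m + c) for m in med]
```

By construction, a fresh standard normal sample falls entirely inside the band with probability approximately 1-α.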
Institute of Scientific and Technical Information of China (English)
邓厚斌; 葛毅; 范璐敏; 刘晓雯; 李盈盈
2012-01-01
In order to carry out depreciation accounting of medical equipment reasonably, this paper analyzes and compares the advantages and disadvantages of several common depreciation methods. Combining these with the use efficiency of medical equipment, it proposes a distribution rule for the static depreciation rate fitted by the χ² probability density function, introduces a benchmark rate of return on funds, and establishes a dynamic depreciation method for medical equipment.
System Downlink Outage Probability Analysis in Distributed Antenna Systems
Institute of Scientific and Technical Information of China (English)
王俊波; 王金元; 陈华敏; 陈明
2011-01-01
The main focus of this paper is to investigate the system downlink outage probability in distributed antenna systems (DAS). Firstly, this paper establishes a composite channel model which takes into account three factors: path loss, lognormal shadowing and Rayleigh fading. Then, by making use of the moment generating function (MGF), this paper derives the probability density function (PDF) of the output signal-to-noise ratio (SNR) after maximal ratio combining (MRC) at the receiver. After that, an approximate analytical expression of the outage probability for a mobile station (MS) at a given position is derived under an antenna selective transmission (ST) scheme. Further, considering the distribution of MSs in the system, a closed-form expression of the system outage probability is obtained. Numerical results show that the closed-form expression provides sufficient precision for evaluating the outage probability performance of DAS.
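Closed-form expressions like the one derived in this paper are typically validated against Monte Carlo simulation. The following hedged sketch (illustrative parameter values and function names, not those of the paper) estimates the downlink outage probability for an MS served by the best of several distributed antennas over a composite path-loss / lognormal-shadowing / Rayleigh-fading channel:

```python
import random

def outage_probability(dists, snr_th=1.0, path_exp=4.0, shadow_db=8.0,
                       tx_snr=1e6, trials=20000, seed=1):
    """Monte Carlo outage probability under selective transmission:
    the antenna with the largest instantaneous SNR serves the user.
    dists: distances (same units) from the MS to each distributed antenna."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        best = 0.0
        for d in dists:
            shadow = 10.0 ** (rng.gauss(0.0, shadow_db) / 10.0)  # lognormal shadowing
            fading = rng.expovariate(1.0)                        # |h|^2 for Rayleigh fading
            snr = tx_snr * d ** (-path_exp) * shadow * fading
            best = max(best, snr)
        if best < snr_th:
            outages += 1
    return outages / trials
```

Moving the MS farther from all antennas raises the outage probability, which gives a quick sanity check on any analytical expression.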
Nuclear data uncertainties: I, Basic concepts of probability
Energy Technology Data Exchange (ETDEWEB)
Smith, D.L.
1988-12-01
Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Introduction to probability models
Ross, Sheldon M
2006-01-01
Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables them to think probabilistically. The other attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v...
Molecular contingencies: reinforcement probability.
Hale, J M; Shimp, C P
1975-11-01
Pigeons obtained food by responding in a discrete-trials two-choice probability-learning experiment involving temporal stimuli. A given response alternative, a left- or right-key peck, had 11 associated reinforcement probabilities within each session. Reinforcement probability for a choice was an increasing or a decreasing function of the time interval immediately preceding the choice. The 11 equiprobable temporal stimuli ranged from 1 to 11 sec in 1-sec classes. Preference tended to deviate from probability matching in the direction of maximizing; i.e., the percentage of choices of the preferred response alternative tended to exceed the probability of reinforcement for that alternative. This result was qualitatively consistent with probability-learning experiments using visual stimuli. The result is consistent with a molecular analysis of operant behavior and poses a difficulty for molar theories holding that local variations in reinforcement probability may safely be disregarded in the analysis of behavior maintained by operant paradigms. PMID:16811883
Institute of Scientific and Technical Information of China (English)
杨娟; 卞保民; 何幼权; 贺安之; 王亚伟
2001-01-01
A more reasonable particle counter model suitable for an airborne particle counter is constructed in probability space. The amplitude spectrum line and the amplitude spectrum function are proposed for the airborne-particle scattering light signal, and their properties are studied in probability space. Based on the uncorrelatedness of monodisperse particles and the probability theory of the signal processing model, the essence of this signal transformation is summarized in a total probability equation. An experimental study has been performed with the Y09 laser airborne particle counter.
Zangooei Dovom, Hossein; Shafahi, Yousef; Zangooei Dovom, Mehdi
2013-01-01
Several studies have investigated road traffic deaths, but few have compared them by road user type. Iran, with an estimated 44 road traffic deaths per 100,000 population in 2002, had more road traffic deaths than any other country for which reliable estimates can be made. The present study was therefore conducted on road death data, identifying the distribution of fatal accidents by age, gender and head injury, as well as the influence of age and gender on deaths at the accident scene, for all road user groups. The data used in this study are fatal road accidents recorded by forensic medicine experts of the Khorasan Razavi province in Mashhad, the capital of the province, the second largest city and the largest place of pilgrimage, immigration and tourism in Iran. The chi-square test and odds ratios were used to identify the relation of place of death with age and gender in 2495 fatal road accidents from 2006 to 2009. The t-test and analysis of variance were employed for the continuous variable, age, to compare males' and females' mean age for all road user categories. For both genders, all three groups of fatalities (pedestrians, motorcyclists and motor vehicle occupants) had a peak at ages 21-30. The youngest were male motorcyclists (mean age = 28). Elderly pedestrians also accounted for a large share of road deaths. The overall male/female ratio was 3.41, and the highest male/female ratio was among motorcyclists (14). The overall ratio of head injuries to other injuries (torso and underbody) was 2.51, and pedestrians had the largest proportion of head injuries (38.2%). Regarding death at the accident scene, for all road users, gender did not have any significant relation with death at the scene (P-value > 0.1); on the contrary, age had a significant relation (P-value < 0.05). Females were more vulnerable at accident scenes (male/female ratio at the accident scene < 1). Pedestrians aged 21-30, motorcyclists aged 41-50 and motor vehicle occupants aged 31-40 died most often at accident scenes. Identifying the most
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Directory of Open Access Journals (Sweden)
José Alves Junqueira Júnior
2007-06-01
Full Text Available Nowadays, irrigation is one of the most important agricultural techniques. However, irrigation cannot be the only source of water supply for crops, because the system may then be over-designed, which increases installation costs. One alternative is to account for rainfall at a given probability level, i.e. the probable rainfall, which makes complementary irrigation possible. This study aims to characterize the probable rainfall for the region of Madre de Deus, MG, comparing four probability distribution models (Gamma, Normal, and Log-normal with 2 and 3 parameters). Daily rainfall depths were totalized over periods of 10, 15 and 30 days and evaluated at 13 probability levels, for a 57-year historical series of observations (1942-1999). The Kolmogorov-Smirnov test was applied to assess the goodness of fit of the models and to determine which is most adequate for each historical series. The probability models fitted the rainy season better; the 3-parameter Log-normal distribution was the most adequate for the monthly series, and the Gamma distribution for the 15-day and 10-day periods.
International Nuclear Information System (INIS)
In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state, without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and of Palma, Suominen, and Ekert, I apply the persistence-probability formalism to a generic single-qubit device coupled to a thermal environment, and also to a trapped-ion quantum register coupled to the ion vibrational modes. (author)
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Perception of perspective angles
Erkelens, C.J.
2015-01-01
We perceive perspective angles, that is, angles that have an orientation in depth, differently from what they are in physical space. Extreme examples are angles between rails of a railway line or between lane dividers of a long and straight road. In this study, subjects judged perspective angles bet
Tight Bernoulli tail probability bounds
Dzindzalieta, Dainius
2014-01-01
The purpose of the dissertation is to prove universal tight bounds on deviation-from-the-mean probabilities for functions of random variables. Universal means that the bounds are uniform with respect to a class of distributions, the number of variables, and other parameters. The bounds are called tight if one can construct a sequence of random variables for which the upper bounds are attained. Such inequalities are useful, for example, in insurance mathematics and for constructing...
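The flavor of such results can be illustrated for sums of Bernoulli variables by comparing an exact tail probability with the classical (non-tight) Hoeffding bound, which the dissertation's tight bounds improve upon. A minimal sketch, with illustrative function names:

```python
import math
from math import comb

def binom_tail(n, p, k):
    """Exact P(S_n >= k) for S_n ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

def hoeffding_tail(n, p, k):
    """Hoeffding upper bound exp(-2 n t^2) for P(S_n / n - p >= t)."""
    t = k / n - p
    return math.exp(-2.0 * n * t * t) if t > 0 else 1.0
```

For n = 100 fair coin flips, the exact probability of 60 or more heads is about 0.028, while the Hoeffding bound gives exp(-2) ≈ 0.135; the gap is what tight bounds close.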
Advanced Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob
Spiral Arm Pitch Angle and its Significance for Theories of Galactic Structure
Kennefick, D.
2014-03-01
I argue that the pitch angle of spiral arms in disk galaxies is one of a number of characteristics of galaxies (which we may refer to as “traits” of a galaxy) which correlate reasonably well with each other, most of them probably determined by the mass of the galaxy's central bulge. Although often dealt with qualitatively in the past, as in Hubble's galaxy classification scheme, quantifying pitch angle opens up the prospect of using it as a probe of the mass distribution of a galaxy and as a tool for testing various theories of the origins of spiral structure in disk galaxies.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Dynamic update with probabilities
J. van Benthem; J. Gerbrandy; B. Kooi
2009-01-01
Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant pr
Directory of Open Access Journals (Sweden)
Elena Druica
2007-05-01
Full Text Available The science of probabilities has earned a special place because it has tried, through its concepts, to build a bridge between theory and experimentation. Although it is a formal notion that by definition does not invite polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. The economic literature usually discusses two interpretations of the concept of probability: the objective interpretation, often found under the name of frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications for economics.
International Nuclear Information System (INIS)
On the territory of the Republic of Macedonia there are several types of rainfall regime: Mediterranean, modified Mediterranean, modified Continental and Continental. In the period 1951-2000 there were significant climate variations of the pluviometric conditions in many parts of the world, and these variations also occurred on the territory of the Republic of Macedonia. This work presents cumulative probabilities of the changes in seasonal, monthly and yearly precipitation values in the network of the main meteorological-climatological stations (with professional observers), using the Gaussian law for the normal distribution with recurrence values from 5% to 95% in steps of 5%, classified by the 5%, 50% and 95% recurrence values as well as the values between the highest and lowest values of the cumulative probability. Isohyet maps for the average, wettest and driest years are also given as characteristics of the pluviometric regime, which are necessary for defining the climatic and hydrometeorological conditions of the catchment areas in the Republic of Macedonia. (Author)
Imaging electron dynamics with time- and angle-resolved photoelectron spectroscopy
Popova-Gorelova, Daria; Küpper, Jochen; Santra, Robin
2016-07-01
We theoretically study how time- and angle-resolved photoemission spectroscopy can be applied for imaging coherent electron dynamics in molecules. We consider a process in which a pump pulse triggers coherent electronic dynamics in a molecule by creating a valence electron hole. An ultrashort extreme ultraviolet probe pulse creates a second electron hole in the molecule. Information about the electron dynamics is accessed by analyzing angular distributions of photoemission probabilities at a fixed photoelectron energy. We demonstrate that a rigorous theoretical analysis, which takes into account the indistinguishability of transitions induced by the ultrashort, broadband probe pulse and electron hole correlation effects, is necessary for the interpretation of time- and angle-resolved photoelectron spectra. We show how a Fourier analysis of time- and angle-resolved photoelectron spectra from a molecule can be applied to follow its electron dynamics by considering photoelectron distributions from an indole molecular cation with coherent electron dynamics.
Imaging electron dynamics with time- and angle-resolved photoelectron spectroscopy
Popova-Gorelova, Daria; Santra, Robin
2016-01-01
We theoretically study how time- and angle-resolved photoemission spectroscopy can be applied for imaging coherent electron dynamics in molecules. We consider a process in which a pump pulse triggers coherent electronic dynamics in a molecule by creating a valence electron hole. An ultrashort extreme ultraviolet (XUV) probe pulse creates a second electron hole in the molecule. Information about the electron dynamics is accessed by analyzing angular distributions of photoemission probabilities at a fixed photoelectron energy. We demonstrate that a rigorous theoretical analysis, which takes into account the indistinguishability of transitions induced by the ultrashort, broadband probe pulse and electron hole correlation effects, is necessary for the interpretation of time- and angle-resolved photoelectron spectra. We show how a Fourier analysis of time- and angle-resolved photoelectron spectra from a molecule can be applied to follow its electron dynamics by considering photoelectron distributions from an indol...
Ghosh, Indranil
2011-01-01
Consider a discrete bivariate random variable (X, Y) with possible values x[subscript 1], x[subscript 2],..., x[subscript I] for X and y[subscript 1], y[subscript 2],..., y[subscript J] for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…
Abstract Models of Probability
Maximov, V. M.
2001-12-01
Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements have been proposed for such notions as event, independence, random variable, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists have tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly this leads to another concept of randomness, of a nature different from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness of some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.
Institute of Scientific and Technical Information of China (English)
赵旭; 程维虎; 李婧兰
2012-01-01
The generalized Pareto distribution (GPD) is one of the most important distributions in statistical analysis and has been widely applied in finance, insurance, hydrology, meteorology and other fields. Traditional estimation methods, such as maximum likelihood (ML), the method of moments (MOM) and the method of probability weighted moments (PWM), have been extensively applied, but their use is often restricted. Alternative approaches (e.g., generalized probability weighted moments, L-moments and LH-moments) exist, but they require complete, non-censored samples, whereas censored samples are often encountered in hydrology and meteorology. In this article, we propose a computationally simple method for fitting the GPD from censored data that is resistant to extremely small or large outliers, i.e., robust with respect to the lower and upper breakdown points. The method is based on probability weighted moments: we first solve for the shape parameter estimator, which has high precision, and then obtain the location and scale parameters of the GPD. Simulation studies show that the proposed method performs well compared to traditional techniques.
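For complete (non-censored) samples, the PWM estimators for a zero-location GPD reduce to simple functions of the first two sample L-moments. The sketch below shows this baseline only; the paper's contribution is the extension to censored samples and the robustness analysis, which this sketch does not implement.

```python
import random

def gpd_pwm_fit(data):
    """PWM estimators for a zero-location GPD with CDF
    F(x) = 1 - (1 - k*x/sigma)**(1/k) (Hosking's shape convention, k = -xi).
    Returns (sigma, k), using lambda1/lambda2 = 2 + k and sigma = lambda1*(1 + k)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    l1, l2 = b0, 2 * b1 - b0      # first two sample L-moments
    k = l1 / l2 - 2.0
    sigma = l1 * (1.0 + k)
    return sigma, k
```

With censoring, the probability weighted moments must instead be estimated from the observed (partial) order statistics, which is where the proposed method departs from this baseline.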
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Stochastic Programming with Probability
Andrieu, Laetitia; Vázquez-Abad, Felisa
2007-01-01
In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...
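Of the two gradient estimators mentioned, the finite difference technique can be sketched on a toy failure model. The model here (failure when a standard normal X exceeds a threshold theta, with common random numbers across the two evaluations) is an illustrative assumption, not the application treated in the paper:

```python
import numpy as np
from math import erf, exp, pi, sqrt

rng = np.random.default_rng(1)
X = rng.standard_normal(200_000)     # common random numbers reused for both evaluations

def p_fail(theta):
    # Monte Carlo estimate of the failure probability P(X > theta)
    return np.mean(X > theta)

theta, h = 1.0, 0.05
# central finite difference estimate of dP/dtheta
grad_fd = (p_fail(theta + h) - p_fail(theta - h)) / (2 * h)

# exact values for this toy model: P = 1 - Phi(theta), dP/dtheta = -phi(theta)
p_exact = 0.5 * (1 - erf(theta / sqrt(2)))
grad_exact = -exp(-theta ** 2 / 2) / sqrt(2 * pi)
print(p_fail(theta), p_exact, grad_fd, grad_exact)
```

Reusing the same sample X for both evaluations (common random numbers) keeps the difference from being swamped by independent sampling noise; the bias of order h² versus variance trade-off in h is exactly the tuning issue the convergence analysis addresses.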
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...
Marshall, Jennings B.
2007-01-01
This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
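The expected-value computation described can be sketched for American roulette (38 pockets: 1-36, 0 and 00); the helper function and bet selection below are illustrative choices, not taken from the article:

```python
from fractions import Fraction

def expected_value(p_win, payout, stake=1):
    """Expected profit per unit stake: a win pays `payout`, a loss forfeits the stake."""
    return p_win * payout - (1 - p_win) * stake

# American roulette has 38 equally likely pockets
straight_up = expected_value(Fraction(1, 38), 35)   # single-number bet, pays 35:1
red = expected_value(Fraction(18, 38), 1)           # even-money bet on red
print(straight_up, red)   # both -1/19: about -5.26 cents lost per dollar bet
```

That every standard bet shares the same negative expectation, despite very different variances, is the classroom point the roulette example makes vivid.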
Characteristic Functions over C*-Probability Spaces
Institute of Scientific and Technical Information of China (English)
王勤; 李绍宽
2003-01-01
Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
Representing Uncertainty by Probability and Possibility
DEFF Research Database (Denmark)
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer...
A New Method for Judging Tree Horizontal Distribution Patterns by the Uniform Angle Index
Institute of Scientific and Technical Information of China (English)
赵中华; 惠刚盈; 胡艳波; 张弓乔
2016-01-01
[Objective] This paper proposes a new method to judge tree horizontal distribution patterns by the uniform angle index, in order to further improve the theory of using the uniform angle index for this purpose. [Method] 6,000 simulated stands with an area of 70 m × 70 m and with different densities and distribution patterns were produced by the stand spatial structure analysis software Winkelmass; two field plots of broad-leaved Korean pine forest in northeast China were then used to verify the accuracy of the new method for judging stand and population horizontal distribution patterns, and the results were compared with the aggregation index R and Ripley's L. [Result] Based on the finding that the mean uniform angle index (W) of a randomly distributed stand follows a normal distribution, and on its relationship with the standard deviation, a new method of judging stand/population horizontal distribution patterns is proposed. For the simulated stands, the coincidence rate of the normal-distribution test method was 100% across different densities. For the two field stands/populations, its judgments agreed completely with Ripley's L function point pattern analysis, whereas the disagreement between the aggregation index R and Ripley's L increased markedly, showing that the confidence level has a clear influence on the judgment of horizontal distribution patterns. [Conclusion] The proposed normal-distribution test of the stand (species) mean uniform angle index overcomes the problem that a unified confidence interval is unsuitable for judging the horizontal distribution patterns of sampling surveys or of small populations within a community; it further improves the theory of the uniform angle index and enhances its accuracy and range of application.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Asteroidal collision probabilities
Bottke, William F., Jr.; Greenberg, Richard
1993-01-01
Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
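The classical relationship mentioned at the end, a probability density proportional to the time spent on small sections of an orbit, can be illustrated with a harmonic oscillator. This toy example is an assumption for illustration, not the fluid theory of the paper: for x(t) = A sin(ωt), the sojourn-time density is p(x) = 1/(π √(A² − x²)).

```python
import numpy as np

A, omega = 1.0, 2 * np.pi       # amplitude and angular frequency (period = 1)
t = np.linspace(0.0, 1.0, 1_000_000, endpoint=False)   # one full period
x = A * np.sin(omega * t)

# fraction of the period the oscillator spends with position in [a, b]
a, b = 0.2, 0.5
time_fraction = np.mean((x > a) & (x < b))

# analytic sojourn density p(x) = 1 / (pi * sqrt(A**2 - x**2)) integrated over [a, b]
predicted = (np.arcsin(b / A) - np.arcsin(a / A)) / np.pi
print(time_fraction, predicted)
```

The density peaks at the turning points ±A, where the particle moves slowly and hence lingers, mirroring the orbit-time interpretation of |ψ|² discussed in the paper.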
Deshler, Terry
2016-04-01
Balloon-borne optical particle counters were used to make in situ size-resolved particle concentration measurements within polar stratospheric clouds (PSCs) over 20 years in the Antarctic and over 10 years in the Arctic. The measurements were made primarily during the late winter in the Antarctic and in the early and mid-winter in the Arctic. Measurements in early and mid-winter were also made during 5 years in the Antarctic. For the analysis, bimodal lognormal size distributions are fit to 250-meter averages of the particle concentration data. The characteristics of these fits, along with temperature and the water and nitric acid vapor mixing ratios, are used to classify the PSC observations as NAT, STS, ice, or some mixture of these. The vapor mixing ratios are obtained from satellite measurements when possible; otherwise assumptions are made. This classification of the data is used to construct probability density functions for NAT, STS, and ice number concentration, median radius and distribution width for mid- and late-winter clouds in the Antarctic and for early and mid-winter clouds in the Arctic. Additional analysis is focused on characterizing the temperature histories associated with the particle classes and the different time periods. The results from these analyses will be presented, and should be useful for setting bounds for retrievals of PSC properties from remote measurements and for constraining model representations of PSCs.
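The bimodal lognormal size distributions fitted to the counter data have the standard form sketched below; the mode parameters used here are hypothetical placeholders for illustration, not retrieved PSC values:

```python
import numpy as np

def bimodal_lognormal(r, N, rm, sg):
    """dN/dln(r) for a sum of lognormal modes, one (N_i, rm_i, sg_i) triple per
    mode: total number, median radius, geometric standard deviation."""
    out = np.zeros_like(r)
    for Ni, rmi, sgi in zip(N, rm, sg):
        out += Ni / (np.sqrt(2 * np.pi) * np.log(sgi)) * \
               np.exp(-0.5 * (np.log(r / rmi) / np.log(sgi)) ** 2)
    return out

r = np.logspace(-3, 1, 2000)        # radius grid, micrometers
# hypothetical two-mode example: small background mode plus a sparse large mode
dndlnr = bimodal_lognormal(r, N=[10.0, 0.1], rm=[0.08, 2.0], sg=[1.6, 1.3])

# trapezoidal integration over ln(r) recovers the total number N1 + N2
lnr = np.log(r)
total = float(np.sum(0.5 * (dndlnr[1:] + dndlnr[:-1]) * np.diff(lnr)))
print(total)
```

Fitting this form to 250-meter concentration averages yields the per-mode number, median radius and width whose probability density functions the study compiles.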
Bollenbacher, Gary; Guptill, James D.
1999-01-01
This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
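A minimal Monte Carlo sketch of the encounter-plane collision probability follows. Unlike the report's closed-form development, it assumes an isotropic position covariance, and the miss distance, sigma and combined hard-body radius are hypothetical numbers chosen for illustration:

```python
import numpy as np

def collision_probability(miss, sigma, radius, n=1_000_000, seed=2):
    """Monte Carlo estimate of collision probability in the encounter plane:
    the relative position is drawn from N(miss, sigma**2 * I), and a collision
    occurs if it falls within the combined hard-body radius of the two objects."""
    rng = np.random.default_rng(seed)
    pos = miss + sigma * rng.standard_normal((n, 2))
    return np.mean(np.hypot(pos[:, 0], pos[:, 1]) < radius)

# hypothetical encounter: 1 km nominal miss, 0.5 km position sigma, 20 m combined radius
p = collision_probability(miss=np.array([1.0, 0.0]), sigma=0.5, radius=0.02)
print(p)
```

For a general (anisotropic) covariance matrix one would replace the isotropic draw with samples from the full 2-D Gaussian, which is the situation the report's closed-form solution handles analytically.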
Directory of Open Access Journals (Sweden)
José A. Raynal
2005-01-01
An alternative method for estimating the parameters of the general extreme value (GEV) probability distribution function, using the method of probability weighted moments, is compared with existing methods for the analysis of maximum annual floods. The proposed method shows better modeling flexibility than the existing methods and has been applied to a large number of samples of maximum annual floods with no observed difficulty. The paper contains numerical examples of parameter estimation using the two methodologies. The results produced by both methods are practically the same, both in the parameter estimation phase for the GEV distribution and in the evaluation of the design events. Regarding statistical characteristics, the proposed method is much better than the existing one, as has been shown through distribution sampling experiments.
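The existing PWM estimation of the GEV that the paper compares against can be sketched with the standard complete-sample estimators of Hosking, Wallis and Wood; this is the classical baseline method, not the authors' proposed alternative:

```python
import numpy as np
from math import gamma, log

def gev_pwm(x):
    """Fit the GEV by probability weighted moments (classical estimators).
    Convention: F(x) = exp(-(1 - k*(x - xi)/alpha)**(1/k)); k = 0 is Gumbel.
    Returns (k, alpha, xi) = (shape, scale, location)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()                                            # b_r = E[X * F**r]
    b1 = np.mean((i - 1) / (n - 1) * x)
    b2 = np.mean((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x)
    c = (2 * b1 - b0) / (3 * b2 - b0) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2                         # rational approximation
    alpha = (2 * b1 - b0) * k / (gamma(1 + k) * (1 - 2 ** (-k)))
    xi = b0 + alpha * (gamma(1 + k) - 1) / k
    return k, alpha, xi

rng = np.random.default_rng(3)
sample = rng.gumbel(loc=0.0, scale=1.0, size=5000)           # GEV with k = 0
k_hat, alpha_hat, xi_hat = gev_pwm(sample)
print(k_hat, alpha_hat, xi_hat)
```

On a Gumbel test sample all three estimates should be close to (0, 1, 0), which is the kind of distribution sampling experiment the paper uses to compare methods.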
Classical Probability and Quantum Outcomes
Directory of Open Access Journals (Sweden)
James D. Malley
2014-05-01
There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.
Energy Technology Data Exchange (ETDEWEB)
Alkhazov, G.; Andronenko, M.; Dobrovolsky, A.; Gavrilov, G.; Khanzadeev, A.; Korolev, G.; Lobodenko, A.; Seliverstov, D.; Timofeev, N.; Vorobyov, A.; Yatsoura, V. [Petersburg Nuclear Physics Institute (PNPI), 188350 Gatchina (Russia); Egelhof, P.; Geissel, H.; Irnich, H.; Muenzenberg, G.; Nickel, F.; Schwab, W.; Suzuki, T. [Gesellschaft fuer Schwerionenforschung (GSI), 64291 Darmstadt (Germany); Mutterer, M.; Neumaier, S.; Theobald, J. [Institut fuer Kernphysik (IKP), TH-Darmstadt, 64289 Darmstadt (Germany)
1997-03-01
Differential cross sections for p-⁶He and p-⁸He elastic scattering have been measured in inverse kinematics at small momentum transfers up to |t| = 0.05 (GeV/c)² and projectile energies of about 700 MeV/nucleon. Nuclear matter densities deduced from the data are consistent with the concept that ⁶He and ⁸He nuclei have an α-like core and a significant neutron skin. The rms radii of the nuclear matter distributions were determined to be R_m(⁶He) = 2.30 ± 0.07 fm and R_m(⁸He) = 2.45 ± 0.07 fm. © 1997 The American Physical Society
Energy Technology Data Exchange (ETDEWEB)
Alkhazov, G.D. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Andronenko, M.N. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Dobrovolsky, A.V. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Egelhof, P. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Gavrilov, G.E. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Geissel, H. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Irnich, H. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Khanzadeev, A.V. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Korolev, G.A. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Lobodenko, A.A. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Muenzenberg, G. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Mutterer, M. [Technische Hochschule Darmstadt (Germany). Inst. fuer Kernphysik; Neumaier, S.R. [Technische Hochschule Darmstadt (Germany). Inst. fuer Kernphysik; Nickel, F. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Schwab, W. [Technische Hochschule Darmstadt (Germany). Inst. fuer Kernphysik; Seliverstov, D.M. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Suzuki, T. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Theobald, J.P. [Technische Hochschule Darmstadt (Germany). Inst. fuer Kernphysik; Timofeev, N.A. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Vorobyov, A.A. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation); Yatsoura, V.I. [St. Petersburg Inst. of Nuclear Physics, Gatchina (Russian Federation)
1996-11-01
Differential cross sections for p-⁶He and p-⁸He elastic scattering have been measured in inverse kinematics at small momentum transfers up to |t| = 0.05 (GeV/c)² and projectile energies of about 700 MeV/u. Nuclear matter densities deduced from the data are consistent with the concept that ⁶He and ⁸He nuclei have an α-like core and a significant neutron skin. The r.m.s. radii of the nuclear matter distributions were determined to be R_m(⁶He) = 2.30 ± 0.07 fm and R_m(⁸He) = 2.45 ± 0.07 fm. (orig.)
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, and the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Scatttering of High-energy Particles at a Collisionless Shock Front: Dependence on the Shock Angle
Gedalin, M.; Dröge, W.; Kartavykh, Y. Y.
2015-07-01
Many shock acceleration theories deal with gyrophase-averaged particle distributions that depend only on the energy and pitch angle of the particles. Diffusive shock acceleration includes shock crossing as a necessary component. As long as the shock width is much smaller than the mean free path of a particle, the crossing is governed by the macroscopic fields inside the transition layer. The dynamics of high-energy particles in these fields is non-adiabatic and gyrophase dependent. The magnetic moment is not conserved in a wide range of shock angles, nor is the condition of reflection determined by the magnetic bottle relation. Instead, for a given pitch angle and unknown gyrophase of an incident particle there is a finite probability of reflection. This probability varies between zero and unity over a wide range of pitch angles. In this work we investigate how the matching conditions at the shock front could be modified with the gyrophase dependence taken into account, e.g., in the form of the scattering probabilities.
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or "envariance", a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the "principle of indifference" due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
Estimating Long GRB Jet Opening Angles and Rest-Frame Energetics
Goldstein, Adam; Briggs, Michael S; Burns, Eric
2015-01-01
We present a method to estimate the jet opening angles of long duration Gamma-Ray Bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observati...
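The inversion underlying the method, recovering the jet half-opening angle from the collimation correction E_γ = (1 − cos θ_jet) · E_iso, can be sketched as follows. The Ghirlanda-type power-law calibration constants and the isotropic-equivalent energy used here are placeholder assumptions for illustration, not the paper's fitted values:

```python
from math import acos, degrees

def jet_opening_angle(e_gamma, e_iso):
    """Invert the collimation correction E_gamma = (1 - cos(theta_j)) * E_iso
    to recover the jet half-opening angle theta_j in radians."""
    return acos(1.0 - e_gamma / e_iso)

def e_gamma_from_ghirlanda(e_peak_keV, a=267.0, s=0.706, e0=4.3e50):
    """Illustrative inversion of a Ghirlanda-type relation
    E_peak = a * (E_gamma / e0)**s; a, s, e0 are placeholder calibration
    constants, not the values fitted in the paper."""
    return e0 * (e_peak_keV / a) ** (1.0 / s)

e_iso = 1.0e53                        # erg, assumed isotropic-equivalent energy
e_gamma = e_gamma_from_ghirlanda(500.0)   # from an assumed rest-frame peak energy
theta = jet_opening_angle(e_gamma, e_iso)
print(degrees(theta))
```

Propagating a redshift probability distribution through this inversion, rather than a single assumed E_iso, is what yields the jet-angle and energetics distributions for the GBM bursts without measured redshifts.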
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Quznetsov, G. A.
2003-01-01
Propositional logic is generalized to the field of real numbers. The logical analog of the Bernoulli independent trials scheme is constructed. A variant of nonstandard analysis is adapted for the definition of the logical function, which has all the properties of the classical probability function. The logical analog of the Law of Large Numbers is deduced from the properties of this function.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.