WorldWideScience

Sample records for angle probability distributions

  1. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    DEFF Research Database (Denmark)

    Helles, Glennie; Fonseca, Rasmus

    2009-01-01

    Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and helps reduce the search space significantly. However, random coil segments...... make up nearly 40% of proteins, and they do not have any apparent recurrent patterns, which complicates the overall prediction accuracy of protein structure prediction methods. Luckily, previous work has indicated that coil segments are in fact not completely random in structure and flanking residues do...... seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been...
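
    The record above does not include the network architecture itself. As a rough, hypothetical sketch of the general idea (predicting a probability distribution over binned dihedral angles from a window of flanking residues), the Python code below trains a plain softmax regression on one-hot residue features; the window size, number of angle bins, and the synthetic training data are assumptions made purely for illustration, not the authors' setup.

        # Minimal, hypothetical sketch (not the authors' network): predict a probability
        # distribution over binned (phi, psi) dihedral angles for a coil residue from a
        # window of flanking residues, using softmax regression on one-hot features.
        import numpy as np

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
        WINDOW = 5          # residues on either side of the coil residue (assumed)
        N_BINS = 36         # coarse bins over the Ramachandran plane (assumed)

        def one_hot_window(seq_window):
            """Encode a (2*WINDOW+1)-residue window as a flat one-hot vector."""
            x = np.zeros((len(seq_window), len(AMINO_ACIDS)))
            for i, aa in enumerate(seq_window):
                x[i, AMINO_ACIDS.index(aa)] = 1.0
            return x.ravel()

        def softmax(z):
            z = z - z.max(axis=-1, keepdims=True)
            e = np.exp(z)
            return e / e.sum(axis=-1, keepdims=True)

        # Synthetic stand-in data; a real run would use windows and binned angles
        # extracted from solved protein structures.
        rng = np.random.default_rng(0)
        n_samples = 2000
        n_features = (2 * WINDOW + 1) * len(AMINO_ACIDS)
        X = rng.integers(0, 2, size=(n_samples, n_features)).astype(float)
        y = rng.integers(0, N_BINS, size=n_samples)

        W = np.zeros((n_features, N_BINS))
        for _ in range(200):                          # gradient descent on cross-entropy
            P = softmax(X @ W)                        # one predicted distribution per residue
            W -= 0.5 * X.T @ (P - np.eye(N_BINS)[y]) / n_samples

        x_new = one_hot_window("GASLVKEACDT")         # an arbitrary 11-residue window
        print("predicted dihedral-bin distribution:", softmax(x_new @ W))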

  2. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  3. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  4. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one concerns the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.

  5. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

    The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the sunshine probability distribution, which is a consequence of a "sidescene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, depending on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  6. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
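
    For a toy alphabet the guessing strategy described above (guess words in decreasing order of probability) can be checked by brute force; the sketch below enumerates every word of a given length, sorts them by probability, and computes the expected number of guesses exactly. The alphabet, letter probabilities, and word length are made up, and the paper's approximations are needed precisely when this enumeration becomes infeasible.

        # Brute-force illustration of the guessing problem for a toy first-order
        # "language": expected number of guesses when words are guessed in
        # decreasing order of probability. Letter probabilities are made up.
        from itertools import product

        letters = {"a": 0.5, "b": 0.3, "c": 0.2}   # hypothetical letter probabilities
        word_length = 4

        word_probs = []
        for w in product(letters, repeat=word_length):
            p = 1.0
            for ch in w:
                p *= letters[ch]
            word_probs.append(p)

        word_probs.sort(reverse=True)               # guess the most probable words first
        expected_guesses = sum(rank * p for rank, p in enumerate(word_probs, start=1))
        print(f"{len(word_probs)} words, expected guesses = {expected_guesses:.3f}")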

  7. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
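
    The moment-matching with multivariate skew normal distributions is not reproduced here; as a point of reference, the Gillespie stochastic simulation algorithm that the authors compare against can be sketched for a simple, hypothetical dimerisation system as follows. Reaction rates and initial copy numbers are arbitrary.

        # Minimal Gillespie SSA for a dimerisation system 2A <-> A2 plus production and
        # degradation of A. This is only the reference simulation such moment methods are
        # typically compared against; rates and initial copy numbers are arbitrary.
        import random

        def gillespie(t_end=50.0, seed=1):
            random.seed(seed)
            A, A2 = 20, 0
            k_prod, k_deg, k_dim, k_undim = 5.0, 0.1, 0.01, 0.2
            t = 0.0
            while t < t_end:
                rates = [k_prod,                 # 0: * -> A
                         k_deg * A,              # 1: A -> *
                         k_dim * A * (A - 1),    # 2: A + A -> A2
                         k_undim * A2]           # 3: A2 -> A + A
                total = sum(rates)
                if total == 0:
                    break
                t += random.expovariate(total)   # time to the next reaction
                r = random.uniform(0, total)     # pick which reaction fires
                if r < rates[0]:
                    A += 1
                elif r < rates[0] + rates[1]:
                    A -= 1
                elif r < rates[0] + rates[1] + rates[2]:
                    A -= 2; A2 += 1
                else:
                    A += 2; A2 -= 1
            return A, A2

        samples = [gillespie(seed=s) for s in range(200)]
        print("mean copy numbers (A, A2):",
              sum(a for a, _ in samples) / len(samples),
              sum(d for _, d in samples) / len(samples))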

  8. Scaling of misorientation angle distributions

    DEFF Research Database (Denmark)

    Hughes, D.A.; Chrzan, D.C.; Liu, Q.

    1998-01-01

    The measurement of misorientation angle distributions following different amounts of deformation in cold-rolled aluminum and nickel and compressed stainless steel is reported. The scaling of the dislocation cell boundary misorientation angle distributions is studied. Surprisingly, the distributions...... for the small to large strain regimes for aluminum, 304L stainless steel, nickel, and copper (taken from the literature) appear to be identical. Hence the distributions may be "universal." These results have significant implications for the development of dislocation based deformation models. [S0031...

  9. ASYMPTOTIC QUANTIZATION OF PROBABILITY DISTRIBUTIONS

    Institute of Scientific and Technical Information of China (English)

    Klaus Pötzelberger

    2003-01-01

    We give a brief introduction to results on the asymptotics of quantization errors. The topics discussed include the quantization dimension, asymptotic distributions of sets of prototypes, asymptotically optimal quantizations, approximations and random quantizations.

  10. The Multivariate Gaussian Probability Distribution

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2005-01-01

    This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical...
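
    Two routine tasks a reference like this typically covers, drawing samples via a Cholesky factor and evaluating the log-density, are sketched below. The mean vector and covariance matrix are arbitrary examples, and the code is a generic illustration rather than material from the report.

        # Sampling from and evaluating a multivariate Gaussian N(mu, Sigma).
        # mu and Sigma below are arbitrary example values.
        import numpy as np

        mu = np.array([1.0, -2.0, 0.5])
        Sigma = np.array([[2.0, 0.3, 0.0],
                          [0.3, 1.0, 0.4],
                          [0.0, 0.4, 1.5]])

        L = np.linalg.cholesky(Sigma)              # Sigma = L @ L.T
        rng = np.random.default_rng(0)
        x = mu + L @ rng.standard_normal(3)        # one sample: x = mu + L z, z ~ N(0, I)

        def log_density(x, mu, Sigma):
            d = len(mu)
            diff = x - mu
            _, logdet = np.linalg.slogdet(Sigma)
            quad = diff @ np.linalg.solve(Sigma, diff)
            return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)

        print("sample:", x, " log pdf:", log_density(x, mu, Sigma))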

  11. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  12. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  13. Exact probability distribution functions for Parrondo's games

    Science.gov (United States)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  14. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  15. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain...

  16. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
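
    The CUMBIN program itself (written in C) is not part of the record; the underlying quantity, the cumulative binomial probability used for k-out-of-n system reliability, is easy to restate. The short generic reimplementation below is offered only as an illustration, not as the NASA program.

        # Cumulative binomial probability P(X >= k) for X ~ Binomial(n, p), i.e. the
        # reliability of a k-out-of-n system whose components each work independently
        # with probability p. Generic reimplementation, not the CUMBIN program itself.
        from math import comb

        def k_out_of_n_reliability(k, n, p):
            """Probability that at least k of n independent components succeed."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        # Example: a 2-out-of-3 system with component reliability 0.9.
        print(k_out_of_n_reliability(2, 3, 0.9))   # 0.972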

  17. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, …, p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
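
    The article's iid, normalization, and trigonometric methods are detailed in the paper itself. As a generic illustration of the goal, one standard way to produce a probability vector uniformly distributed on the simplex is to normalize independent exponential variates (a flat Dirichlet draw), as sketched below; this is not necessarily the author's own construction.

        # Generate a pseudo-random probability vector p = (p_1, ..., p_d) that is
        # uniformly distributed on the probability simplex: normalise d independent
        # exponential variates (equivalently, a flat Dirichlet(1, ..., 1) draw).
        # Generic construction, not necessarily the paper's specific methods.
        import numpy as np

        def random_probability_vector(d, rng):
            e = rng.exponential(scale=1.0, size=d)
            return e / e.sum()

        rng = np.random.default_rng(42)
        p = random_probability_vector(5, rng)
        print(p, p.sum())   # components are non-negative and sum to 1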

  18. Probability distributions with summary graph structure

    CERN Document Server

    Wermuth, Nanny

    2010-01-01

    A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class are multivariate regression chain graphs. They describe the independences of stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that then result after possible marginalising or conditioning, we use summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences which remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to de...

  19. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  20. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  1. Polarization Mode Dispersion Probability Distribution for Arbitrary Mode Coupling

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The probability distribution of the differential group delay for arbitrary mode coupling is simulated with the Monte Carlo method. Fitting the simulation results, we obtain the probability distribution function for arbitrary mode coupling.

  2. Some New Approaches to Multivariate Probability Distributions.

    Science.gov (United States)

    1986-12-01

    Forte, B. (1985). Mutual dependence of random variables and maximum discretized entropy, Ann. Prob., 13, 630-637. ... Billingsley, P. (1968) ... characterizations of distributions, such as the Marshall-Olkin bivariate distribution or Fréchet's multivariate distribution with continuous marginals or a ... problem mentioned in Remark 8. He has given in this context a uniqueness theorem in the bivariate case under certain assumptions. The following

  3. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.

  4. Probability distribution fitting of schedule overruns in construction projects

    OpenAIRE

    Love, P. E. D.; Sing, C.-P.; Wang, X.; Edwards, D. J.; Odeyinka, H.

    2013-01-01

    The probability of schedule overruns for construction and engineering projects can be ascertained using a ‘best fit’ probability distribution from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data; including the Kolmogorov–...

  5. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment.

  6. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    Full Text Available The article is devoted to nonparametric point and interval estimation of the characteristics of a probability distribution (the expectation, median, variance, standard deviation, and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function having the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogs of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and functions of them. Nonparametric asymptotic confidence intervals are obtained using a standard technique for deriving asymptotic relations in applied statistics. In the first step, this technique applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. In the second step, the limiting multivariate normal vector is transformed into the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical and statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Using methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions in situations where the normality hypothesis fails. The practical recommendation is: for the analysis of real data, one should use nonparametric confidence limits

  7. Stable Probability Distributions and their Domains of Attraction

    NARCIS (Netherlands)

    J.L. Geluk (Jaap); L.F.M. de Haan (Laurens)

    1997-01-01

    The theory of stable probability distributions and their domains of attraction is derived in a direct way (avoiding the usual route via infinitely divisible distributions) using Fourier transforms. Regularly varying functions play an important role in the exposition.

  8. Semi-stable distributions in free probability theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through free cumulant transform and develop the free semi-stability and domain of semi-stable attraction in free probability theory.

  9. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  10. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Full Text Available Random disturbance factors can lead to variation of the target acquisition point during long-distance flight. To acquire a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law considering the influences of random disturbances, the detection distance constraint, and the target acquisition probability evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a newly formulated biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure the tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.

  11. Probability distributions in risk management operations

    CERN Document Server

    Artikis, Constantinos

    2015-01-01

    This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...

  12. Some explicit expressions for the probability distribution of force magnitude

    Indian Academy of Sciences (India)

    Saralees Nadarajah

    2008-08-01

    Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the first time, I provide explicit expressions for the probability distribution of the force magnitude. Both two-dimensional and three-dimensional cases are considered.
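
    The closed-form expressions are given in the paper and are not reproduced here; the setting is nevertheless easy to probe numerically by drawing exponentially distributed components and inspecting the resulting magnitude, as in the Monte Carlo sketch below (the unit rate parameter is an arbitrary choice).

        # Monte Carlo look at the distribution of the contact-force magnitude when each
        # component is exponentially distributed (unit rate, chosen arbitrarily). This
        # only approximates numerically what the paper derives in closed form.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000

        for dim in (2, 3):
            components = rng.exponential(scale=1.0, size=(n, dim))
            magnitude = np.linalg.norm(components, axis=1)
            print(f"{dim}D: mean = {magnitude.mean():.3f}, "
                  f"95th percentile = {np.percentile(magnitude, 95):.3f}")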

  13. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  14. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  15. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined by fuzzy assertions from ambiguous experts. The problem formulation is presented and the two solution strategies are: the fuzzy transformation via a ranking function, and the stochastic transformation when the α-cut technique and linguistic hedges are used in the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  16. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something that is profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, which leads to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite some fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  17. Application-dependent Probability Distributions for Offshore Wind Speeds

    Science.gov (United States)

    Morgan, E. C.; Lackner, M.; Vogel, R. M.; Baise, L. G.

    2010-12-01

    The higher wind speeds of the offshore environment make it an attractive setting for future wind farms. With sparser field measurements, the theoretical probability distribution of short-term wind speeds becomes more important in estimating values such as average power output and fatigue load. While previous studies typically compare the accuracy of probability distributions using R2, we show that validation based on this metric is not consistent with validation based on engineering parameters of interest, namely turbine power output and extreme wind speed. Thus, in order to make the most accurate estimates possible, the probability distribution that an engineer picks to characterize wind speeds should depend on the design parameter of interest. We introduce the Kappa and Wakeby probability distribution functions to wind speed modeling, and show that these two distributions, along with the Biweibull distribution, fit wind speed samples better than the more widely accepted Weibull and Rayleigh distributions based on R2. Additionally, out of the 14 probability distributions we examine, the Kappa and Wakeby give the most accurate and least biased estimates of turbine power output. The fact that the 2-parameter Lognormal distribution estimates extreme wind speeds (i.e. fits the upper tail of wind speed distributions) with least error indicates that not one single distribution performs satisfactorily for all applications. Our use of a large dataset composed of 178 buoys (totaling ~72 million 10-minute wind speed observations) makes these findings highly significant, both in terms of large sample size and broad geographical distribution across various wind regimes. Figure: Boxplots of R2 from the fit of each of the 14 distributions to the 178 buoy wind speed samples. Distributions are ranked from left to right by ascending median R2, with the Biweibull having the closest median to 1.
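
    One of the design quantities mentioned above, mean turbine power output, can be estimated from any fitted wind-speed distribution by integrating a power curve against its density. The sketch below does this for a Weibull fit with a simplified, entirely hypothetical power curve (cut-in 3 m/s, rated from 12 to 25 m/s); the parameter values are illustrative and unrelated to the buoy dataset.

        # Estimate mean turbine power output by integrating a hypothetical power curve
        # against a fitted wind-speed density (a Weibull here, for concreteness).
        # Weibull parameters and power-curve breakpoints are illustrative only.
        import numpy as np

        k, lam = 2.0, 9.0            # assumed Weibull shape and scale (m/s)
        rated_power = 5.0            # assumed rated power, MW

        def weibull_pdf(v, k, lam):
            return (k / lam) * (v / lam) ** (k - 1) * np.exp(-(v / lam) ** k)

        def power_curve(v):
            """Toy curve: cut-in at 3 m/s, cubic ramp, rated between 12 and 25 m/s."""
            ramp = rated_power * ((v - 3.0) / 9.0) ** 3
            p = np.where((v >= 3.0) & (v < 12.0), ramp, 0.0)
            return np.where((v >= 12.0) & (v <= 25.0), rated_power, p)

        v = np.linspace(0.0, 30.0, 3001)
        dv = v[1] - v[0]
        mean_power = np.sum(power_curve(v) * weibull_pdf(v, k, lam)) * dv
        print(f"estimated mean power output: {mean_power:.2f} MW")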

  18. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Ginestra Bianconi

    2008-06-01

    The structural entropy is the entropy of the ensemble of uncorrelated networks with given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (or half-edges) randomly through the nodes of the network while keeping the structural entropy fixed. This degree distribution is found to decay as a Poisson distribution when the entropy is maximized and to have a power-law tail with an exponent approaching 2 when the entropy is minimized.

  19. PROBABILITY DISTRIBUTION FUNCTION OF NEAR-WALL TURBULENT VELOCITY FLUCTUATIONS

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    By large eddy simulation (LES), turbulent databases of channel flows at different Reynolds numbers were established. Then, the probability distribution functions of the streamwise and wall-normal velocity fluctuations were obtained and compared with the corresponding normal distributions. By hypothesis test, the deviation from the normal distribution was analyzed quantitatively. The skewness and flatness factors were also calculated, and the variations of these two factors in the viscous sublayer, buffer layer and log-law layer were discussed. Also illustrated are the relations between the probability distribution functions and the burst events (sweep of high-speed fluids and ejection of low-speed fluids) in the viscous sublayer, buffer layer and log-law layer. Finally the variations of the probability distribution functions with Reynolds number were examined.

  20. Generating Probability Distributions using Multivalued Stochastic Relay Circuits

    CERN Document Server

    Lee, David

    2011-01-01

    The problem of random number generation dates back to von Neumann's work in 1951. Since then, many algorithms have been developed for generating unbiased bits from complex correlated sources as well as for generating arbitrary distributions from unbiased bits. An equally interesting, but less studied aspect is the structural component of random number generation as opposed to the algorithmic aspect. That is, given a network structure imposed by nature or physical devices, how can we build networks that generate arbitrary probability distributions in an optimal way? In this paper, we study the generation of arbitrary probability distributions in multivalued relay circuits, a generalization in which relays can take on any of N states and the logical 'and' and 'or' are replaced with 'min' and 'max' respectively. Previous work was done on two-state relays. We generalize these results, describing a duality property and networks that generate arbitrary rational probability distributions. We prove that these network...

  1. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
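
    Working directly with a truncated exponential is straightforward; the sketch below draws slip values from an exponential truncated at a maximum slip using inverse-CDF sampling. The rate and truncation point are placeholders, not the values fitted to the SRCMOD models.

        # Inverse-CDF sampling from an exponential distribution truncated at a maximum
        # slip value. The rate and truncation point are placeholders, not the fitted
        # values from the SRCMOD analysis.
        import numpy as np

        def truncated_exponential_sample(rate, max_slip, size, rng):
            """Sample from f(x) proportional to rate*exp(-rate*x) on [0, max_slip]."""
            u = rng.uniform(size=size)
            return -np.log(1.0 - u * (1.0 - np.exp(-rate * max_slip))) / rate

        rng = np.random.default_rng(0)
        slips = truncated_exponential_sample(rate=0.5, max_slip=10.0, size=100_000, rng=rng)
        print(f"mean slip = {slips.mean():.2f} m, max sampled slip = {slips.max():.2f} m")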

  2. NORMALLY DISTRIBUTED PROBABILITY MEASURE ON THE METRIC SPACE OF NORMS

    Institute of Scientific and Technical Information of China (English)

    Á.G. HORVÁTH

    2013-01-01

    In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, first, we introduce the notion of thinness of a body. Then we show the existence of a measure with the property that its pushforward by the thinness function is a probability measure of truncated normal distribution. Finally, we improve this method to find a measure satisfying some important properties in geometric measure theory.

  3. Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review

    Science.gov (United States)

    2007-01-01

    ...where reasonable, to counteract known biases in elicitation. For the triangle distribution, the probability is set to zero outside the endpoints, while between the endpoints the density rises linearly from the lower value to the most-likely value... Wheeler, T. A., S. C. Hora, W. R. Cramond, and S. D. Unwin, Analysis of Core Damage Frequency from Internal Events: Expert Judgment Elicitation

  4. Probability distributions for Poisson processes with pile-up

    CERN Document Server

    Sevilla, Diego J R

    2013-01-01

    In this paper, two parametric probability distributions capable of describing the statistics of X-ray photon detection by a CCD are presented. They are formulated from simple models that account for the pile-up phenomenon, in which two or more photons are counted as one. These models are based on the Poisson process, but they have an extra parameter which includes all the detailed mechanisms of the pile-up process and must be fitted to the data statistics simultaneously with the rate parameter. The new probability distributions, one for the number of counts per time bin (Poisson-like) and the other for waiting times (exponential-like), are tested by fitting them to the statistics of real data and against each other through numerical simulations, and their results are analyzed and compared. The probability distributions presented here can be used as background statistical models to derive likelihood functions for statistical methods in signal analysis.

  5. Probability distribution functions in the finite density lattice QCD

    CERN Document Server

    Ejiri, S; Aoki, S; Kanaya, K; Saito, H; Hatsuda, T; Ohno, H; Umeda, T

    2012-01-01

    We study the phase structure of QCD at high temperature and density by lattice QCD simulations adopting a histogram method. We try to solve the problems which arise in the numerical study of the finite density QCD, focusing on the probability distribution function (histogram). As a first step, we investigate the quark mass dependence and the chemical potential dependence of the probability distribution function as a function of the Polyakov loop when all quark masses are sufficiently large, and study the properties of the distribution function. The effect from the complex phase of the quark determinant is estimated explicitly. The shape of the distribution function changes with the quark mass and the chemical potential. Through the shape of the distribution, the critical surface which separates the first order transition and crossover regions in the heavy quark region is determined for the 2+1-flavor case.

  6. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
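
    Of the fitting techniques listed (probability plotting, method of moments, maximum likelihood, nonlinear least squares), the method of moments is the quickest to illustrate: the sketch below fits a lognormal to positive data by matching the sample mean and variance to the lognormal's analytic moments. The data are synthetic, not the Yucca Mountain values used in the report.

        # Method-of-moments fit of a lognormal distribution to positive data: match the
        # sample mean and variance to the lognormal's analytic moments. Synthetic data,
        # not from the performance-assessment study.
        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.lognormal(mean=0.5, sigma=0.8, size=5000)   # synthetic "measurements"

        m, s2 = data.mean(), data.var()
        sigma2 = np.log(1.0 + s2 / m**2)       # from Var[X] = (e^{sigma^2} - 1) e^{2 mu + sigma^2}
        mu = np.log(m) - 0.5 * sigma2          # from E[X] = e^{mu + sigma^2 / 2}

        print(f"fitted mu = {mu:.3f}, sigma = {np.sqrt(sigma2):.3f}")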

  7. Contact angle distribution of particles at fluid interfaces.

    Science.gov (United States)

    Snoeyink, Craig; Barman, Sourav; Christopher, Gordon F

    2015-01-27

    Recent measurements have implied a distribution of interfacially adsorbed particles' contact angles; however, it has been impossible to measure statistically significant numbers of these contact angles noninvasively in situ. Using a new microscopy method that allows nanometer-scale resolution of particles' 3D positions on an interface, we have measured the contact angles for thousands of latex particles at an oil/water interface. Furthermore, these measurements are dynamic, allowing the observation of the particle contact angle with high temporal resolution, resulting in hundreds of thousands of individual contact angle measurements. The contact angle has been found to fit a normal distribution with a standard deviation of 19.3°, which is much larger than previously recorded. Furthermore, the technique used allows the effects of measurement error, constrained interfacial diffusion, and particle property variation on the contact angle distribution to be individually evaluated. Because of the ability to measure the contact angle noninvasively, the results provide previously unobtainable, unique data on the dynamics and distribution of the adsorbed particles' contact angles.

  8. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
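
    The report's remark that the uniform generator can feed other distributions is the classic inverse-transform idea; the generic sketch below (not the USGS Fortran routines) applies it to the Weibull, one of the distributions the report provides.

        # Inverse-transform sampling: turn uniform random numbers into draws from another
        # distribution via its inverse CDF, here a Weibull(shape k, scale lam).
        # Generic illustration, not the Fortran routines described in the report.
        import random
        import math

        def weibull_sample(k, lam):
            u = random.random()                       # uniform on [0, 1)
            return lam * (-math.log(1.0 - u)) ** (1.0 / k)

        random.seed(0)
        draws = [weibull_sample(k=1.5, lam=2.0) for _ in range(100_000)]
        print("sample mean:  ", sum(draws) / len(draws))
        print("analytic mean:", 2.0 * math.gamma(1 + 1 / 1.5))   # lam * Gamma(1 + 1/k)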

  9. Augmenting momentum resolution with well tuned probability distributions

    CERN Document Server

    Landi, Gregorio

    2016-01-01

    The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimations, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard fits (least squares) with two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase of the magnetic field and signal-to-noise ratio that makes the standard fit reconstructions overlap with those from our best distributions. The data and the simulations are tuned on the tracker of a running experiment and its double-sided microstrip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor of 1.5 for the least squares based on the eta-a...

  10. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data, which provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitted distributions to represent the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.
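
    The parameter-estimation step mentioned above, the method of L-moments, starts from the sample L-moments and the L-skewness/L-kurtosis ratios used in the L-moment diagram. The sketch below computes them with Hosking's unbiased probability-weighted-moment estimators on synthetic block maxima; the real study uses weekly and monthly maxima of Bursa Malaysia returns.

        # Sample L-moments l1..l4 and ratios t3, t4 from a sample of block maxima, using
        # Hosking's unbiased probability-weighted moments b_r. Synthetic stand-in data.
        import numpy as np

        def sample_l_moments(x):
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            b = np.zeros(4)
            b[0] = x.mean()
            i = np.arange(1, n + 1)                 # ranks of the sorted sample
            for r in (1, 2, 3):
                w = np.ones(n)
                for j in range(1, r + 1):
                    w *= (i - j) / (n - j)
                b[r] = np.mean(w * x)
            l1 = b[0]
            l2 = 2 * b[1] - b[0]
            l3 = 6 * b[2] - 6 * b[1] + b[0]
            l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
            return l1, l2, l3 / l2, l4 / l2         # mean, L-scale, L-skewness, L-kurtosis

        rng = np.random.default_rng(3)
        block_maxima = rng.gumbel(loc=0.02, scale=0.01, size=600)   # stand-in weekly maxima
        print(sample_l_moments(block_maxima))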

  11. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and mul...

  12. Probability Measure of Navigation pattern predition using Poisson Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Dr. V. Valli Mayil

    2012-06-01

    Full Text Available The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web usage mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data collected by web servers contain information about every click of a user on the web documents of the site. The useful log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test web application performance. The analysis of large volumes of clickstream data demands the employment of data mining methods. Conducting data mining on web server logs involves the determination of frequently occurring access sequences. The Poisson distribution gives the frequency probability of specific events when the average probability of a single occurrence is known. It is a discrete distribution, used in this paper to find the probability that a particular page is visited by the user a given number of times.
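
    The Poisson model described above reduces to a one-line calculation once the average number of visits to a page per session has been estimated from the server log; a minimal sketch, with a made-up average rate:

        # Poisson probability that a page is visited exactly k times, given the average
        # visit rate estimated from the web server log (the rate used here is made up).
        import math

        def poisson_pmf(k, rate):
            return rate**k * math.exp(-rate) / math.factorial(k)

        avg_visits_per_session = 2.4       # hypothetical value estimated from the log
        for k in range(6):
            print(k, round(poisson_pmf(k, avg_visits_per_session), 4))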

  13. On Probability Distributions for Trees: Representations, Inference and Learning

    CERN Document Server

    Denis, François; Gilleron, Rémi; Tommasi, Marc; Gilbert, Édouard

    2008-01-01

    We study probability distributions over free algebras of trees. Probability distributions can be seen as particular (formal power) tree series [Berstel et al 82, Esik et al 03], i.e. mappings from trees to a semiring K . A widely studied class of tree series is the class of rational (or recognizable) tree series which can be defined either in an algebraic way or by means of multiplicity tree automata. We argue that the algebraic representation is very convenient to model probability distributions over a free algebra of trees. First, as in the string case, the algebraic representation allows to design learning algorithms for the whole class of probability distributions defined by rational tree series. Note that learning algorithms for rational tree series correspond to learning algorithms for weighted tree automata where both the structure and the weights are learned. Second, the algebraic representation can be easily extended to deal with unranked trees (like XML trees where a symbol may have an unbounded num...

  14. Probability distributions of continuous measurement results for conditioned quantum evolution

    Science.gov (United States)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by the postselection in the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in the average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states demonstrating anomalously big average outputs and sudden jump in time-integrated output. We present and discuss the numerical evaluation of the probability distribution aiming at extending the analytical results and describing a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  15. Convolutions Induced Discrete Probability Distributions and a New Fibonacci Constant

    CERN Document Server

    Rajan, Arulalan; Rao, Vittal; Rao, Ashok

    2010-01-01

    This paper proposes another constant that can be associated with the Fibonacci sequence. In this work, we look at the probability distributions generated by the linear convolution of the Fibonacci sequence with itself, and by the linear convolution of the symmetrized Fibonacci sequence with itself. We observe that for a distribution generated by the linear convolution of the standard Fibonacci sequence with itself, the variance converges to 8.4721359... . Also, for a distribution generated by the linear convolution of symmetrized Fibonacci sequences, the variance converges in an average sense to 17.1942..., which is approximately twice what we get with the common Fibonacci sequence.

  16. Probability Distribution Function of Passive Scalars in Shell Models

    Institute of Scientific and Technical Information of China (English)

    LIU Chun-Ping; ZHANG Xiao-Qiang; LIU Yu-Rong; WANG Guang-Rui; HE Da-Ren; CHEN Shi-Gang; ZHU Lu-Jin

    2008-01-01

    A shell-model version of the passive scalar problem is introduced, which is inspired by the model of K. Ohkitani and M. Yakhot [K. Ohkitani and M. Yakhot, Phys. Rev. Lett. 60 (1988) 983; K. Ohkitani and M. Yakhot, Prog. Theor. Phys. 81 (1988) 329]. As in the original problem, the prescribed random velocity field is Gaussian and δ-correlated in time. The deterministic differential equations are regarded as a nonlinear Langevin equation. Then, the Fokker-Planck equations for the PDF of the passive scalars are obtained and solved numerically. In the energy input range (n < 5, where n is the shell number), the probability distribution function (PDF) of the passive scalars is close to the Gaussian distribution. In the inertial range (5 < n < 16) and the dissipation range (n ≥ 17), the PDF of the passive scalars shows obvious intermittency, and the scaling power of the passive scalar is anomalous. The results of the numerical simulations are compared with experimental measurements.

  17. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness...... infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time....... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay...

  18. Distribution probability of large-scale landslides in central Nepal

    Science.gov (United States)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area. For the new area, the area under the receiver operating characteristic curve of the landslide distribution probability is 0.699, and a distribution probability value could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.

  19. Unitary equilibrations: probability distribution of the Loschmidt echo

    CERN Document Server

    Venuti, Lorenzo Campos

    2009-01-01

    Closed quantum systems evolve unitarily and therefore cannot converge in a strong sense to an equilibrium state starting out from a generic pure state. Nevertheless for large system size one observes temporal typicality. Namely, for the overwhelming majority of the time instants, the statistics of observables is practically indistinguishable from an effective equilibrium one. In this paper we consider the Loschmidt echo (LE) to study this sort of unitary equilibration after a quench. We draw several conclusions on general grounds and on the basis of an exactly-solvable example of a quasi-free system. In particular we focus on the whole probability distribution of observing a given value of the LE after waiting a long time. Depending on the interplay between the initial state and the quench Hamiltonian, we find different regimes reflecting different equilibration dynamics. When the perturbation is small and the system is away from criticality the probability distribution is Gaussian. However close to criticali...

  20. Fibonacci Sequence, Recurrence Relations, Discrete Probability Distributions and Linear Convolution

    CERN Document Server

    Rajan, Arulalan; Rao, Ashok; Jamadagni, H S

    2012-01-01

    The classical Fibonacci sequence is known to exhibit many fascinating properties. In this paper, we explore the Fibonacci sequence and integer sequences generated by second order linear recurrence relations with positive integer coefficients from the point of view of the probability distributions that they induce. We obtain generalizations of some of the known limiting properties of these probability distributions and present certain optimal properties of the classical Fibonacci sequence in this context. In addition, we also look at the self linear convolution of linear recurrence relations with positive integer coefficients. The analysis of the self linear convolution focuses on locating the maximum in the resulting sequence. This analysis also highlights the influence that the largest positive real root of the "characteristic equation" of the linear recurrence relations with positive integer coefficients has on the location of the maximum. In particular, when the largest positive real root is 2, the locatio...

  1. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  2. Testing for the maximum cell probabilities in multinomial distributions

    Institute of Scientific and Technical Information of China (English)

    XIONG Shifeng; LI Guoying

    2005-01-01

    This paper investigates one-sided hypothesis testing for p1, the largest cell probability of a multinomial distribution. A small-sample test of Ethier (1982) is extended to the general case. Based on an estimator of p1, a class of large-sample tests is proposed. The asymptotic power of the above tests under local alternatives is derived. An example is presented at the end of this paper.

  3. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  4. Steady-state distributions of probability fluxes on complex networks

    Science.gov (United States)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of the Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. The additional transition, called hereafter a gate, powered by the external constant force breaks a detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to as well as far from the equilibrium state. Also, the other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate and a change in the network size studied by means of computer simulations are widely discussed in terms of the rigorous theoretical predictions.

  5. The Probability Distribution Model of Wind Speed over East Malaysia

    Directory of Open Access Journals (Sweden)

    Nurulkamal Masseran

    2013-07-01

    Full Text Available Many studies have found that wind speed is the most significant parameter of wind power. Thus, an accurate determination of the probability distribution of wind speed is an important step before estimating the wind energy potential over a particular region. Utilizing an accurate distribution will minimize the uncertainty in wind resource estimates and improve the site assessment phase of planning. In general, different regions have different wind regimes, so it is reasonable to expect that different wind distributions will be found for different regions. On this basis, nine different statistical distributions have been fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia for the period from 2000 to 2009. The values of the Kolmogorov-Smirnov statistic, Akaike's Information Criterion, Bayesian Information Criterion and the R2 correlation coefficient were compared across the distributions to determine the best fit for describing the observed data. A good fit for most of the stations in East Malaysia was found using the Gamma and Burr distributions, though there was no clear pattern observed across all regions in East Malaysia. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.
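
    A brief sketch of this kind of distribution comparison, assuming SciPy's gamma, Weibull and Burr families as candidates and synthetic gamma-distributed values standing in for one station's hourly wind speeds; the fixed zero location and the simplified AIC bookkeeping are choices made here for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      wind = rng.gamma(shape=2.0, scale=1.5, size=5000)    # stand-in for hourly wind speeds (m/s)

      candidates = {"gamma": stats.gamma, "weibull": stats.weibull_min, "burr": stats.burr}
      for name, dist in candidates.items():
          params = dist.fit(wind, floc=0)                  # location fixed at zero for speed data
          ks = stats.kstest(wind, dist.name, args=params).statistic
          loglik = np.sum(dist.logpdf(wind, *params))
          aic = 2 * (len(params) - 1) - 2 * loglik         # loc is fixed, so not counted as free
          print(f"{name:8s}  K-S = {ks:.4f}  AIC = {aic:.1f}")

    The same loop extends directly to the nine families considered in the study; the smallest K-S and AIC values indicate the preferred model for a station.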

  6. Research on probability distribution of port cargo throughput

    Institute of Scientific and Technical Information of China (English)

    SUN Liang; TAN De-rong

    2008-01-01

    In order to more accurately examine developing trends in gross cargo throughput, we have modeled the probability distribution of cargo throughput. Gross cargo throughput is determined by the time spent by cargo ships in the port and the operating efficiency of handling equipment. Gross cargo throughput is the sum of all compound variables determining each aspect of cargo throughput for every cargo ship arriving at the port. Probability distribution was determined using the Wald equation. The results show that the variability of gross cargo throughput primarily depends on the different times required by different cargo ships arriving at the port. This model overcomes the shortcoming of previous models: inability to accurately determine the probability of a specific value of future gross cargo throughput. Our proposed model of cargo throughput depends on the relationship between time required by a cargo ship arriving at the port and the operational capacity of handling equipment at the port. At the same time, key factors affecting gross cargo throughput are analyzed. In order to test the efficiency of the model, the cargo volume of a port in Shandong Province was used as an example. In the case study the actual results matched our theoretical analysis.

  7. Methods for fitting a parametric probability distribution to most probable number data.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2012-07-01

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganisms per milliliter or whether the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
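
    As a sketch of the quantitation step that produces MPN data in the first place, the following assumes the usual Poisson dilution model in which a tube inoculated with volume v tests positive with probability 1 - exp(-c v) for concentration c; the volumes, tube counts and positive counts are hypothetical, and this is the building block rather than either of the two fitting methods proposed in the study.

      import numpy as np
      from scipy.optimize import minimize_scalar

      volumes   = np.array([10.0, 1.0, 0.1])   # mL of sample per tube at each dilution
      n_tubes   = np.array([3, 3, 3])          # tubes inoculated at each dilution
      positives = np.array([3, 2, 0])          # tubes that tested positive

      def neg_log_lik(log_c):
          """Negative log-likelihood of the log-concentration given the positive-tube counts."""
          c = np.exp(log_c)
          p_pos = np.clip(1.0 - np.exp(-c * volumes), 1e-12, 1.0 - 1e-12)
          return -np.sum(positives * np.log(p_pos) + (n_tubes - positives) * np.log(1.0 - p_pos))

      res = minimize_scalar(neg_log_lik, bounds=(-10.0, 10.0), method="bounded")
      print(f"MPN estimate: {np.exp(res.x):.3f} organisms/mL")

    Repeating this across samples yields the real-valued MPN estimates that the two fitting methods described above then summarize with a parametric distribution.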

  8. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.

  9. Net-proton probability distribution in heavy ion collisions

    CERN Document Server

    Braun-Munzinger, P; Karsch, F; Redlich, K; Skokov, V

    2011-01-01

    We compute net-proton probability distributions in heavy ion collisions within the hadron resonance gas model. The model results are compared with data taken by the STAR Collaboration in Au-Au collisions at sqrt(s_{NN})= 200 GeV for different centralities. We show that in peripheral Au-Au collisions the measured distributions, and the resulting first four moments of net-proton fluctuations, are consistent with results obtained from the hadron resonance gas model. However, data taken in central Au-Au collisions differ from the predictions of the model. The observed deviations can not be attributed to uncertainties in model parameters. We discuss possible interpretations of the observed deviations.

  10. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides triggered by rain, commonly known as "soil-slip", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in the suction when a humid front enters, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdf for mean intensity and duration of the storms. The Philip infiltration model

  11. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  12. Probability Distribution and Projected Trends of Daily Precipitation in China

    Institute of Scientific and Technical Information of China (English)

    CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER

    2013-01-01

    Based on observed daily precipitation data of 540 stations and 3,839 gridded data from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the simulation ability of CCLM on daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that except for the western Qinghai-Tibetan Plateau and South China, distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increases in maximum daily rainfall and in the longest non-precipitation period during the flood season in the aforementioned regions also indicate increasing trends of droughts and floods in the next 40 years.

  13. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    Science.gov (United States)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly

  14. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Example sentences] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps", it expresses a strong likelihood, usually a positive inference or judgment made from the present situation.

  15. Non-Gaussian probability distributions of solar wind fluctuations

    Directory of Open Access Journals (Sweden)

    E. Marsch

    Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) - x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lag τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scale τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.

  16. Some Useful Distributions and Probabilities for Cellular Networks

    CERN Document Server

    Yu, Seung Min

    2011-01-01

    The cellular network is one of the most useful networks for wireless communications and now universally used. There have been a lot of analytic results about the performance of the mobile user at a specific location such as the cell center or edge. On the other hand, there have been few analytic results about the performance of the mobile user at an arbitrary location. Moreover, to the best of our knowledge, there is no analytic result on the performance of the mobile user at an arbitrary location considering the mobile user density. In this paper, we use the stochastic geometry approach and derive useful distributions and probabilities for cellular networks. Using those, we analyze the performance of the mobile user, e.g., outage probability at an arbitrary location considering the mobile user density. Under some assumptions, those can be expressed by closed form formulas. Our analytic results will provide a fundamental framework for the performance analysis of cellular networks, which will significantly red...

  17. Characterizing the Lyman-α forest flux probability distribution function using Legendre polynomials

    CERN Document Server

    Cieplak, Agnieszka M

    2016-01-01

    The Lyman-$\\alpha$ forest is a highly non-linear field with a lot of information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polyonomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, $n$-th coefficient can be expressed as a linear combination of the first $n$ moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities.

  18. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    Science.gov (United States)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures are used as a non-verbal medium for humans to communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, which is based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified by using a fuzzy technique. Experimental results show that the use of the features extracted from arm trajectories works effectively for the recognition of dynamic human gestures and gives good performance in classifying various gesture patterns.

  19. Seismic pulse propagation with constant Q and stable probability distributions

    Directory of Open Access Journals (Sweden)

    M. Tomirotti

    1997-06-01

    Full Text Available The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type) in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with an index of stability determined by the order of the fractional time derivative in the evolution equation.

  20. A probability distribution approach to synthetic turbulence time series

    Science.gov (United States)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an acceptance-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
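
    The generation loop might look roughly like the sketch below, which simplifies the study's scheme: it conditions on a single previous point rather than three, draws each new value directly from the estimated conditional distribution instead of using rejection sampling, and replaces the hot-wire record with an AR(1) surrogate signal.

      import numpy as np

      rng = np.random.default_rng(2)
      u = np.zeros(200_000)                      # AR(1) surrogate for a measured velocity record
      for t in range(1, u.size):
          u[t] = 0.95 * u[t - 1] + rng.normal(scale=0.3)

      # Empirical transition PDF P(u_{t+1} | u_t) on a common grid.
      n_bins = 60
      edges = np.linspace(u.min(), u.max(), n_bins + 1)
      i_now = np.clip(np.digitize(u[:-1], edges) - 1, 0, n_bins - 1)
      i_next = np.clip(np.digitize(u[1:], edges) - 1, 0, n_bins - 1)
      counts = np.zeros((n_bins, n_bins))
      np.add.at(counts, (i_now, i_next), 1.0)
      row_sums = counts.sum(axis=1, keepdims=True)
      cond_pdf = np.where(row_sums > 0, counts / np.maximum(row_sums, 1.0), 1.0 / n_bins)

      # Generate a synthetic record by repeatedly sampling the conditional distribution.
      centers = 0.5 * (edges[:-1] + edges[1:])
      state, synthetic = n_bins // 2, []
      for _ in range(10_000):
          state = rng.choice(n_bins, p=cond_pdf[state])
          synthetic.append(centers[state])
      print("synthetic mean/std:", np.mean(synthetic), np.std(synthetic))

    Conditioning on two previous points instead of one reproduces more of the temporal structure, at the cost of a much larger table.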

  1. Probability distributions for one component equations with multiplicative noise

    CERN Document Server

    Deutsch, J M

    1993-01-01

    Abstract: Systems described by equations involving both multiplicative and additive noise are common in nature. Examples include convection of a passive scalar field, polymers in turbulent flow, and noise in dye lasers. In this paper the one component version of this problem is studied. The steady state probability distribution is classified into two different types of behavior. One class has power law tails and the other is of the form of an exponential to a power law. The value of the power law exponent is determined analytically for models having colored Gaussian noise. It is found to only depend on the power spectrum of the noise at zero frequency. When non-Gaussian noise is considered it is shown that stretched exponential tails are possible. An intuitive understanding of the results is found and makes use of the Lyapunov exponents for these systems.

  2. Seismic pulse propagation with constant Q and stable probability distributions

    CERN Document Server

    Mainardi, Francesco

    2010-01-01

    The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type) in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with index of stability determined by the order of the fractional time derivative in the evolution equation.

  3. Cosmological constraints from the convergence 1-point probability distribution

    CERN Document Server

    Patton, Kenneth; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J; Suchyta, Eric

    2016-01-01

    We examine the cosmological information available from the 1-point probability distribution (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the $\\Omega_m$-$\\sigma_8$ plane from the convergence PDF with $188\\ arcmin^2$ pixels compared to the cosmic shear power spectrum with an equivalent number of modes ($\\ell < 886$). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of $2-3$, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  4. Probability distribution function for inclinations of merging compact binaries detected by gravitational wave interferometers

    CERN Document Server

    Seto, Naoki

    2014-01-01

    We analytically discuss the probability distribution function (PDF) for inclinations of merging compact binaries whose gravitational waves are coherently detected by a network of ground based interferometers. The PDF would be useful for studying prospects of (1) simultaneously detecting electromagnetic signals (such as gamma-ray bursts) associated with binary mergers and (2) statistically constraining the related theoretical models from the actual observational data of multi-messenger astronomy. Our approach is similar to Schutz (2011), but we explicitly include the dependence on the polarization angles of the binaries, based on the concise formulation given in Cutler and Flanagan (1994). We find that the overall profiles of the PDFs are similar for any networks composed of the second generation detectors (Advanced-LIGO, Advanced-Virgo, KAGRA, LIGO-India). For example, 5.1% of detected binaries would have an inclination angle of less than 10 degrees, with at most 0.1% differences between the potential networks. A perturb...

  5. Insights from probability distribution functions of intensity maps

    CERN Document Server

    Breysse, Patrick C; Behroozi, Peter S; Dai, Liang; Kamionkowski, Marc

    2016-01-01

    In the next few years, intensity-mapping surveys that target lines such as CO, Ly$\\alpha$, and CII stand to provide powerful probes of high-redshift astrophysics. However, these line emissions are highly non-Gaussian, and so the typical power-spectrum methods used to study these maps will leave out a significant amount of information. We propose a new statistic, the probability distribution of voxel intensities, which can access this extra information. Using a model of a CO intensity map at $z\\sim3$ as an example, we demonstrate that this voxel intensity distribution (VID) provides substantial constraining power beyond what is obtainable from the power spectrum alone. We find that a future survey similar to the planned COMAP Full experiment could constrain the CO luminosity function to order $\\sim10\\%$. We also explore the effects of contamination from continuum emission, interloper lines, and gravitational lensing on our constraints and find that the VID statistic retains significant constraining power even ...

  6. Simulations of the Hadamard Variance: Probability Distributions and Confidence Intervals.

    Science.gov (United States)

    Ashby, Neil; Patla, Bijunath

    2016-04-01

    Power-law noise in clocks and oscillators can be simulated by Fourier transforming a modified spectrum of white phase noise. This approach has been applied successfully to simulation of the Allan variance and the modified Allan variance in both overlapping and nonoverlapping forms. When significant frequency drift is present in an oscillator, at large sampling times the Allan variance overestimates the intrinsic noise, while the Hadamard variance is insensitive to frequency drift. The simulation method is extended in this paper to predict the Hadamard variance for the common types of power-law noise. Symmetric real matrices are introduced whose traces (the sums of their eigenvalues) are equal to the Hadamard variances, in overlapping or nonoverlapping forms, as well as for the corresponding forms of the modified Hadamard variance. We show that the standard relations between spectral densities and Hadamard variance are obtained with this method. The matrix eigenvalues determine probability distributions for observing a variance at an arbitrary value of the sampling interval τ, and hence for estimating confidence in the measurements.
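
    A rough sketch of the simulation idea, assuming power-law noise is produced by spectrally shaping white noise with an FFT and that the non-overlapping Hadamard variance is estimated from disjoint triples of averaged fractional frequencies; the spectral exponent, record length and sampling interval are arbitrary choices rather than values from the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      N, tau0 = 2 ** 16, 1.0
      alpha = -2.0                                  # assumed exponent of the phase-noise spectrum, S_x(f) ~ f^alpha

      # Power-law noise via spectral shaping of white phase noise.
      spectrum = np.fft.rfft(rng.normal(size=N))
      freqs = np.fft.rfftfreq(N, d=tau0)
      shape = np.ones_like(freqs)
      shape[1:] = freqs[1:] ** (alpha / 2.0)
      phase = np.fft.irfft(spectrum * shape, n=N)
      y = np.diff(phase) / tau0                     # fractional frequency deviations

      def hadamard_variance(y, m, tau0=1.0):
          """Non-overlapping Hadamard variance at averaging time m * tau0."""
          ybar = y[: (y.size // m) * m].reshape(-1, m).mean(axis=1)
          d2 = ybar[2:] - 2.0 * ybar[1:-1] + ybar[:-2]
          return np.mean(d2[::3] ** 2) / 6.0        # stride 3 keeps the triples disjoint

      for m in (1, 4, 16, 64):
          print(f"tau = {m * tau0:5.0f} s   Hsigma^2_y = {hadamard_variance(y, m, tau0):.3e}")

    Because the second difference removes a linear frequency drift, adding a drift term to y leaves these estimates essentially unchanged, which is the insensitivity exploited above.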

  7. Performance Probability Distributions for Sediment Control Best Management Practices

    Science.gov (United States)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly-used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site, performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from more than 85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, with less-than-optimal installations, and under no-treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with

  8. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar

    Directory of Open Access Journals (Sweden)

    Teng Long

    2016-09-01

    Full Text Available Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem, since the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fusion result of each radar's estimation is fed to the extended Kalman filter (EKF) to finish the first filtering. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, thereby achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy improves dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.

  9. Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar

    Science.gov (United States)

    Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le

    2016-01-01

    Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates according to the spatial Nyquist sampling theorem, since the large sparse array is undersampled. Consequently, the state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. Firstly, the target motion model and radar measurement model are built. Secondly, the fusion result of each radar's estimation is fed to the extended Kalman filter (EKF) to finish the first filtering. Thirdly, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, thereby achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy improves dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method. PMID:27618058

  10. Light Scattering of Rough Orthogonal Anisotropic Surfaces with Secondary Most Probable Slope Distributions

    Institute of Scientific and Technical Information of China (English)

    LI Hai-Xia; CHENG Chuan-Fu

    2011-01-01

    We study the light scattering of an orthogonal anisotropic rough surface with a secondary most probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented in a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct the system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.

  11. The Probability Distribution of Inter-car Spacings

    Science.gov (United States)

    Xian, Jin Guo; Han, Dong

    In this paper, the cellular automaton model with a Fukui-Ishibashi-type acceleration rule is used to study the inter-car spacing distribution for traffic flow. The method used in complex network analysis is applied to study the spacing distribution. By theoretical analysis, we obtain the result that the distribution of inter-car spacings follows a power law when the vehicle density is low and the spacing is not large, while, when the vehicle density is high or the spacing is large, the distribution can be described by an exponential distribution. Moreover, numerical simulations support the theoretical result.
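
    A toy sketch of a Fukui-Ishibashi-type simulation on a ring, from which the spacing histogram can be read off; the lattice size, density, maximum speed and slowdown probability are arbitrary, and the exact form of the stochastic delay rule is an assumption rather than the paper's specification.

      import numpy as np
      from collections import Counter

      rng = np.random.default_rng(4)
      L, density, v_max, p_slow, steps = 10_000, 0.1, 5, 0.1, 2_000

      positions = np.sort(rng.choice(L, size=int(L * density), replace=False))
      for _ in range(steps):
          gaps = (np.roll(positions, -1) - positions) % L - 1    # empty cells ahead of each car
          v = np.minimum(v_max, gaps)                            # jump as far as the gap allows
          at_full_speed = v == v_max
          v[at_full_speed & (rng.random(v.size) < p_slow)] -= 1  # stochastic delay only at full speed
          positions = np.sort((positions + v) % L)

      gaps = (np.roll(positions, -1) - positions) % L - 1
      spacing_counts = Counter(gaps.tolist())
      print("spacing -> count:", sorted(spacing_counts.items())[:10])

    Repeating the run at low and high densities and plotting the histogram on log-log and semi-log axes is enough to see the power-law and exponential regimes discussed above.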

  12. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    Science.gov (United States)

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  13. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    of a station based on the plan of operation and the minimum headway times. However, none of the above methods take a given timetable into account when the complexity of the station is calculated. E.g., two timetable candidates are given following the same plan of operation in a station; one will be more...... vulnerable to delays (less robust) while the other will be less vulnerable (more robust), but this cannot be measured by the above methods. In the light of this, the article will describe a new method where the complexity of a given station with a given timetable can be calculated based on a probability...... delays caused by interdependencies, and result in a more robust operation. Currently, three methods to calculate the complexity of a station exist: 1. Complexity of a station based on the track layout 2. Complexity of a station based on the probability of a conflict using a plan of operation 3. Complexity

  14. Martingale Couplings and Bounds on the Tails of Probability Distributions

    CERN Document Server

    Luh, Kyle J

    2011-01-01

    Hoeffding has shown that tail bounds on the distribution for sampling from a finite population with replacement also apply to the corresponding cases of sampling without replacement. (A special case of this result is that binomial tail bounds apply to the corresponding hypergeometric tails.) We give a new proof of Hoeffding's result by constructing a martingale coupling between the sampling distributions. This construction is given by an explicit combinatorial procedure involving balls and urns. We then apply this construction to create martingale couplings between other pairs of sampling distributions, both without replacement and with "surreplacement" (that is, sampling in which not only is the sampled individual replaced, but some number of "copies" of that individual are added to the population).
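
    The with/without-replacement comparison can be checked numerically; the sketch below contrasts hypergeometric and binomial upper tails with the classical Hoeffding bound for the binomial case, using arbitrary population and sample sizes.

      import numpy as np
      from scipy import stats

      N, K, n = 1000, 300, 50          # population size, successes in the population, sample size
      p = K / N
      for k in (20, 25, 30, 35):
          t = k / n - p
          hyper_tail = stats.hypergeom.sf(k - 1, N, K, n)   # P(X >= k), sampling without replacement
          binom_tail = stats.binom.sf(k - 1, n, p)          # P(X >= k), sampling with replacement
          bound = np.exp(-2.0 * n * t ** 2)                 # Hoeffding bound for the binomial tail
          print(f"k = {k:2d}   hypergeom {hyper_tail:.2e}   binom {binom_tail:.2e}   bound {bound:.2e}")

    By the result discussed above, the binomial tail bound also covers the hypergeometric tail, which the printed values illustrate.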

  15. APL: An angle probability list to improve knowledge-based metaheuristics for the three-dimensional protein structure prediction.

    Science.gov (United States)

    Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio

    2015-12-01

    Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein only from its amino acid sequence remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. Results achieved suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space.

  16. Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals

    Indian Academy of Sciences (India)

    K R Parthasarathy

    2007-11-01

    By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.

  17. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Science.gov (United States)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
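
    A condensed sketch of such a workflow, assuming the fitted quantities are inter-event times of M ≥ 6.0 earthquakes and using SciPy's weibull_min and invweibull (Frechet) families in place of Easyfit and Matlab; the synthetic inter-event times and the chosen elapsed times are purely illustrative.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      inter_event = rng.weibull(1.3, size=40) * 8.0        # stand-in for inter-event times (years)

      weib = stats.weibull_min.fit(inter_event, floc=0)
      frech = stats.invweibull.fit(inter_event, floc=0)    # Frechet is the inverse Weibull in SciPy
      print("K-S Weibull:", stats.kstest(inter_event, "weibull_min", args=weib).statistic)
      print("K-S Frechet:", stats.kstest(inter_event, "invweibull", args=frech).statistic)

      def conditional_prob(dist, params, elapsed, horizon):
          """P(next event within `horizon` years | `elapsed` years have already passed quietly)."""
          numer = dist.cdf(elapsed + horizon, *params) - dist.cdf(elapsed, *params)
          return numer / dist.sf(elapsed, *params)

      for elapsed in (5.0, 10.0, 20.0):
          prob = conditional_prob(stats.weibull_min, weib, elapsed, 10.0)
          print(f"elapsed {elapsed:4.0f} y -> P(event within next 10 y) = {prob:.3f}")

    Dropping the floc=0 constraint gives the three-parameter Weibull variant mentioned above.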

  18. Exact probability distribution for the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise

    Science.gov (United States)

    Calisto, H.; Bologna, M.

    2007-05-01

    We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. Also we show that the mean value of x(t) in the latter model always approaches asymptotically the value 1.

  19. The equilibrium probability distribution of a conductive sphere's floating charge in a collisionless, drifting Maxwellian plasma

    CERN Document Server

    Thomas, Drew M

    2013-01-01

    A dust grain in a plasma has a fluctuating electric charge, and past work concludes that spherical grains in a stationary, collisionless plasma have an essentially Gaussian charge probability distribution. This paper extends that work to flowing plasmas and arbitrarily large spheres, deriving analytic charge probability distributions up to normalizing constants. We find that these distributions also have good Gaussian approximations, with analytic expressions for their mean and variance.

  20. Constructing the probability distribution function for the total capacity of a power system

    Energy Technology Data Exchange (ETDEWEB)

    Vasin, V.P.; Prokhorenko, V.I.

    1980-01-01

    The difficulties involved in constructing the probability distribution function for the total capacity of a power system consisting of numerous power plants are discussed. A method is considered for the approximate determination of such a function by a Monte Carlo method and by exact calculation based on special recursion formulas on a particular grid of argument values. It is shown that there may be significant deviations between the true probability distribution and a normal distribution.
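
    A minimal sketch of both routes named above, under the common assumption that each plant is an independent two-point random variable (full capacity with probability 1 - q, zero on forced outage with probability q); the unit list and outage rates are invented, and the recursion shown is the standard capacity-outage-table convolution rather than the paper's specific formulas.

      import numpy as np

      units = [(200, 0.05), (200, 0.05), (150, 0.08), (100, 0.10), (100, 0.10)]   # (MW, forced-outage rate)
      total = sum(cap for cap, _ in units)

      # Exact distribution of available capacity by recursive convolution on a 1 MW grid.
      pmf = np.zeros(total + 1)
      pmf[0] = 1.0
      for cap, q in units:
          new = q * pmf                                   # unit on outage: capacity unchanged
          new[cap:] += (1.0 - q) * pmf[: pmf.size - cap]  # unit available: distribution shifted by its capacity
          pmf = new
      print("exact        P(available >= 500 MW) =", pmf[500:].sum())

      # Monte Carlo estimate of the same probability.
      rng = np.random.default_rng(6)
      draws = sum(cap * (rng.random(100_000) > q) for cap, q in units)
      print("Monte Carlo  P(available >= 500 MW) =", np.mean(draws >= 500))

    The exact table and the Monte Carlo estimate should agree to within sampling error, and the full distribution makes visible the deviations from a normal approximation noted above.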

  1. Generalized parton distributions and wide-angle exclusive scattering

    CERN Document Server

    Kroll, P

    2004-01-01

    The handbag mechanism for wide-angle exclusive scattering reactions is discussed and compared with other theoretical approaches. Its application to Compton scattering, meson photoproduction and two-photon annihilations into pairs of hadrons is reviewed.

  2. A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures

    Institute of Scientific and Technical Information of China (English)

    李典庆; 张圣坤; 唐文勇

    2003-01-01

    Model uncertainty exists in the probability of detection for inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new POD model is proposed for the updating of the crack size distribution. Furthermore, the theoretical derivation shows that most existing POD models are special cases of the new model. The least squares method is adopted for determining the values of the parameters in the new POD model. This new model is also compared with other existing POD models. The results indicate that the new POD model can fit the inspection data better. The new model is then applied to the analysis of the problem of crack size updating for offshore structures. The Bayesian updating method is used to analyze the effect of POD models on the posterior distribution of the crack size. The results show that different POD models generate different posterior distributions of the crack size for offshore structures.

  3. Some possible q-exponential type probability distribution in the non-extensive statistical physics

    Science.gov (United States)

    Chung, Won Sang

    2016-08-01

    In this paper, we present two exponential type probability distributions which are different from Tsallis's case, which we call Type I: one given by $p_i = \frac{1}{Z_q}[e_q(E_i)]^{-\beta}$ (Type IIA) and another given by $p_i = \frac{1}{Z_q}[e_q(-\beta)]^{E_i}$ (Type IIIA). Starting with the Boltzmann-Gibbs entropy, we obtain the different probability distributions by using the Kolmogorov-Nagumo average for the microstate energies. We present the first-order differential equations related to Types I, II and III. For the three types of probability distributions, we discuss the quantum harmonic oscillator, the two-level problem and the spin-1/2 paramagnet.
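
    A short sketch evaluating the three forms for a two-level system, taking the Type I weights to be the usual Tsallis form e_q(-βE_i) (an assumption, since the abstract only names Type I) and the Type II/III weights exactly as written above; the energies, β and q values are arbitrary.

      import numpy as np

      def e_q(x, q):
          """Tsallis q-exponential [1 + (1 - q) x]_+^(1/(1 - q)), reducing to exp(x) as q -> 1."""
          if np.isclose(q, 1.0):
              return np.exp(x)
          return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

      def normalise(weights):
          return weights / weights.sum()

      E = np.array([0.0, 1.0])                      # two-level system
      beta, q = 1.0, 1.2

      p_I   = normalise(e_q(-beta * E, q))          # Type I (assumed Tsallis form)
      p_II  = normalise(e_q(E, q) ** (-beta))       # Type IIA: [e_q(E_i)]^(-beta) / Z_q
      p_III = normalise(e_q(-beta, q) ** E)         # Type IIIA: [e_q(-beta)]^(E_i) / Z_q
      print("Type I  :", p_I)
      print("Type II :", p_II)
      print("Type III:", p_III)

    All three reduce to the ordinary Boltzmann-Gibbs weights as q approaches 1, which is a quick consistency check for the sketch.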

  4. Probability distributions for directed polymers in random media with correlated noise

    Science.gov (United States)

    Chu, Sherry; Kardar, Mehran

    2016-07-01

    The probability distribution for the free energy of directed polymers in random media (DPRM) with uncorrelated noise in d =1 +1 dimensions satisfies the Tracy-Widom distribution. We inquire if and how this universal distribution is modified in the presence of spatially correlated noise. The width of the distribution scales as the DPRM length to an exponent β , in good (but not full) agreement with previous renormalization group and numerical results. The scaled probability is well described by the Tracy-Widom form for uncorrelated noise, but becomes symmetric with increasing correlation exponent. We thus find a class of distributions that continuously interpolates between Tracy-Widom and Gaussian forms.

  5. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Rényi and Hölder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained for the mutual divergence among two or more probability distributions.

  6. The probability distribution of fatigue damage and the statistical moment of fatigue life

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 高镇同

    1997-01-01

    The randomization of the deterministic fatigue damage equation leads to a stochastic differential equation and a Fokker-Planck equation affected by random fluctuation. By means of the solution of this equation, the probability distribution of fatigue damage as a function of time is obtained. Then the statistical moment of fatigue life in consideration of the stationary random fluctuation is derived. Finally, the damage probability distributions during fatigue crack initiation and fatigue crack growth are given.

  7. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known the resulting...... uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straightforward application of fuzzy arithmetic in general results in overestimation of interval
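
    A small Monte Carlo sketch of the probabilistic branch described above, propagating assumed input distributions through an invented non-monotonic profit function (all names and numbers are illustrative, not the paper's):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        price  = rng.normal(10.0, 1.0, n)                  # assumed uncertain unit price
        volume = rng.triangular(800.0, 1000.0, 1300.0, n)  # assumed uncertain sales volume
        # quadratic capacity penalty makes profit non-monotonic in volume over the sampled range
        profit = price * volume - 2.0 * volume - 0.05 * (volume - 1000.0) ** 2

        print("mean profit          :", profit.mean())
        print("5th / 95th percentile:", np.percentile(profit, [5, 95]))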

  8. Simultaneous distribution between the deflection angle and the lateral displacement under the Moliere theory of multiple scattering

    Energy Technology Data Exchange (ETDEWEB)

    Nakatsuka, Takao [Okayama Shoka University, Laboratory of Information Science, Okayama (Japan); Okei, Kazuhide [Kawasaki Medical School, Dept. of Information Sciences, Kurashiki (Japan); Iyono, Atsushi [Okayama university of Science, Dept. of Fundamental Science, Faculty of Science, Okayama (Japan); Bielajew, Alex F. [Univ. of Michigan, Dept. Nuclear Engineering and Radiological Sciences, Ann Arbor, MI (United States)

    2015-12-15

    The simultaneous distribution between the deflection angle and the lateral displacement of fast charged particles traversing matter is derived by applying numerical inverse Fourier transforms to the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show a simultaneous Gaussian distribution in the region of both small deflection angle and small lateral displacement, though they show the characteristic contour patterns of probability density specific to single and double scattering in the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed-energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and the efficiency of experimental analyses and simulation studies relating to charged-particle transport. (orig.)

  9. A simple derivation and classification of common probability distributions based on information symmetry and measurement scale

    CERN Document Server

    Frank, Steven A

    2010-01-01

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated common distributions without any underlying conceptual basis for why common distributions have particular forms and what explains the familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude to information in different ways, an association we describe in terms of the relation between information invariance and measurement scale....

  10. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
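
    A hedged sketch of the single-integral form and trapezoidal evaluation described above, with illustrative (not the paper's) lognormal required-friction and normal available-friction distributions:

        import numpy as np
        from scipy import stats

        required  = stats.lognorm(s=0.25, scale=0.22)   # assumed required friction coefficient
        available = stats.norm(loc=0.45, scale=0.10)    # assumed available friction coefficient

        u = np.linspace(0.0, 1.5, 4001)
        integrand = required.pdf(u) * available.cdf(u)  # P(available < u), weighted by f_required(u)
        p_slip = np.trapz(integrand, u)                 # single integral, trapezoidal rule

        # direct Monte Carlo cross-check
        rng = np.random.default_rng(0)
        mc = np.mean(available.rvs(200_000, random_state=rng) < required.rvs(200_000, random_state=rng))
        print(f"trapezoidal: {p_slip:.4f}   Monte Carlo: {mc:.4f}")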

  11. WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Smagin

    2016-12-01

    Full Text Available The Wiener – Hopf solver with smooth probability distributions of its component is presented. The method is based on hyper delta approximations of initial distributions. The use of Fourier series transformation and characteristic function allows working with the random variable method concentrated in transversal axis of absc.

  12. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    Science.gov (United States)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20 km in diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baseline) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was
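
    A hedged sketch of the kind of baseline-dependent slope statistics discussed above, using a synthetic smooth surface in place of a NAC DEM (grid spacing, relief amplitude and baseline are illustrative assumptions):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(2)
        nx, dx = 512, 2.0                                                  # grid size and 2-m post spacing
        z = 200.0 * gaussian_filter(rng.normal(size=(nx, nx)), sigma=20)   # smooth synthetic relief (m)

        baseline = 16.0                                  # metres (a ~15-m baseline rounded to the grid)
        step = int(round(baseline / dx))
        dz_x = z[:, step:] - z[:, :-step]
        dz_y = z[step:, :] - z[:-step, :]
        slope = np.degrees(np.arctan(np.hypot(dz_x[:-step, :], dz_y[:, :-step]) / baseline)).ravel()

        mu, sigma = slope.mean(), slope.std()
        print(f"mean slope {mu:.2f} deg, std {sigma:.2f} deg")
        print("empirical 99th percentile  :", np.percentile(slope, 99))
        print("Gaussian-approx 99th pctile:", mu + 2.326 * sigma)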

  13. A Class of Chaotic Sequences with Gauss Probability Distribution for Radar Mask Jamming

    Institute of Scientific and Technical Information of China (English)

    Ni-Ni Rao; Yu-Chuan Huang; Bin Liu

    2007-01-01

    A simple generation approach for chaotic sequences with a Gaussian probability distribution is proposed. Theoretical analysis and simulation based on the Logistic chaotic model show that the approach is feasible and effective. The distribution characteristics of the novel chaotic sequence are comparable to those of the standard normal distribution. Its mean and variance can be changed to the desired values. The novel sequences also have good randomness. The applications for radar mask jamming are analyzed.
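
    One standard construction (not necessarily the authors' exact scheme) maps the invariant density of the fully chaotic logistic map to a uniform variate and then to a Gaussian variate through the inverse normal CDF:

        import numpy as np
        from scipy.stats import norm

        n, mu, sigma = 100_000, 0.0, 1.0
        x = np.empty(n)
        x[0] = 0.37                                   # arbitrary seed in (0, 1)
        for i in range(1, n):
            x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])  # fully chaotic logistic map (r = 4)

        u = (2.0 / np.pi) * np.arcsin(np.sqrt(x))     # invariant density maps to Uniform(0, 1)
        u = np.clip(u, 1e-12, 1.0 - 1e-12)            # guard the tails of the inverse CDF
        g = mu + sigma * norm.ppf(u)                  # approximately N(mu, sigma^2) marginal

        print("sample mean / std:", g.mean(), g.std())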

  14. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    Science.gov (United States)

    Marshman, Emily; Singh, Chandralekha

    2017-03-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations.

  15. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    Science.gov (United States)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  16. Probability collectives a distributed multi-agent system approach for optimization

    CERN Document Server

    Kulkarni, Anand Jayant; Abraham, Ajith

    2015-01-01

    This book provides an emerging computational intelligence tool in the framework of collective intelligence for modeling and controlling distributed multi-agent systems, referred to as Probability Collectives. In the modified Probability Collectives methodology a number of constraint handling techniques are incorporated, which also reduce the computational complexity and improve the convergence and efficiency. Numerous examples and real-world problems are used for illustration, which may also allow the reader to gain further insight into the associated concepts.

  17. The Exit Distribution for Smart Kinetic Walk with Symmetric and Asymmetric Transition Probability

    Science.gov (United States)

    Dai, Yan

    2017-03-01

    It has been proved that the distribution of the point where the smart kinetic walk (SKW) exits a domain converges in distribution to harmonic measure on the hexagonal lattice. For other lattices, it is believed that this result still holds, and there is good numerical evidence to support this conjecture. Here we examine the effect of symmetry and asymmetry of the transition probability on each step of the SKW on the square lattice and test whether the exit distribution converges in distribution to harmonic measure as well. From our simulations, the limiting exit distribution of the SKW with a non-uniform but symmetric transition probability, as the lattice spacing goes to zero, is the harmonic measure. This result does not hold for asymmetric transition probabilities. We are also interested in the difference between the exit distribution of the SKW with symmetric transition probability and harmonic measure. Our simulations provide strong support for an explicit conjecture about this first-order difference. The explicit formula for the conjecture will be given below.

  18. Ruin Probability and Joint Distributions of Some Actuarial Random Vectors in the Compound Pascal Model

    Institute of Scientific and Technical Information of China (English)

    Xian-min Geng; Shu-chen Wan

    2011-01-01

    The compound negative binomial model, introduced in this paper, is a discrete time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of T, U(T − 1), |U(T)| and inf_{0≤n<T_1} U(n) (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin and the maximal deficit from ruin to recovery), and the distributions of some actuarial random vectors.

  19. Net-charge probability distributions in heavy ion collisions at chemical freeze-out

    CERN Document Server

    Braun-Munzinger, P; Karsch, F; Redlich, K; Skokov, V

    2011-01-01

    We explore net charge probability distributions in heavy ion collisions within the hadron resonance gas model. The distributions for strangeness, electric charge and baryon number are derived. We show that, within this model, net charge probability distributions and the resulting fluctuations can be computed directly from the measured yields of charged and multi-charged hadrons. The influence of multi-charged particles and quantum statistics on the shape of the distribution is examined. We discuss the properties of the net proton distribution along the chemical freeze-out line. The model results presented here can be compared with data at RHIC energies and at the LHC to possibly search for the relation between chemical freeze-out and QCD cross-over lines in heavy ion collisions.
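
    A hedged sketch of the singly-charged baseline implied by this picture: if the yields of positively and negatively charged hadrons are independent Poisson variables, the net charge follows a Skellam distribution (the paper additionally treats multi-charged hadrons and quantum statistics; the mean multiplicities below are illustrative):

        from scipy.stats import skellam

        n_plus, n_minus = 12.0, 10.0          # illustrative mean multiplicities of +1 and -1 hadrons
        dist = skellam(n_plus, n_minus)

        print("P(net charge = 0, 1, 2, 3):", dist.pmf([0, 1, 2, 3]))
        print("mean:", dist.mean(), " variance:", dist.var())   # mean = n+ - n-, variance = n+ + n-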

  20. THE LEBESGUE-STIELJES INTEGRAL AS APPLIED IN PROBABILITY DISTRIBUTION THEORY

    Science.gov (United States)

    bounded variation and Borel measurable functions are set forth in the introduction. Chapter 2 is concerned with establishing a one-to-one correspondence between Lebesgue-Stieljes measures and certain equivalence classes of functions which are monotone non-decreasing and continuous on the right. In Chapter 3 the Lebesgue-Stieljes integral is defined and some of its properties are demonstrated. In Chapter 4 the probability distribution function is defined and the notions in Chapters 2 and 3 are used to show that the Lebesgue-Stieljes integral of any probability distribution

  1. LAGRANGE MULTIPLIERS IN THE PROBABILITY DISTRIBUTIONS ELICITATION PROBLEM: AN APPLICATION TO THE 2013 FIFA CONFEDERATIONS CUP

    Directory of Open Access Journals (Sweden)

    Diogo de Carvalho Bezerra

    2015-12-01

    Full Text Available Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.

  2. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    Science.gov (United States)

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programmed theories. If programmed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine which is the case, a special type of binomial model based on the generalized Pareto distribution has been applied to data on Japanese centenarians. From the results, it is concluded that the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years.
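
    A hedged sketch of the underlying idea: fit a generalized Pareto distribution to lifetimes in excess of a high threshold; a negative shape parameter implies a finite upper endpoint at threshold − scale/shape. The synthetic excess lifetimes below stand in for the centenarian data:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(3)
        threshold = 100.0
        # synthetic excess lifetimes (years beyond the threshold) from a GPD with a finite endpoint near 123
        true_shape, true_scale = -0.2, 4.6
        excess = genpareto.rvs(true_shape, scale=true_scale, size=5000, random_state=rng)

        shape, loc, scale = genpareto.fit(excess, floc=0.0)
        endpoint = threshold - scale / shape if shape < 0 else np.inf
        print(f"fitted shape {shape:.3f}, scale {scale:.2f}, implied upper limit {endpoint:.1f} years")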

  3. Comparative assessment of surface fluxes from different sources using probability density distributions

    Science.gov (United States)

    Gulev, Sergey; Tilinina, Natalia; Belyaev, Konstantin

    2015-04-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using a framework of probability distributions for sensible and latent heat fluxes. For approximation of the probability distributions and estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g. 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of the probability distributions of surface fluxes and in extreme surface flux values between different reanalyses are found in the western boundary current extension regions and at high latitudes, while the largest differences in the fractional contributions of surface fluxes occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions.

  4. Importance measures for imprecise probability distributions and their sparse grid solutions

    Institute of Scientific and Technical Information of China (English)

    WANG; Pan; LU; ZhenZhou; CHENG; Lei

    2013-01-01

    For the imprecise probability distribution of a structural system, the variance-based importance measures (IMs) of the inputs are investigated, and three IMs are defined for the cases of random distribution parameters, interval distribution parameters, and a mixture of those two types of distribution parameters. The defined IMs can reflect the influence of the inputs on the output of the structural system with imprecise distribution parameters, respectively. Due to the large computational cost of the variance-based IMs, a sparse grid method is employed in this work to compute the variance-based IMs at each reference point of the distribution parameters. For the three imprecise distribution parameter cases, the sparse grid method and the combination of the sparse grid method with a genetic algorithm are used to compute the defined IMs. Numerical and engineering examples are employed to demonstrate the rationality of the defined IMs and the efficiency of the applied methods.

  5. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fit distribution to the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results when selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.
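
    A hedged sketch of the model-selection step: fit the candidate distributions to a pollutant series and rank them with goodness-of-fit measures (here the Kolmogorov-Smirnov statistic and AIC, which are common choices but not necessarily the five criteria used in the paper); the data are simulated, not the Kuala Lumpur record:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        data = stats.gamma.rvs(2.5, scale=20.0, size=2000, random_state=rng)  # stand-in pollutant series

        candidates = {
            "lognormal":   stats.lognorm,
            "exponential": stats.expon,
            "gamma":       stats.gamma,
            "weibull":     stats.weibull_min,
        }

        for name, dist in candidates.items():
            params = dist.fit(data)
            ks = stats.kstest(data, dist.cdf, args=params).statistic
            aic = 2 * len(params) - 2 * np.sum(dist.logpdf(data, *params))
            print(f"{name:12s}  KS = {ks:.4f}  AIC = {aic:.1f}")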

  6. Analysis of low probability of intercept (LPI) radar signals using the Wigner Distribution

    OpenAIRE

    Gau, Jen-Yu

    2002-01-01

    Approved for public release; distribution is unlimited. The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code,...

  7. Calculation of Radar Probability of Detection in K-Distributed Sea Clutter and Noise

    Science.gov (United States)

    2011-04-01

    There is no closed-form solution for the probability of detection in K-distributed clutter, so numerical methods are required. The K distribution is a compound model... the integration, with the nodes and weights calculated using matrix methods, so that a general-purpose numerical integration routine is not required.

  8. Statistical analysis of the Lognormal-Pareto distribution using Probability Weighted Moments and Maximum Likelihood

    OpenAIRE

    Marco Bee

    2012-01-01

    This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the first distribution, but not to the latter. Thus, in the lognormal-Generalized Pareto case, we work ou...

  9. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t

  10. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    吕渭济; 崔巍

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.

  11. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    LU Wei-ji; CUI Wei

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.

  12. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices; error detection codes allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. The classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. In turn, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code when errors are injected into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes: encoding functions with low computational complexity and a low masking probability offer the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, for a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking

  13. Generalized quantum Fokker-Planck, diffusion and Smoluchowski equations with true probability distribution functions

    CERN Document Server

    Banik, S K; Ray, D S; Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar

    2002-01-01

    Traditionally, the quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasi-probability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to the non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their co-ordinates and momenta, we derive a generalized quantum Langevin equation in c-numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion and Smoluchowski equations are the exact quantum analogues of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural ext...

  14. An investigation of the dose distribution effect related with collimator angle for VMAT method

    Science.gov (United States)

    Tas, B.; Bilge, H.; Ozturk, S. Tokdemir

    2016-03-01

    The aim of this study is to investigate the effect of collimator angle on dose distribution in eleven prostate cancer patients planned with single and double VMAT. We generated optimum single and double VMAT treatment plans with a collimator angle of 0°. We then recalculated the single VMAT plans at different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, 90°) and the double VMAT plans at (0°-0°, 15°-345°, 30°-330°, 45°-315°, 60°-300°, 75°-285°, 90°-270°) without changing any optimization parameters. The HI, DVH and 95% dose coverage of the PTV were calculated and analyzed, and better dose distributions were found for some collimator angles. Plans were verified using the two-dimensional ion chamber array Matrixx® and the three-dimensional Compass® software program. A higher 95% dose coverage of the PTV was found for single VMAT at the 15° collimator angle and for double VMAT at the 60°-300° and 75°-285° collimator angles. Because of lower rectum doses, we suggest 75°-285°. When comparing the single and double VMAT dose distributions, we obtained better 95% dose coverage of the PTV and a lower HI with double VMAT; this result was statistically significant. These findings are informative for choosing 75°-285° collimator angles in double VMAT plans for prostate cancer.

  15. Conical pitch angle distributions of very-low energy ion fluxes observed by ISEE 1

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, J.L.; Baugher, C.R.; Chappell, C.R.; Shelley, E.G.; Young, D.T.

    1982-04-01

    Observations of low-energy ionospheric ions by the plasma composition experiment aboard ISEE 1 often show conical pitch angle distributions, that is, peak fluxes between 0° and 90° to the directions parallel or antiparallel to the magnetic field. Frequently, all three primary ionospheric ion species (H+, He+, and O+) simultaneously exhibit conical distributions with peak fluxes at essentially the same pitch angle. A distinction is made here between unidirectional, or streaming, distributions, in which ions are traveling essentially from only one hemisphere, and symmetrical distributions, in which significant fluxes are observed traveling from both hemispheres. The orbital coverage for this survey was largely restricted to the night sector, approximately 2100-0600 LT, and moderate geomagnetic latitudes of 20°-40°. Also, lack of complete pitch angle coverage at all times may have reduced detection for conics with small cone angles. However, we may conclude that the unidirectional conical distributions observed in the northern hemisphere are always observed to be traveling from the northern hemisphere and that they exhibit the following characteristics relative to the symmetric distributions: they (1) are typically observed on higher L shells (that is, higher geomagnetic latitudes or larger geocentric distances or both), (2) tend to have significantly larger cone angles, and (3) are associated with higher magnetic activity levels.

  16. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    Science.gov (United States)

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of a pest in orchards can provide important information that could be used to design monitoring schemes and establish better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated by using probability kriging. Adults of B. minax were captured in two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of the adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on both the orchard and the woods sides of the boundary. The sequential probability kriged maps showed that the adults were estimated with higher probability in the marginal zone, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots and their distance to the forest edge should be considered to enhance control of B. minax in small-scale orchards.

  17. About the probability distribution of a quantity with given mean and variance

    CERN Document Server

    Olivares, Stefano

    2012-01-01

    Supplement 1 to GUM (GUM-S1) recommends the use of the maximum entropy principle (MaxEnt) in determining the probability distribution of a quantity having specified properties, e.g., specified central moments. When we only know the mean value and the variance of a variable, GUM-S1 prescribes a Gaussian probability distribution for that variable. When further information is available, in the form of a finite interval in which the variable is known to lie, we indicate how the distribution for the variable in this case can be obtained. A Gaussian distribution should only be used in this case when the standard deviation is small compared to the range of variation (the length of the interval). In general, when the interval is finite, the parameters of the distribution should be evaluated numerically, as suggested by I. Lira, Metrologia, 46 L27 (2009). Here we note that the knowledge of the range of variation is equivalent to a bias of the distribution toward a flat distribution in that range, and the principle of mini...

  18. Earthquake probabilities and magnitude distribution (M≥6.7) along the Haiyuan fault, northwestern China

    Institute of Scientific and Technical Information of China (English)

    冉洪流

    2004-01-01

    In recent years, researchers have studied paleoearthquakes along the Haiyuan fault and revealed many paleoearthquake events. All available information allows a more reliable analysis of earthquake recurrence intervals and earthquake rupture patterns along the Haiyuan fault. Based on this paleoseismological information, the recurrence probability and magnitude distribution for M≥6.7 earthquakes in the next 100 years along the Haiyuan fault can be obtained through weighted computation by using Poisson and Brownian passage time models and considering different rupture patterns. The result shows that the recurrence probability of MS≥6.7 earthquakes is about 0.035 in the next 100 years along the Haiyuan fault.

  19. Comparative assessment of surface fluxes from different sources: a framework based on probability distributions

    Science.gov (United States)

    Gulev, S.

    2015-12-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using a framework of probability distributions for sensible and latent heat fluxes. For approximation of the probability distributions and estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g. 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of the probability distributions of surface fluxes and in extreme surface flux values between different reanalyses are found in the western boundary current extension regions and at high latitudes, while the largest differences in the fractional contributions of surface fluxes occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions. Our assessment also discriminated different reanalyses and satellite products with respect to their ability to quantify the role of extreme surface turbulent fluxes in forming ocean heat release in different regions.

  20. Probability distribution of surface wind speed induced by convective adjustment on Venus

    Science.gov (United States)

    Yamamoto, Masaru

    2017-03-01

    The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized Weather Research and Forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s-1 and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution with a peak at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[-(V/c)^k] with the best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 10^7 J m-2, which is calculated from the difference in total potential energy between the initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in experiments where the initial unstable-layer thickness is altered. The present work suggests that convective adjustment is a promising process for producing the wind structure, occasionally generating surface winds of ∼1 m s-1 and retrograde wind patches.
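
    A hedged numerical check of the Weibull description quoted above, P(>V) = exp[-(V/c)^k], with placeholder c and k values rather than the best-estimate coefficients of Lorenz (2016):

        import numpy as np

        c, k = 0.4, 1.6                          # illustrative scale (m/s) and shape parameters
        rng = np.random.default_rng(5)
        v = c * rng.weibull(k, size=200_000)     # numpy's weibull sampler has unit scale

        for threshold in (0.2, 0.5, 1.0):
            empirical = np.mean(v > threshold)
            analytic = np.exp(-(threshold / c) ** k)
            print(f"P(V > {threshold:3.1f} m/s): empirical {empirical:.4f}, analytic {analytic:.4f}")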

  1. Pitch Angle Distribution Evolution of Energetic Electrons by Whistler-Mode Chorus

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-Nan; SU Zhen-Peng; XIONG Ming

    2008-01-01

    We develop a two-dimensional momentum and pitch angle code to solve the typical Fokker-Planck equation which governs wave-particle interaction in space plasmas. We carry out detailed calculations of momentum and pitch angle diffusion coefficients, and of the temporal evolution of the pitch angle distribution, for a band of chorus frequencies distributed over a standard Gaussian spectrum, particularly in the heart of the Earth's radiation belt at L = 4.5, where peaks of the electron phase space density are observed. We find that whistler-mode chorus can produce significant acceleration of electrons at large pitch angles, and can enhance the phase space density for energies of 0.5-1 MeV by a factor of 10 or above after about 24 h. This result can account for the observation of significant enhancement in the flux of energetic electrons during the recovery phase of a geomagnetic storm.

  2. Distribution of magnetic shear angle at the ascending phase of CYCLE 23

    Institute of Scientific and Technical Information of China (English)

    敦金平; 张洪起; 张柏荣; 李如风

    2002-01-01

    Using the vector magnetograms observed at the Huairou Solar Observing Station of the National Astronomical Observatories, the magnetic shear angles of solar active regions in the ascending phase of cycle 23 (1996-2000) were calculated. It is found that the statistical distribution of the magnetic shear angles can be fitted well by Gaussian curves, and that the dominant sign of the magnetic shear angle is negative (positive) in the northern (southern) hemisphere. This is consistent with the N-S sign asymmetry of the force-free field constant α and the current helicity.

  3. Explicit Expressions for the Ruin Probabilities of Erlang Risk Processes with Pareto Individual Claim Distributions

    Institute of Scientific and Technical Information of China (English)

    Li Wei; Hai-liang Yang

    2004-01-01

    In this paper we first consider a risk process in which claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line, and the integrand is not of an oscillating kind. Then we show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.

  4. Estimating the probability distribution of von Mises stress for structures undergoing random excitation. Part 1: Derivation

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, D.; Reese, G.

    1998-09-01

    The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. For deterministic loads, both static and dynamic, the calculation of von Mises stress is straightforward, as is the resulting calculation of reliability. For loads modeled as random processes, the task is different; the response to such loads is itself a random process and its properties must be determined in terms of those of both the loads and the system. This has been done in the past by Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the problem. Here, the authors present a method that provides analytic expressions for the probability distributions of von Mises stress which can be evaluated efficiently and with good precision numerically. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution.
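
    A hedged Monte Carlo sketch of the kind of quantity the report treats analytically: sample a zero-mean Gaussian stress-component vector with an assumed covariance, form the von Mises stress, and estimate its distribution (the covariance values are illustrative):

        import numpy as np

        rng = np.random.default_rng(6)
        n = 200_000
        # assumed covariance of (s_xx, s_yy, s_zz, s_xy, s_yz, s_zx) in MPa^2
        cov = np.diag([400.0, 400.0, 100.0, 50.0, 50.0, 50.0])
        s = rng.multivariate_normal(np.zeros(6), cov, size=n)
        sxx, syy, szz, sxy, syz, szx = s.T

        vm = np.sqrt(0.5 * ((sxx - syy) ** 2 + (syy - szz) ** 2 + (szz - sxx) ** 2)
                     + 3.0 * (sxy ** 2 + syz ** 2 + szx ** 2))

        print("mean von Mises stress (MPa):", vm.mean())
        print("99.9th percentile (MPa)    :", np.percentile(vm, 99.9))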

  5. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula gives P(F) ∝ exp(-AF^2), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere potentials at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.

  6. Conditional probability distribution (CPD) method in temperature based death time estimation: Error propagation analysis.

    Science.gov (United States)

    Hubig, Michael; Muggenthaler, Holger; Mall, Gita

    2014-05-01

    Bayesian estimation applied to temperature-based death time estimation was recently introduced as the conditional probability distribution or CPD method by Biermann and Potente. The CPD method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g. no-alibi intervals of suspects) within the large true death time interval. In the light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of the input error. Moreover, we observed the paradox that in cases in which the small no-alibi time interval is located at the boundary of the true death time interval, adjacent to the erroneous death time estimate, the CPD-computed probabilities for that small no-alibi interval will increase with increasing input deviation; otherwise the CPD-computed probabilities will decrease. We therefore advise against using the CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95%-confidence intervals of the estimate still overlap the true death time interval.

  7. An Undersea Mining Microseism Source Location Algorithm Considering Wave Velocity Probability Distribution

    OpenAIRE

    2014-01-01

    The traditional mine microseism locating methods are mainly based on the assumption that the wave velocity is uniform through space, which leads to errors because the assumption does not hold in reality. In this paper, the wave velocity is regarded as a random variable, and the probability distribution information of the wave velocity is fused into the traditional locating method. This paper puts forward a microseism source location method for undersea mining on condition o...

  8. Storm-Time Evolution of Energetic Electron Pitch Angle Distributions by Wave-Particle Interaction

    Institute of Scientific and Technical Information of China (English)

    XIAO Fuliang; HE Huiyong; ZHOU Qinghua; WU Guanhong; SHI Xianghua

    2008-01-01

    The quasi-pure pitch-angle scattering of energetic electrons driven by field-aligned propagating whistler-mode waves during the 9-15 October 1990 magnetic storm at L ≈ 3-4 is studied, and numerical calculations for energetic electrons in gyroresonance with a band of whistler-mode wave frequencies distributed over a standard Gaussian spectrum are performed. It is found that the whistler-mode waves can efficiently drive energetic electrons from larger pitch angles into the loss cone, and lead to a flat-top distribution during the main phase of geomagnetic storms. This result perhaps presents a feasible interpretation for the observed time evolution of the quasi-isotropic pitch-angle distribution by the Combined Release and Radiation Effects Satellite (CRRES) spacecraft at L ≈ 3-4.

  9. Pauling resonant structures in real space through electron number probability distributions.

    Science.gov (United States)

    Pendas, A Martín; Francisco, E; Blanco, M A

    2007-02-15

    A general hierarchy of the coarse-grained electron probability distributions induced by exhaustive partitions of the physical space is presented. It is argued that when the space is partitioned into atomic regions, the consideration of these distributions may provide a first step toward an orbital-invariant treatment of resonant structures. We also show that, in this case, the total molecular energy and its components may be partitioned into structure contributions, providing a fruitful extension of the recently developed interacting quantum atoms approach (J. Chem. Theory Comput. 2005, 1, 1096). The above ideas are explored in the hydrogen molecule, where a complete statistical and energetic decomposition into covalent and ionic terms is presented.

  10. Optimal design of unit hydrographs using probability distribution and genetic algorithms

    Indian Academy of Sciences (India)

    Rajib Kumar Bhattacharjya

    2004-10-01

    A nonlinear optimization model is developed to transmute a unit hydrograph into a probability distribution function (PDF). The objective function is to minimize the sum of the squares of the deviations between the predicted and actual direct runoff hydrographs of a watershed. The predicted runoff hydrograph is estimated by using a PDF. In a unit hydrograph, the depth of rainfall excess must be unity and the ordinates must be positive. Incorporation of a PDF ensures that the depth of rainfall excess for the unit hydrograph is unity and the ordinates are positive. Unit hydrograph ordinates are in terms of intensity of rainfall excess on a discharge per unit catchment area basis, the unit area thus representing the unit rainfall excess. The proposed method does not have any constraints. The nonlinear optimization formulation is solved using binary-coded genetic algorithms. The number of variables to be estimated by optimization is the same as the number of probability distribution parameters; gamma and log-normal probability distributions are used. The existing nonlinear programming model for obtaining the optimal unit hydrograph has also been solved using genetic algorithms, where the constrained nonlinear optimization problem is converted to an unconstrained problem using a penalty parameter approach. The results obtained are compared with those obtained by the earlier LP model and are fairly similar.
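
    A hedged sketch of the core idea: represent the unit hydrograph by a gamma probability density (so the ordinates are positive and integrate to unity) and fit its two parameters by least squares; a Nelder-Mead optimizer stands in here for the paper's genetic algorithm, and the "observed" hydrograph is synthetic:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import gamma

        t = np.arange(0.0, 48.0, 1.0)                       # hours
        observed = (gamma.pdf(t, a=3.2, scale=4.5)
                    + 0.002 * np.random.default_rng(7).normal(size=t.size))  # synthetic "observed" ordinates

        def sse(params):
            a, scale = params
            if a <= 0.0 or scale <= 0.0:
                return 1e9                                  # keep the search in the valid region
            return np.sum((gamma.pdf(t, a=a, scale=scale) - observed) ** 2)

        result = minimize(sse, x0=[2.0, 3.0], method="Nelder-Mead")
        print("fitted gamma shape and scale:", result.x)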

  11. Pore size distribution, survival probability, and relaxation time in random and ordered arrays of fibers

    Science.gov (United States)

    Tomadakis, Manolis M.; Robertson, Teri J.

    2003-07-01

    We present a random walk based investigation of the pore size probability distribution and its moments, the survival probability and mean survival time, and the principal relaxation time, for random and ordered arrays of cylindrical fibers of various orientation distributions. The dimensionless mean survival time, principal relaxation time, mean pore size, and mean square pore size are found to increase with porosity, remain practically independent of the directionality of random fiber beds, and attain lower values for ordered arrays. Wide pore size distributions are obtained for random fiber structures and relatively narrow for ordered square arrays, all in very good agreement with theoretically predicted limiting values. Analytical results derived for the pore size probability and its lower moments for square arrays of fibers practically coincide with the corresponding simulation results. Earlier variational bounds on the mean survival time and principal relaxation time are obeyed by our numerical results in all cases, and are found to be quite sharp up to very high porosities. Dimensionless groups representing the deviation of such bounds from our simulation results vary in practically the same range as the corresponding values reported earlier for beds of spherical particles. A universal scaling expression of the literature relating the mean survival time to the mean pore size [S. Torquato and C. L. Y. Yeong, J. Chem. Phys. 106, 8814 (1997)] agrees very well with our results for all types of fiber structures, thus validated for the first time for anisotropic porous media.

  12. Probability distribution of financial returns in a model of multiplicative Brownian motion with stochastic diffusion coefficient

    Science.gov (United States)

    Silva, Antonio

    2005-03-01

    It is well known that the mathematical theory of Brownian motion was first developed in the Ph.D. thesis of Louis Bachelier for the French stock market, before Einstein [1]. In Ref. [2] we studied the so-called Heston model, where the stock-price dynamics is governed by multiplicative Brownian motion with a stochastic diffusion coefficient. We solved the corresponding Fokker-Planck equation exactly and found an analytic formula for the time-dependent probability distribution of stock price changes (returns). The formula interpolates between the exponential (tent-shaped) distribution for short time lags and the Gaussian (parabolic) distribution for long time lags. The theoretical formula agrees very well with the actual stock-market data, ranging from the Dow-Jones index [2] to individual companies [3], such as Microsoft, Intel, etc. [1] Louis Bachelier, "Théorie de la spéculation," Annales Scientifiques de l'École Normale Supérieure, III-17:21-86 (1900). [2] A. A. Dragulescu and V. M. Yakovenko, "Probability distribution of returns in the Heston model with stochastic volatility," Quantitative Finance 2, 443-453 (2002); Erratum 3, C15 (2003). [cond-mat/0203046] [3] A. C. Silva, R. E. Prange, and V. M. Yakovenko, "Exponential distribution of financial returns at mesoscopic time lags: a new stylized fact," Physica A 344, 227-235 (2004). [cond-mat/0401225]
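
    A hedged sketch of the model's ingredients (not the analytic solution of Ref. [2]): simulate log-returns under multiplicative Brownian motion with a mean-reverting stochastic variance using a full-truncation Euler scheme, with illustrative parameters and no leverage correlation:

        import numpy as np

        rng = np.random.default_rng(8)
        n_paths, n_steps, dt = 20_000, 250, 1.0 / 250.0
        kappa, theta, xi, v0 = 4.0, 0.04, 0.3, 0.04   # mean reversion, long-run variance, vol of vol, initial variance

        v = np.full(n_paths, v0)
        logret = np.zeros(n_paths)
        for _ in range(n_steps):
            z1 = rng.standard_normal(n_paths)          # drives the price
            z2 = rng.standard_normal(n_paths)          # drives the variance (uncorrelated for simplicity)
            vp = np.maximum(v, 0.0)                    # full-truncation scheme keeps the variance usable
            logret += -0.5 * vp * dt + np.sqrt(vp * dt) * z1
            v = v + kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2

        excess_kurtosis = np.mean((logret - logret.mean()) ** 4) / logret.var() ** 2 - 3.0
        print("std of 1-year log-returns:", logret.std())
        print("excess kurtosis          :", excess_kurtosis)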

  13. Evolving Molecular Cloud Structure and the Column Density Probability Distribution Function

    CERN Document Server

    Ward, Rachel L; Sills, Alison

    2014-01-01

    The structure of molecular clouds can be characterized with the probability distribution function (PDF) of the mass surface density. In particular, the properties of the distribution can reveal the nature of the turbulence and star formation present inside the molecular cloud. In this paper, we explore how these structural characteristics evolve with time and also how they relate to various cloud properties as measured from a sample of synthetic column density maps of molecular clouds. We find that, as a cloud evolves, the peak of its column density PDF will shift to surface densities below the observational threshold for detection, resulting in an underlying lognormal distribution which has been effectively lost at late times. Our results explain why certain observations of actively star-forming, dynamically older clouds, such as the Orion molecular cloud, do not appear to have any evidence of a lognormal distribution in their column density PDFs. We also study the evolution of the slope and deviation point ...

  14. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    Science.gov (United States)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^{-ψ(k)}, where k ≡ S/ln[L/L_0], the large deviation function ψ(k) is found explicitly, and L_0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  15. STEREO/LET Observations of Solar Energetic Particle Pitch Angle Distributions

    Science.gov (United States)

    Leske, Richard; Cummings, Alan; Cohen, Christina; Mewaldt, Richard; Labrador, Allan; Stone, Edward; Wiedenbeck, Mark; Christian, Eric; von Rosenvinge, Tycho

    2015-04-01

    As solar energetic particles (SEPs) travel through interplanetary space, the shape of their pitch angle distributions is determined by magnetic focusing and scattering. Measurements of SEP anisotropies therefore probe interplanetary conditions far from the observer and can provide insight into particle transport. Bidirectional flows of SEPs are often seen within interplanetary coronal mass ejections (ICMEs), resulting from injection of particles at both footpoints of the CME or from mirroring of a unidirectional beam. Mirroring is clearly implicated in those cases that show a loss cone distribution, in which particles with large pitch angles are reflected but the magnetic field enhancement at the mirror point is too weak to turn around particles with the smallest pitch angles. The width of the loss cone indicates the magnetic field strength at the mirror point far from the spacecraft, while if timing differences are detectable between outgoing and mirrored particles, they may help constrain the location of the reflecting boundary. The Low Energy Telescopes (LETs) onboard both STEREO spacecraft measure energetic particle anisotropies for protons through iron at energies of about 2-12 MeV/nucleon. With these instruments we have observed loss cone distributions in several SEP events, as well as other interesting anisotropies, such as unusual oscillations in the widths of the pitch angle distributions on a timescale of several minutes during the 23 July 2012 SEP event and sunward-flowing particles when the spacecraft was magnetically connected to the back side of a distant shock well beyond 1 AU. We present the STEREO/LET anisotropy observations and discuss their implications for SEP transport. In particular, we find that the shapes of the pitch angle distributions generally vary with energy and particle species, possibly providing a signature of the rigidity dependence of the pitch angle diffusion coefficient.

  16. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  17. The Lognormal Probability Distribution Function of the Perseus Molecular Cloud: A Comparison of HI and Dust

    CERN Document Server

    Burkhart, Blakesley; Murray, Claire; Stanimirovic, Snezana

    2015-01-01

    The shape of the probability distribution function (PDF) of molecular clouds is an important ingredient for modern theories of star formation and turbulence. Recently, several studies have pointed out observational difficulties with constraining the low column density (i.e. Av <1) PDF using dust tracers. In order to constrain the shape and properties of the low column density probability distribution function, we investigate the PDF of multiphase atomic gas in the Perseus molecular cloud using opacity-corrected GALFA-HI data and compare the PDF shape and properties to the total gas PDF and the N(H2) PDF. We find that the shape of the PDF in the atomic medium of Perseus is well described by a lognormal distribution, and not by a power-law or bimodal distribution. The peak of the atomic gas PDF in and around Perseus lies at the HI-H2 transition column density for this cloud, past which the N(H2) PDF takes on a power-law form. We find that the PDF of the atomic gas is narrow and at column densities larger than...

  18. Criticality of the net-baryon number probability distribution at finite density

    Directory of Open Access Journals (Sweden)

    Kenji Morita

    2015-02-01

    Full Text Available We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T<1, the model exhibits the chiral crossover transition which belongs to the universality class of the O(4) spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, P(N). By considering ratios of P(N) to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to O(4) criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine O(4) criticality in the context of binomial and negative-binomial distributions for the net proton number.
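    The ratio diagnostic used above is straightforward to reproduce; this sketch (with synthetic Poissonian "event" data standing in for measured net-proton numbers) divides an observed distribution by a Skellam distribution with the same mean and variance, so that deviations from unity would signal non-Skellam, e.g. critical, behaviour:

```python
import numpy as np
from scipy.stats import skellam

def skellam_ratio(counts):
    """Ratio of an observed net-baryon (or net-proton) number distribution P(N)
    to a Skellam distribution with the same mean and variance."""
    values, freq = np.unique(counts, return_counts=True)
    p_obs = freq / freq.sum()
    mean, var = counts.mean(), counts.var()
    # Skellam(mu1, mu2): mean = mu1 - mu2, variance = mu1 + mu2
    mu1 = 0.5 * (var + mean)
    mu2 = 0.5 * (var - mean)
    p_skellam = skellam.pmf(values, mu1, mu2)
    return values, p_obs / p_skellam

# Synthetic stand-in for event-by-event net-proton counts (difference of Poissons,
# i.e. exactly Skellam, so the ratios should scatter around 1).
rng = np.random.default_rng(3)
counts = rng.poisson(8.0, 100_000) - rng.poisson(6.5, 100_000)
values, ratio = skellam_ratio(counts)
for n, r in list(zip(values, ratio))[:5]:
    print(f"N = {n:3d}   P(N)/Skellam = {r:.3f}")
```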

  19. Probability distribution functions of turbulence in seepage-affected alluvial channel

    Science.gov (United States)

    Sharma, Anurag; Kumar, Bimlesh

    2017-02-01

    The present experimental study is carried out on the probability distribution functions (PDFs) of turbulent flow characteristics within near-bed-surface and away-from-bed surfaces for both no-seepage and seepage flow. Laboratory experiments were conducted in a plane sand bed for no seepage (NS), 10% seepage (10%S) and 15% seepage (15%S) cases. The experimental calculation of the PDFs of turbulent parameters such as Reynolds shear stress, velocity fluctuations, and bursting events is compared with the theoretical expressions obtained from the Gram-Charlier (GC)-based exponential distribution. Experimental observations follow the computed PDF distributions for both no-seepage and seepage cases. The Jensen-Shannon divergence (JSD) method is used to measure the similarity between theoretical and experimental PDFs. The value of JSD for PDFs of velocity fluctuation lies between 0.0005 and 0.003, while the JSD value for PDFs of Reynolds shear stress varies between 0.001 and 0.006. Even with the application of seepage, the PDF distribution of bursting events, sweeps and ejections is well characterized by the exponential distribution of the GC series, except that a slight deflection of inward and outward interactions is observed, which may be due to weaker events. The value of JSD for outward and inward interactions ranges from 0.0013 to 0.032, while the JSD value for sweep and ejection events varies between 0.0001 and 0.0025. A theoretical expression for the PDF of turbulent intensity is developed in the present study, which agrees well with the experimental observations, with JSD between 0.007 and 0.015. The work presented is potentially applicable to the probability distribution of mobile-bed sediments in seepage-affected alluvial channels typically characterized by the various turbulent parameters. The purpose of PDF estimation from experimental data is that it provides a complete numerical description in the areas of turbulent flow, either at a single point or at a finite number of points.
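    A hedged sketch of the similarity measure used above: it bins an empirical sample, evaluates a reference model PDF on the same bins, and returns the Jensen-Shannon divergence (here against a Gaussian reference, purely as an assumed example rather than the GC-based distribution of the study):

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def jsd(sample, model_pdf, bins=60):
    """Jensen-Shannon divergence between an empirical histogram of a turbulence
    quantity (e.g. velocity fluctuations) and a model PDF evaluated on the same bins."""
    hist, edges = np.histogram(sample, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist / hist.sum()                       # discrete probabilities over bins
    q = model_pdf(centers)
    q = q / q.sum()
    return jensenshannon(p, q, base=2.0) ** 2   # jensenshannon() returns the square root

# Example: compare standardized fluctuations to a Gaussian reference model.
rng = np.random.default_rng(4)
u = rng.normal(0.0, 1.0, 50_000)
gauss = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
print("JSD:", jsd(u, gauss))   # values near zero indicate very similar PDFs
```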

  20. Label Ranking with Abstention: Predicting Partial Orders by Thresholding Probability Distributions (Extended Abstract)

    CERN Document Server

    Cheng, Weiwei

    2011-01-01

    We consider an extension of the setting of label ranking, in which the learner is allowed to make predictions in the form of partial instead of total orders. Predictions of that kind are interpreted as a partial abstention: If the learner is not sufficiently certain regarding the relative order of two alternatives, it may abstain from this decision and instead declare these alternatives as being incomparable. We propose a new method for learning to predict partial orders that improves on an existing approach, both theoretically and empirically. Our method is based on the idea of thresholding the probabilities of pairwise preferences between labels as induced by a predicted (parameterized) probability distribution on the set of all rankings.
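    The thresholding idea can be sketched directly: given a matrix of pairwise preference probabilities (invented numbers here; in the paper they would be induced from a predicted, parameterized probability distribution over rankings), a pair is asserted only when its probability exceeds a confidence threshold, and the learner abstains otherwise:

```python
import numpy as np

def predict_partial_order(pairwise_prob, threshold=0.7):
    """Predict a partial order over labels: assert 'i precedes j' only if the
    pairwise preference probability P(i > j) exceeds the threshold; otherwise
    abstain and leave the pair incomparable."""
    n = pairwise_prob.shape[0]
    asserted = []
    for i in range(n):
        for j in range(n):
            if i != j and pairwise_prob[i, j] >= threshold:
                asserted.append((i, j))
    return asserted

# Hypothetical pairwise preference probabilities for three labels
# (entry [i, j] is the probability that label i is preferred to label j).
P = np.array([[0.00, 0.90, 0.55],
              [0.10, 0.00, 0.45],
              [0.45, 0.55, 0.00]])
print(predict_partial_order(P))   # only the confident pair (0, 1) is asserted
```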

  1. Evolution Equation for a Joint Tomographic Probability Distribution of Spin-1 Particles

    Science.gov (United States)

    Korennoy, Ya. A.; Man'ko, V. I.

    2016-11-01

    The nine-component positive vector optical tomographic probability portrait of the quantum state of spin-1 particles, containing full spatial and spin information about the state without redundancy, is constructed. The suggested approach is also extended to the symplectic tomography representation and to representations with quasidistributions such as the Wigner function, the Husimi Q-function, and the Glauber-Sudarshan P-function. The evolution equations for the constructed vector optical and symplectic tomograms and vector quasidistributions for an arbitrary Hamiltonian are found. The evolution equations are also obtained for the special case of a charged spin-1 particle in an arbitrary electromagnetic field; they are analogs of the non-relativistic Proca equation in the appropriate representations. The generalization of the proposed approach to the case of arbitrary spin is discussed. The possibility of formulating the quantum mechanics of systems with spin in terms of joint probability distributions, without the use of wave functions or density matrices, is explicitly demonstrated.

  2. The probability distribution functions of emission line flux measurements and their ratios

    CERN Document Server

    Wesson, R; Scicluna, P

    2016-01-01

    Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking thi...
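    The asymmetry described above is easy to reproduce by Monte Carlo; in the sketch below the two "line fluxes" and their uncertainties are invented values, chosen only so that the denominator has the larger relative error:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two measured fluxes with normally distributed uncertainties; the denominator
# (the weaker line) carries the larger relative uncertainty.
A = rng.normal(3.0, 0.15, 1_000_000)    # stronger line
B = rng.normal(1.0, 0.25, 1_000_000)    # weaker line

def mode(x, bins=400):
    """Crude modal value from a histogram."""
    hist, edges = np.histogram(x, bins=bins)
    i = np.argmax(hist)
    return 0.5 * (edges[i] + edges[i + 1])

r_ab = A / B
r_ba = B / A
print("mode of A/B:         ", round(mode(r_ab[np.abs(r_ab) < 20]), 3))
print("1 / mode of B/A:     ", round(1.0 / mode(r_ba), 3))
print("Gaussian expectation: ", 3.0)
# The two estimates disagree and are offset from 3.0 because the ratio PDF is skewed.
```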

  3. Finite de Finetti theorem for conditional probability distributions describing physical theories

    Science.gov (United States)

    Christandl, Matthias; Toner, Ben

    2009-04-01

    We work in a general framework where the state of a physical system is defined by its behavior under measurement and the global state is constrained by no-signaling conditions. We show that the marginals of symmetric states in such theories can be approximated by convex combinations of independent and identical conditional probability distributions, generalizing the classical finite de Finetti theorem of Diaconis and Freedman. Our results apply to correlations obtained from quantum states even when there is no bound on the local dimension, so that known quantum de Finetti theorems cannot be used.

  4. Comparison of Lauritzen-Spiegelhalter and successive restrictions algorithms for computing probability distributions in Bayesian networks

    Science.gov (United States)

    Smail, Linda

    2016-06-01

    The basic task of any probabilistic inference system in Bayesian networks is computing the posterior probability distribution for a subset or subsets of random variables, given values or evidence for some other variables from the same Bayesian network. Many methods and algorithms have been developed for exact and approximate inference in Bayesian networks. This work compares two exact inference methods in Bayesian networks, the Lauritzen-Spiegelhalter algorithm and the successive restrictions algorithm, from the perspective of computational efficiency. The two methods were applied for comparison to the Chest Clinic Bayesian Network. Results indicate that the successive restrictions algorithm shows more computational efficiency than the Lauritzen-Spiegelhalter algorithm.

  5. Spectra and probability distributions of thermal flux in turbulent Rayleigh-Bénard convection

    CERN Document Server

    Pharasi, Hirdesh K; Kumar, Krishna; Bhattacharjee, Jayanta K

    2016-01-01

    The spectra of turbulent heat flux $\mathrm{H}(k)$ in Rayleigh-Bénard convection with and without uniform rotation are presented. The spectrum $\mathrm{H}(k)$ scales with wave number $k$ as $\sim k^{-2}$. The scaling exponent is almost independent of the Taylor number $\mathrm{Ta}$ and Prandtl number $\mathrm{Pr}$ for higher values of the reduced Rayleigh number $r$ ($>10^3$). The exponent, however, depends on $\mathrm{Ta}$ and $\mathrm{Pr}$ for smaller values of $r$ ($<10^3$). The probability distribution functions of the local heat fluxes are non-Gaussian and have exponential tails.

  6. Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case

    Energy Technology Data Exchange (ETDEWEB)

    Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas

    2004-08-01

    The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed by using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it does not accurately represent some wind regimes, such as that of La Ventosa, Mexico. The analysis of wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF. Otherwise, the capacity factor will be underestimated. (author)
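    For readers who want to experiment with the bimodal model class discussed above, the sketch below evaluates a two-component Weibull mixture PDF and its third moment (to which the mean wind power density is proportional); the weights and Weibull parameters are purely illustrative, not the fitted La Ventosa values:

```python
import numpy as np
from scipy.stats import weibull_min

def bimodal_weibull_pdf(v, w, k1, c1, k2, c2):
    """Two-component Weibull mixture PDF for wind speed v: w is the weight of the
    first regime; (k, c) are the shape and scale parameters of each component."""
    return (w * weibull_min.pdf(v, k1, scale=c1)
            + (1.0 - w) * weibull_min.pdf(v, k2, scale=c2))

# Illustrative parameters only: a calm/low-wind regime plus a strong-wind regime.
v = np.linspace(0.0, 30.0, 301)
pdf = bimodal_weibull_pdf(v, w=0.45, k1=2.2, c1=4.0, k2=3.0, c2=13.0)

# The capacity factor depends on the third moment of the speed PDF, which is why a
# unimodal Weibull fit to a bimodal regime can bias the estimate.
third_moment = np.trapz(v**3 * pdf, v)
print("E[v^3] under the bimodal fit:", round(float(third_moment), 1))
```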

  7. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Science.gov (United States)

    Cherniz, Analía S.; Bonell, Claudia E.; Tabernig, Carolina B.

    2007-11-01

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In subjects with lesions of the upper motor neuron, the lack of central control affects muscular tone, force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best fits the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.
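    A minimal way to compare the two candidate models is a log-likelihood contest on a data segment; the sketch below uses synthetic Laplacian-distributed samples as a stand-in for a real sEMG segment, so the outcome is by construction favourable to the Laplacian model:

```python
import numpy as np
from scipy.stats import laplace, norm

def compare_fits(semg):
    """Log-likelihood comparison of zero-mean Gaussian and Laplacian models
    for a segment of surface EMG amplitudes."""
    sigma = semg.std()                 # Gaussian maximum-likelihood scale
    b = np.mean(np.abs(semg))          # Laplacian maximum-likelihood scale (zero mean)
    ll_gauss = norm.logpdf(semg, loc=0.0, scale=sigma).sum()
    ll_laplace = laplace.logpdf(semg, loc=0.0, scale=b).sum()
    return ll_gauss, ll_laplace

# Synthetic stand-in for a gait-cycle sEMG segment (real recordings would be used instead).
rng = np.random.default_rng(6)
segment = rng.laplace(0.0, 0.1, 5_000)
ll_g, ll_l = compare_fits(segment)
print("Gaussian  log-likelihood:", round(ll_g, 1))
print("Laplacian log-likelihood:", round(ll_l, 1))   # higher value = better fit
```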

  8. Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.

    2013-04-01

    Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions is then used to fit the long-term statistical distribution at different locations along the raceways. The long-term distribution of the bearing raceway loads is estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.

  9. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Energy Technology Data Exchange (ETDEWEB)

    Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B [Laboratorio de Ingenieria de Rehabilitacion e Investigaciones Neuromusculares y Sensoriales, Facultad de Ingenieria, UNER, Oro Verde (Argentina)]

    2007-11-15

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process with zero mean. It has been experimentally shown that this probability distribution can be fitted with less error by a Laplacian-type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In subjects with lesions of the upper motor neuron, the lack of central control affects muscular tone, force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones, in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best fits the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.

  10. Computational analysis of the spatial distribution of mitotic spindle angles in mouse developing airway

    Science.gov (United States)

    Tang, Nan; Marshall, Wallace F.

    2013-02-01

    Investigating the spatial organization of cellular processes in tissues during mouse embryo development is one of the major technical challenges in developmental biology. Many imaging methods are still limited in the volume of tissue they can cover, owing to tissue opacity, light scattering and the availability of advanced imaging tools. To analyze the mitotic spindle angle distribution in the developing mouse airway epithelium, we determined spindle angles in mitotic epithelial cells on serial sections of the whole airway of mouse embryonic lungs. We then developed a computational image analysis to obtain the spindle angle distribution in a three-dimensional airway reconstructed from all serial sections. From this study, we were able to understand how mitotic spindle angles are distributed in a whole airway tube. This analysis provides a potentially fast, simple and inexpensive alternative method to quantitatively analyze cellular processes at subcellular resolution. Furthermore, the analysis is not limited by tissue size, which allows three-dimensional, high-resolution information on cellular processes to be obtained from cell populations deep inside intact organs.

  11. Atmospheric gamma ray angle and energy distributions from 2 to 25 MeV

    Science.gov (United States)

    Ryan, J. M.; Moon, S. H.; Wilson, R. B.; Zych, A. D.; White, R. S.; Dayton, B.

    1977-01-01

    Results are given for gamma ray fluxes in six energy intervals from 2-25 MeV and five zenith angle intervals from 0-50 deg (downward moving) and five from 130-180 deg (upward moving). Observations were obtained with the University of California, Riverside double Compton scatter gamma ray telescope flown on a balloon to a 3.0 g/sq cm residual atmosphere at a geomagnetic cutoff of 4.5 GV. It was found that the angular distribution of downward moving gamma rays is relatively flat, increasing slowly from 10 to 40 deg. The angular distribution of the upward moving gamma rays at 4.2 g/sq cm increases with angle from the vertical. Energy distributions of upward and downward moving gamma rays are in good agreement with the results of previous studies.

  12. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    WANG Qingyuan (王清远); N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. the pit nucleation and growth, pit-crack transition, short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.

  13. A Voting Based Approach to Detect Recursive Order Number of Photocopy Documents Using Probability Distributions

    Directory of Open Access Journals (Sweden)

    Rani K

    2014-08-01

    Full Text Available Photocopied documents are very common in everyday life. People are permitted to carry and present photocopied documents to avoid damage to the original documents. But this provision is misused for temporary benefits by fabricating fake photocopied documents. Fabrication of a fake photocopied document is possible only at the 2nd and higher recursive orders of photocopying. Whenever a photocopied document is submitted, it may be required to check its originality. When the document is a 1st-order photocopy, the chance of fabrication may be ignored. On the other hand, when the photocopy order is 2nd or above, the probability of fabrication may be suspected. Hence, when a photocopy document is presented, the recursive order number of the photocopy has to be estimated to ascertain its originality. This requirement demands methods to estimate the order number of a photocopy. In this work, a voting-based approach to detect the recursive order number of a photocopy document using exponential, extreme value and lognormal probability distributions is proposed. A detailed experimentation is performed on a generated data set and the method exhibits an efficiency close to 89%.

  14. Wave Packet Dynamics in the Infinite Square Well with the Wigner Quasi-probability Distribution

    Science.gov (United States)

    Belloni, Mario; Doncheski, Michael; Robinett, Richard

    2004-05-01

    Over the past few years a number of authors have been interested in the time evolution and revivals of Gaussian wave packets in one-dimensional infinite wells and in two-dimensional infinite wells of various geometries. In all of these circumstances, the wave function is guaranteed to revive at a time related to the inverse of the system's ground state energy, if not sooner. To better visualize these revivals we have calculated the time-dependent Wigner quasi-probability distribution for position and momentum, P_W(x; p), for Gaussian wave packet solutions of this system. The Wigner quasi-probability distribution clearly demonstrates the short-term semi-classical time dependence, as well as longer-term revival behavior and the structure during the collapsed state. This tool also provides an excellent way of demonstrating the patterns of highly-correlated Schrödinger-cat-like `mini-packets' which appear at fractional multiples of the exact revival time. This research is supported in part by a Research Corporation Cottrell College Science Award (CC5470) and the National Science Foundation under contracts DUE-0126439 and DUE-9950702.

  15. Density Estimation for Protein Conformation Angles Using a Bivariate von Mises Distribution and Bayesian Nonparametrics.

    Science.gov (United States)

    Lennox, Kristin P; Dahl, David B; Vannucci, Marina; Tsai, Jerry W

    2009-06-01

    Interest in predicting protein backbone conformational angles has prompted the development of modeling and inference procedures for bivariate angular distributions. We present a Bayesian approach to density estimation for bivariate angular data that uses a Dirichlet process mixture model and a bivariate von Mises distribution. We derive the necessary full conditional distributions to fit the model, as well as the details for sampling from the posterior predictive distribution. We show how our density estimation method makes it possible to improve current approaches for protein structure prediction by comparing the performance of the so-called "whole" and "half" position distributions. Current methods in the field are based on whole position distributions, as density estimation for the half positions requires techniques, such as ours, that can provide good estimates for small datasets. With our method we are able to demonstrate that half position data provides a better approximation for the distribution of conformational angles at a given sequence position, therefore providing increased efficiency and accuracy in structure prediction.

  16. Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data

    Directory of Open Access Journals (Sweden)

    Jayajit Das

    2015-07-01

    Full Text Available A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
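    The "easy" direction mentioned above (n ≤ m) can be checked numerically; the sketch below recovers the triangular PDF of X = Y1 + Y2 for two independent uniforms by discrete convolution, which is precisely the well-posed case that does not require the MaxEnt machinery:

```python
import numpy as np

# PDF of X = Y1 + Y2 for independent Y1, Y2 ~ Uniform[0, 1], via numerical convolution.
dy = 1e-3
y = np.arange(0.0, 1.0 + dy, dy)
p_y = np.ones_like(y)                  # uniform density on [0, 1]

p_x = np.convolve(p_y, p_y) * dy       # density of the sum, supported on [0, 2]
x = np.arange(p_x.size) * dy

print("peak location:", round(float(x[np.argmax(p_x)]), 3))   # ~1.0 (triangular PDF)
print("normalization:", round(float(np.trapz(p_x, x)), 3))    # ~1.0
# The MaxEnt machinery described above is needed in the opposite, under-determined
# direction (n > m), where many candidate Q(x) are consistent with the observed P(y).
```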

  17. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with the inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to both heavy tailed or narrow coincidence distribution. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.

  18. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
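    A common two-step recipe matching this description (spectral colouring followed by a marginal transform) can be sketched as follows; the power-law spectrum and gamma marginal are assumed examples, and no claim is made that this reproduces how the disclosed method handles the spectral distortion introduced by the final nonlinear mapping:

```python
import numpy as np
from scipy.stats import norm, gamma

rng = np.random.default_rng(7)
n = 256

# Step 1: white Gaussian noise, coloured in the Fourier domain with a chosen
# power spectral density (an isotropic power law here, purely as an example).
white = rng.normal(size=(n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.sqrt(kx**2 + ky**2)
k[0, 0] = 1.0 / n                              # avoid division by zero at k = 0
amplitude = k ** -1.5                          # sqrt of the desired PSD shape
colored = np.fft.ifft2(np.fft.fft2(white) * amplitude).real
colored = (colored - colored.mean()) / colored.std()

# Step 2: map the coloured Gaussian field through its CDF and then through the
# inverse CDF of the desired (non-Gaussian) marginal, here a gamma distribution.
u = norm.cdf(colored)
field = gamma.ppf(u, a=2.0, scale=1.0)

skew = float(((field - field.mean()) ** 3).mean() / field.std() ** 3)
print("skewness of the final field:", round(skew, 2))   # ~1.4 for gamma(a=2)
```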

  19. Collimator angle influence on dose distribution optimization for vertebral metastases using volumetric modulated arc therapy

    Energy Technology Data Exchange (ETDEWEB)

    Mancosu, Pietro; Cozzi, Luca; Fogliata, Antonella; Lattuada, Paola; Reggiori, Giacomo; Cantone, Marie Claire; Navarria, Pierina; Scorsetti, Marta [Department of Radiation Oncology, IRCCS Istituto Clinico Humanitas, Milano (Rozzano) 20089 (Italy); Medical Physics Unit, Oncology Institute of Southern Switzerland, Bellinzona 6504 (Switzerland); Department of Radiation Oncology, IRCCS Istituto Clinico Humanitas, Milano (Rozzano) 20089 (Italy); Department of Physics, Universita Degli Studi di Milano, Milano 20133 (Italy); Department of Radiation Oncology, IRCCS Istituto Clinico Humanitas, Milano (Rozzano) 20089 (Italy)

    2010-08-15

    Purpose: The cylindrical symmetry of vertebrae favors the use of volumetric modulated arc therapy in generating a dose "hole" at the center of the vertebra, limiting the dose to the spinal cord. The authors have evaluated whether collimator angle is a significant parameter for dose distribution optimization in vertebral metastases. Methods: Three patients with one to three vertebrae involved were considered. Twenty-one differently optimized plans (nine single-arc and twelve double-arc plans) were performed, testing various collimator angle positions. The clinical target volume (CTV) was defined as the whole vertebra, excluding the spinal cord canal. The planning target volume (PTV) was defined as CTV+5 mm. Dose prescription was 5x4 Gy with normalization to the PTV mean dose. The dose to 1 cm³ of spinal cord was limited to 11.5 Gy. Results: The best plans in terms of target coverage and spinal cord sparing were achieved with two arcs and collimator angles of 80 deg. for Arc1 and 280 deg. for Arc2 in all the cases considered (i.e., leaf travel parallel to the primary orientation of the spinal cord). If one arc is used, only 80 deg. reached the objectives. Conclusions: This study demonstrated the role of collimator rotation for vertebral metastasis irradiation, with leaf travel parallel to the primary orientation of the spinal cord being better than other solutions. Thus, an optimal choice of collimator angle increases the optimization freedom to shape a desired dose distribution.

  20. A laser speckle sensor to measure the distribution of static torsion angles of twisted targets

    DEFF Research Database (Denmark)

    Rose, B.; Imam, H.; Hanson, Steen Grüner

    1998-01-01

    A novel method for measuring the distribution of static torsion angles of twisted targets is presented. The method is based on Fourier transforming the scattered field in the direction perpendicular to the twist axis, while performing an imaging operation in the direction parallel to the axis. The Fourier transform serves to map the angular distribution of the scattered light field at the target into a linear displacement on a two-dimensional array image sensor placed in the Fourier plane. Measuring this displacement facilitates the determination of the angular displacement of the target. A cylindrical lens serves to image the closely spaced lateral positions of the target along the twist axis onto corresponding lines of the two-dimensional image sensor. Thus, every single line of the image sensor measures the torsion angle of the corresponding surface position along the twist axis of the target...

  1. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (>15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.

  2. Detection of two power-law tails in the probability distribution functions of massive GMCs

    CERN Document Server

    Schneider, N; Girichidis, P; Rayner, T; Motte, F; Andre, P; Russeil, D; Abergel, A; Anderson, L; Arzoumanian, D; Benedettini, M; Csengeri, T; Didelon, P; Francesco, J D; Griffin, M; Hill, T; Klessen, R S; Ossenkopf, V; Pezzuto, S; Rivera-Ingraham, A; Spinoglio, L; Tremblin, P; Zavagno, A

    2015-01-01

    We report the novel detection of complex high-column-density tails in the probability distribution functions (PDFs) of three high-mass star-forming regions (CepOB3, MonR2, NGC6334), obtained from dust emission observed with Herschel. The low column density range can be fit with a lognormal distribution. A first power-law tail starts above an extinction (Av) of ~6-14. It has a slope of alpha=1.3-2 for the rho~r^-alpha profile of an equivalent density distribution (spherical or cylindrical geometry), and is thus consistent with free-fall gravitational collapse. Above Av~40, 60, and 140, we detect an excess that can be fitted by a flatter power-law tail with alpha>2. It correlates with the central regions of the cloud (ridges/hubs) of size ~1 pc and densities above 10^4 cm^-3. This excess may be caused by physical processes that slow down collapse and reduce the flow of mass towards higher densities. Possible causes are: 1. rotation, which introduces an angular momentum barrier, 2. increasing optical depth and weaker...

  3. Pitch angle distributions of energetic ions in the lobes of the distant geomagnetic tail

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C.J.; Cowley, S.W.H.; Richardson, I.G.; Balogh, A. (Imperial Coll. of Science and Technology, London (UK). Blackett Lab.)

    1990-07-01

    Analysis of energetic (> 35 keV) ion data from the ISEE-3 spacecraft obtained during 1982-1983, when the spacecraft made a series of traversals of the distant geomagnetic tail (X_GSE > -230 R_E), indicates that the pitch angle distribution of energetic ions in the distant tail lobes is usually highly anisotropic, being peaked closely perpendicular to the magnetic field direction, but with a small net flow in the antisunward direction. In this paper we present a model, based on the motion of single particles into and within the tail lobes, which accounts for these observed distributions. This model assumes that the lobe ions originate in the magnetosheath, where the energetic ion population consists of two components: a spatially uniform "solar" population, and a population of "terrestrial" origin, which decreases in strength with downtail distance. The pitch angle distribution at any point within the lobe may be constructed, assuming that the value of the distribution function along the particle trajectory is conserved. In general, those ions with a large field-aligned component to their motion enter the lobes in the deep tail, where the "terrestrial" source is weak, whilst those moving closely perpendicular to the field enter the lobes at positions much closer to the Earth, where the source is strong. The fluxes of these latter ions are therefore much enhanced above the rest of the pitch angle distribution, and are shown to account for the form of the observed distributions. The model also accounts for the more isotropic ion population observed in the lobe during solar particle events, when the "terrestrial" component of the magnetosheath source may be considered negligible in comparison to the enhanced "solar" component. (author).

  4. Thickness distribution of multi-stage incremental forming with different forming stages and angle intervals

    Institute of Scientific and Technical Information of China (English)

    李军超; 杨芬芬; 周志强

    2015-01-01

    Although multi-stage incremental sheet forming has long been adopted instead of single-stage forming to form parts with a steep wall angle or to achieve a high forming performance, it still depends largely on empirical designs. In order to investigate multi-stage forming further, the effect of the number of forming stages (n) and of the angle interval between two adjacent stages (Δα) on the thickness distribution was studied. Firstly, a finite element method (FEM) model of multi-stage incremental forming was established and experimentally verified. Then, based on the proposed simulation model, different strategies were adopted to form a frustum of a cone with a wall angle of 30° in order to examine the thickness distribution of multi-pass forming. It is shown that the minimum thickness increases considerably and the variance of the sheet thickness decreases significantly as the value of n grows. Further, with the increase of Δα, the minimum thickness first increases and then decreases, and the optimal thickness distribution is achieved with Δα of 10°. Additionally, a formula is deduced to estimate the sheet thickness after multi-stage forming and is shown to be effective, and the simulation results agree well with the experimental results.

  5. Spatial probability distribution of future volcanic eruptions at El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-05-01

    The 2011 submarine eruption that took place in the proximity of El Hierro Island (Canary Islands, Spain) has raised the need to identify the most likely future emission zones even on volcanoes characterized by low frequency activity. Here, we propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the probabilistic analysis of volcano-structural data of the Island collected through new fieldwork measurements, bathymetric information, as well as analysis of geological maps, orthophotos and aerial photographs. These data have been divided into different datasets and converted into separate and weighted probability density functions, which were included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The most likely area to host new eruptions in El Hierro is in the south-western part of the West rift. High probability locations are also found in the Northeast and South rifts, and along the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency measures and civil defense actions.

  6. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. An improved algorithm for the cloud probability distribution density, based on a backward cloud generator, was then proposed. This was used to effectively convert certain parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such a qualitative description was expressed through the cloud numerical characteristics {Ex, En, He}, which can represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible; with it, the changing regularity of the piezometric tube's water level can be revealed and seepage damage in the dam body can be detected.

  7. Analysis of Low Probability of Intercept (LPI) Radar Signals Using the Wigner Distribution

    Science.gov (United States)

    Gau, Jen-Yu

    2002-09-01

    The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify by using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code, P4 code, COSTAS frequency hopping and Phase Shift Keying/Frequency Shift Keying (PSK/FSK) signals. Binary Phase Shift Keying (BPSK) signals, although not used in modern LPI radars, are also examined to further illustrate the principal characteristics of the WD.
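    A basic discrete Wigner distribution (not the thesis' full processing chain) already shows why the time-frequency plane is useful for LPI waveforms; in the sketch below a linear-FM chirp, the building block of FMCW, produces a ridge whose frequency bin grows linearly with time:

```python
import numpy as np

def wigner_distribution(x):
    """Discrete Wigner distribution of a complex (analytic) signal x of length N:
    for each time index n, the FFT over the lag k of x[n+k] * conj(x[n-k])."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)
        k = np.arange(-kmax, kmax + 1)
        acf = np.zeros(N, dtype=complex)
        acf[k % N] = x[n + k] * np.conj(x[n - k])
        W[n] = np.fft.fft(acf).real
    return W

# Example: a linear-FM (chirp) pulse.  Its Wigner distribution concentrates along a
# straight line in the time-frequency plane, which is what makes the instantaneous
# frequency of such waveforms readable despite their noise-like periodogram.
N = 256
t = np.arange(N) / N
chirp = np.exp(2j * np.pi * (10.0 * t + 40.0 * t ** 2))
W = wigner_distribution(chirp)
ridge = W.argmax(axis=1)[N // 4: 3 * N // 4]       # frequency bin of the ridge vs time
print("ridge frequency bins (should increase roughly linearly):", ridge[::16])
```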

  8. Probability distribution function and multiscaling properties in the Korean stock market

    Science.gov (United States)

    Lee, Kyoung Eun; Lee, Jae Woo

    2007-09-01

    We consider the probability distribution function (pdf) and the multiscaling properties of the index and the traded volume in the Korean stock market. We observed a power-law form of the pdf in the fat-tail region for the return, the volatility, the traded volume, and changes of the traded volume. We also investigate multifractality in the Korean stock market, which we assess by multifractal detrended fluctuation analysis (MFDFA). We observed multiscaling behaviour for the index, return, traded volume, and changes of the traded volume. We apply the MFDFA method to randomly shuffled time series to observe the effects of the autocorrelations. The multifractality originates strongly from the long-time correlations of the time series.

  9. GENERALIZED FATIGUE CONSTANT LIFE CURVE AND TWO-DIMENSIONAL PROBABILITY DISTRIBUTION OF FATIGUE LIMIT

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 武哲; 高镇同

    2002-01-01

    According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed. Then, on the basis of the correlation-coefficient optimization method, the parameter estimation formulas were derived and the generalized fatigue constant life curve with reliability level p was given. From the P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived. Then, three sets of tests on LY11 CZ, corresponding to different mean stresses, were carried out using the two-dimensional up-down method. Finally, the methods are used to analyze the test results, and it is found that results of high precision can be obtained.

  10. On the reliability of observational measurements of column density probability distribution functions

    CERN Document Server

    Ossenkopf, Volker; Schneider, Nicola; Federrath, Christoph; Klessen, Ralf S

    2016-01-01

    Probability distribution functions (PDFs) of column densities are an established tool to characterize the evolutionary state of interstellar clouds. Using simulations, we show to what degree their determination is affected by noise, line-of-sight contamination, field selection, and the incomplete sampling in interferometric measurements. We solve the integrals that describe the convolution of a cloud PDF with contaminating sources and study the impact of missing information on the measured column density PDF. The effect of observational noise can be easily estimated and corrected for if the root mean square (rms) of the noise is known. For $\sigma_{noise}$ values below 40% of the typical cloud column density, $N_{peak}$, this involves almost no degradation of the accuracy of the PDF parameters. For higher noise levels and narrow cloud PDFs the width of the PDF becomes increasingly uncertain. A contamination by turbulent foreground or background clouds can be removed as a constant shield if the PDF of the c...

  11. Mean square delay dependent-probability-distribution stability analysis of neutral type stochastic neural networks.

    Science.gov (United States)

    Muralisankar, S; Manivannan, A; Balasubramaniam, P

    2015-09-01

    The aim of this manuscript is to investigate the mean square delay dependent-probability-distribution stability analysis of neutral type stochastic neural networks with time-delays. The time-delays are assumed to be interval time-varying and randomly occurring. Based on the new Lyapunov-Krasovskii functional and stochastic analysis approach, a novel sufficient condition is obtained in the form of linear matrix inequality such that the delayed stochastic neural networks are globally robustly asymptotically stable in the mean-square sense for all admissible uncertainties. Finally, the derived theoretical results are validated through numerical examples in which maximum allowable upper bounds are calculated for different lower bounds of time-delay.

  12. The H I Probability Distribution Function and the Atomic-to-molecular Transition in Molecular Clouds

    Science.gov (United States)

    Imara, Nia; Burkhart, Blakesley

    2016-10-01

    We characterize the column-density probability distribution functions (PDFs) of the atomic hydrogen gas, H i, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic H i Survey to derive column-density maps and PDFs. We find that the peaks of the H i PDFs occur at column densities in the range ~1-2 × 10^21 cm^-2 (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of σ_HI ≈ 10^20 cm^-2 (~0.1 mag). We also investigate the H i-to-H2 transition toward the cloud complexes and estimate H i surface densities ranging from 7 to 16 M_⊙ pc^-2 at the transition. We propose that the H i PDF is a fitting tool for identifying the H i-to-H2 transition column in Galactic MCs.

  13. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    王清远; N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. the pit nucleation and growth, pit-crack transition, short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.

  14. Sampling the probability distribution of Type Ia Supernova lightcurve parameters in cosmological analysis

    Science.gov (United States)

    Dai, Mi; Wang, Yun

    2016-06-01

    In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdfs) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the 'joint lightcurve analysis (JLA)' data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter pdfs leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fitting values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.

  15. Probability Distributions of Random Electromagnetic Fields in the Presence of a Semi-Infinite Isotropic Medium

    CERN Document Server

    Arnaut, L R

    2006-01-01

    Using a TE/TM decomposition for an angular plane-wave spectrum of free random electromagnetic waves and matched boundary conditions, we derive the probability density function for the energy density of the vector electric field in the presence of a semi-infinite isotropic medium. The theoretical analysis is illustrated with calculations and results for good electric conductors and for a lossless dielectric half-space. The influence of the permittivity and conductivity on the intensity, random polarization, statistical distribution and standard deviation of the field is investigated, both for incident plus reflected fields and for refracted fields. External refraction is found to result in compression of the fluctuations of the random field.

  16. The HI Probability Distribution Function and the Atomic-to-Molecular Transition in Molecular Clouds

    CERN Document Server

    Imara, Nia

    2016-01-01

    We characterize the column density probability distribution functions (PDFs) of the atomic hydrogen gas, HI, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic HI Survey to derive column density maps and PDFs. We find that the peaks of the HI PDFs occur at column densities ranging from ~1-2$\times 10^{21}$ cm$^{-2}$ (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of $\sigma_{HI}\approx 10^{20}$ cm$^{-2}$ (~0.1 mag). We also investigate the HI-to-H$_2$ transition towards the cloud complexes and estimate HI surface densities ranging from 7-16 $M_\odot$ pc$^{-2}$ at the transition. We propose that the HI PDF is a fitting tool for identifying the HI-to-H$_2$ transition column in Galactic MCs.

  17. Lower Bound Bayesian Networks - An Efficient Inference of Lower Bounds on Probability Distributions in Bayesian Networks

    CERN Document Server

    Andrade, Daniel

    2012-01-01

    We present a new method to propagate lower bounds on conditional probability distributions in conventional Bayesian networks. Our method guarantees to provide outer approximations of the exact lower bounds. A key advantage is that we can use any available algorithms and tools for Bayesian networks in order to represent and infer lower bounds. This new method yields results that are provably exact for trees with binary variables, and results which are competitive with existing approximations in credal networks for all other network structures. Our method is not limited to a specific kind of network structure. Basically, it is also not restricted to a specific kind of inference, but we restrict our analysis to prognostic inference in this article. The computational complexity is superior to that of other existing approaches.

  18. Random numbers from the tails of probability distributions using the transformation method

    CERN Document Server

    Fulger, Daniel; Germano, Guido

    2009-01-01

    The speed of many one-line transformation methods for the production of, for example, Levy alpha-stable random numbers, which generalize Gaussian ones, and Mittag-Leffler random numbers, which generalize exponential ones, is very high and satisfactory for most purposes. However, for the class of decreasing probability densities fast rejection implementations like the Ziggurat by Marsaglia and Tsang promise a significant speed-up if it is possible to complement them with a method that samples the tails of the infinite support. This requires the fast generation of random numbers greater or smaller than a certain value. We present a method to achieve this, and also to generate random numbers within any arbitrary interval. We demonstrate the method showing the properties of the transform maps of the above mentioned distributions as examples of stable and geometric stable random numbers used for the stochastic solution of the space-time fractional diffusion equation.
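
    The core of the transformation idea can be sketched as follows for a simple exponential law (the paper's alpha-stable and Mittag-Leffler transforms would replace the inverse CDF): restrict the uniform variate to (F(a), 1), or to (F(a), F(b)) for an arbitrary interval, and apply the inverse CDF so every sample lands in the requested range.

      # Hedged sketch of tail sampling by the transformation method, shown for
      # an exponential distribution; function names are illustrative.
      import numpy as np

      def exponential_tail_samples(a, lam, size, rng=None):
          """Samples from an Exponential(lam) distribution conditioned on x > a."""
          rng = rng or np.random.default_rng()
          F_a = 1.0 - np.exp(-lam * a)          # CDF at the tail threshold
          u = rng.uniform(F_a, 1.0, size=size)  # uniform restricted to (F(a), 1)
          return -np.log(1.0 - u) / lam         # inverse CDF of the exponential

      def exponential_interval_samples(a, b, lam, size, rng=None):
          """Samples restricted to an arbitrary interval (a, b)."""
          rng = rng or np.random.default_rng()
          F = lambda x: 1.0 - np.exp(-lam * x)
          u = rng.uniform(F(a), F(b), size=size)
          return -np.log(1.0 - u) / lam

      x = exponential_tail_samples(a=5.0, lam=1.0, size=10_000)
      print(x.min(), x.mean())   # min > 5, mean ~ 6 by memorylessness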

  19. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient r package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities

  20. Size effect on strength and lifetime probability distributions of quasibrittle structures

    Indian Academy of Sciences (India)

    Zdeněk P Bažant; Jia-Liang Le

    2012-02-01

    Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for failure probability < $10^{-6}-10^{-7}$ per lifetime. The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation energy controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.
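
    For reference, the classical two-parameter Weibull baseline that the refined quasibrittle theory departs from can be written (in its standard textbook weakest-link form, not the paper's refined cdf) as:

      % Classical two-parameter Weibull (weakest-link) strength statistics,
      % the baseline that the refined quasibrittle theory departs from:
      \[
        P_f(\sigma) \;=\; 1 - \exp\!\left[-\,\frac{V}{V_0}
            \left(\frac{\sigma}{s_0}\right)^{m}\right],
        \qquad
        \bar{\sigma} \;\propto\; V^{-1/m},
      \]
      % i.e. for a perfectly brittle material the mean strength decreases as a
      % power law of structure volume V with exponent set by the Weibull
      % modulus m; quasibrittle structures deviate from this cdf, most strongly
      % in the left tail that governs the design failure probability.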

  1. The probability distribution for non-Gaussianity estimators constructed from the CMB trispectrum

    CERN Document Server

    Smith, Tristan L

    2012-01-01

    Considerable recent attention has focussed on the prospects to use the cosmic microwave background (CMB) trispectrum to probe the physics of the early universe. Here we evaluate the probability distribution function (PDF) for the standard estimator tau_nle for the amplitude tau_nl of the CMB trispectrum both for the null-hypothesis (i.e., for Gaussian maps with tau_nl = 0) and for maps with a non-vanishing trispectrum (|tau_nl|>0). We find these PDFs to be highly non-Gaussian in both cases. We also evaluate the variance with which the trispectrum amplitude can be measured as a function of its underlying value, tau_nl. We find a strong dependence of this variance on tau_nl. We also find that the variance does not, given the highly non-Gaussian nature of the PDF, effectively characterize the distribution. Detailed knowledge of these PDFs will therefore be imperative in order to properly interpret the implications of any given trispectrum measurement. For example, if a CMB experiment with a maximum multipole ...

  2. Probability distribution functions for ELM bursts in a series of JET tokamak discharges

    Energy Technology Data Exchange (ETDEWEB)

    Greenhough, J [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Chapman, S C [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Dendy, R O [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Ward, D J [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom)

    2003-05-01

    A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour.
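
    A schematic of the kind of comparison described, assuming synthetic inter-ELM waiting times rather than JET data: fit Gaussian and inverse-exponential models and quantify each fit with a Kolmogorov-Smirnov statistic.

      # Hedged sketch: fit candidate models (Gaussian, exponential) to synthetic
      # ELM time separations and quantify the fit with a KS statistic.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      dt = rng.exponential(scale=2e-3, size=400)   # synthetic ELM separations [s]

      # Fit candidate models.
      mu, sd = stats.norm.fit(dt)
      loc, scale = stats.expon.fit(dt, floc=0.0)

      # Quantify agreement with each model.
      ks_norm = stats.kstest(dt, "norm", args=(mu, sd))
      ks_expon = stats.kstest(dt, "expon", args=(loc, scale))

      print("Gaussian KS:", ks_norm.statistic, "exponential KS:", ks_expon.statistic)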

  3. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distribution) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which denies the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. So, it is considered that the new model can exactly reflect the Born perturbation theory. Simulated results prove the accuracy of this new model.

  4. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    Science.gov (United States)

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
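
    A small worked example of the conditional-probability question quoted above, under an assumed bivariate normal model with illustrative (not SOCR) parameter values: the conditional law of weight given height is again normal, with mean mu_W + rho*sigma_W*(h - mu_H)/sigma_H and standard deviation sigma_W*sqrt(1 - rho^2).

      # Hedged sketch: P(120 < weight < 140 | height = h) under a bivariate
      # normal model; all parameter values are assumed for illustration.
      from scipy import stats
      import numpy as np

      mu_H, sigma_H = 65.0, 3.5      # height: mean, sd (inches)  -- assumed
      mu_W, sigma_W = 125.0, 15.0    # weight: mean, sd (pounds)  -- assumed
      rho = 0.5                      # height-weight correlation  -- assumed

      def conditional_weight_prob(lo, hi, h):
          """Conditional probability of lo < weight < hi given height h."""
          mean = mu_W + rho * sigma_W * (h - mu_H) / sigma_H
          sd = sigma_W * np.sqrt(1.0 - rho ** 2)
          return stats.norm.cdf(hi, mean, sd) - stats.norm.cdf(lo, mean, sd)

      # "Given that they are of average height":
      print(conditional_weight_prob(120.0, 140.0, h=mu_H))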

  5. The Lognormal Probability Distribution Function of the Perseus Molecular Cloud: A Comparison of HI and Dust

    Science.gov (United States)

    Burkhart, Blakesley; Lee, Min-Young; Murray, Claire E.; Stanimirović, Snezana

    2015-10-01

    The shape of the probability distribution function (PDF) of molecular clouds is an important ingredient for modern theories of star formation and turbulence. Recently, several studies have pointed out observational difficulties with constraining the low column density (i.e., $A_V < 1$) PDF using dust tracers. In order to constrain the shape and properties of the low column density PDF, we investigate the PDF of multiphase atomic gas in the Perseus molecular cloud using opacity-corrected GALFA-HI data and compare the PDF shape and properties to the total gas PDF and the N(H2) PDF. We find that the shape of the PDF in the atomic medium of Perseus is well described by a lognormal distribution and not by a power-law or bimodal distribution. The peak of the atomic gas PDF in and around Perseus lies at the HI-H2 transition column density for this cloud, past which the N(H2) PDF takes on a power-law form. We find that the PDF of the atomic gas is narrow, and at column densities larger than the HI-H2 transition, the HI rapidly depletes, suggesting that the HI PDF may be used to find the HI-H2 transition column density. We also calculate the sonic Mach number of the atomic gas by using HI absorption line data, which yield a median value of Ms = 4.0 for the CNM, while the HI emission PDF, which traces both the WNM and CNM, has a width more consistent with transonic turbulence.

  6. Rapid flattening of butterfly pitch angle distributions of radiation belt electrons by whistler-mode chorus

    Science.gov (United States)

    Yang, Chang; Su, Zhenpeng; Xiao, Fuliang; Zheng, Huinan; Wang, Yuming; Wang, Shui; Spence, H. E.; Reeves, G. D.; Baker, D. N.; Blake, J. B.; Funsten, H. O.

    2016-08-01

    Van Allen radiation belt electrons exhibit complex dynamics during geomagnetically active periods. Investigation of electron pitch angle distributions (PADs) can provide important information on the dominant physical mechanisms controlling radiation belt behaviors. Here we report a storm time radiation belt event where energetic electron PADs changed from butterfly distributions to normal or flattop distributions within several hours. Van Allen Probes observations showed that the flattening of butterfly PADs was closely related to the occurrence of whistler-mode chorus waves. Two-dimensional quasi-linear STEERB simulations demonstrate that the observed chorus can resonantly accelerate the near-equatorially trapped electrons and rapidly flatten the corresponding electron butterfly PADs. These results provide a new insight on how chorus waves affect the dynamic evolution of radiation belt electrons.

  7. Effect of slope angle of an artificial pool on distributions of turbulence

    Institute of Scientific and Technical Information of China (English)

    Atefeh Fazlollahi; Hossein Afzalimehr; Jueyi Sui

    2015-01-01

    Experiments were carried out over a 2-dimensional pool with a constant length of 1.5 m and four different slopes. The distributions of velocity, Reynolds stress and turbulence intensities have been studied in this paper. Results show that as flow continues up the exit slope, the flow velocity increases near the channel bed and decreases near the water surface. The flow separation was not observed by ADV at the crest of the bed-form. In addition, the length of the separation zone increases with increasing entrance and exit slopes. The largest slope angle causes the maximum normalized shear stress. Based on the experiments, it is concluded that the shape of the Reynolds stress distribution is generally dependent on the entrance and exit slopes of the pool. Also, the shape of the Reynolds stress distribution is affected by both decelerating and accelerating flows. Additionally, with the increase in the slope angle, secondary currents are developed and become more stable. Results of the quadrant analysis show that the momentum between flow and bed-form is mostly transferred by sweep and ejection events.

  8. Universal Probability Distribution for the Wave Function of a Quantum System Entangled with its Environment

    Science.gov (United States)

    Goldstein, Sheldon; Lebowitz, Joel L.; Mastrodonato, Christian; Tumulka, Roderich; Zanghì, Nino

    2016-03-01

    A quantum system (with Hilbert space H_1) entangled with its environment (with Hilbert space H_2) is usually not attributed to a wave function but only to a reduced density matrix ρ_1. Nevertheless, there is a precise way of attributing to it a random wave function ψ_1, called its conditional wave function, whose probability distribution μ_1 depends on the entangled wave function ψ ∈ H_1 ⊗ H_2 in the Hilbert space of system and environment together. It also depends on a choice of orthonormal basis of H_2 but in relevant cases, as we show, not very much. We prove several universality (or typicality) results about μ_1, e.g., that if the environment is sufficiently large then for every orthonormal basis of H_2, most entangled states ψ with given reduced density matrix ρ_1 are such that μ_1 is close to one of the so-called GAP (Gaussian adjusted projected) measures, GAP(ρ_1). We also show that, for most entangled states ψ from a microcanonical subspace (spanned by the eigenvectors of the Hamiltonian with energies in a narrow interval [E, E + δE]) and most orthonormal bases of H_2, μ_1 is close to GAP(tr_2 ρ_mc) with ρ_mc the normalized projection to the microcanonical subspace. In particular, if the coupling between the system and the environment is weak, then μ_1 is close to GAP(ρ_β) with ρ_β the canonical density matrix on H_1 at inverse temperature β = β(E). This provides the mathematical justification of our claim in Goldstein et al. (J Stat Phys 125: 1193-1221, 2006) that GAP measures describe the thermal equilibrium distribution of the wave function.

  9. Blind 2-D Angles of Arrival Estimation for Distributed Signals Using L-Shaped Arrays

    Institute of Scientific and Technical Information of China (English)

    Yi Zheng; Xue-Gang Wang; Tie-Qi Xia; Qun Wan

    2008-01-01

    Most existing two-dimensional (2-D) angle-of-arrival (AOA) estimation methods are based on the assumption that the signal sources are point sources. However, in mobile communications, local scattering in the vicinity of the mobile results in angular spreading as seen from a base station antenna array. In this paper, we consider the problem of estimating the 2-D AOAs of spatially distributed sources. First we perform blind estimation of the steering vectors by exploiting joint diagonalization; then the 2-D AOAs are obtained through two fast Fourier transforms of the estimated steering vectors. Simulations are carried out to illustrate the performance of the method.

  10. ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning

    Science.gov (United States)

    Sadeh, I.; Abdalla, F. B.; Lahav, O.

    2016-10-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.

  11. Exact probability distributions of selected species in stochastic chemical reaction networks.

    Science.gov (United States)

    López-Caamal, Fernando; Marquez-Lago, Tatiana T

    2014-09-01

    Chemical reactions are discrete, stochastic events. As such, the species' molecular numbers can be described by an associated master equation. However, handling such an equation may become difficult due to the large size of reaction networks. A commonly used approach to forecast the behaviour of reaction networks is to perform computational simulations of such systems and analyse their outcome statistically. This approach, however, might require high computational costs to provide accurate results. In this paper we opt for an analytical approach to obtain the time-dependent solution of the Chemical Master Equation for selected species in a general reaction network. When the reaction networks are composed exclusively of zeroth and first-order reactions, this analytical approach significantly alleviates the computational burden required by simulation-based methods. By building upon these analytical solutions, we analyse a general monomolecular reaction network with an arbitrary number of species to obtain the exact marginal probability distribution for selected species. Additionally, we study two particular topologies of monomolecular reaction networks, namely (i) an unbranched chain of monomolecular reactions with and without synthesis and degradation reactions and (ii) a circular chain of monomolecular reactions. We illustrate our methodology and alternative ways to use it for non-linear systems by analysing a protein autoactivation mechanism. Later, we compare the computational load required for the implementation of our results and a pure computational approach to analyse an unbranched chain of monomolecular reactions. Finally, we study calcium ions gates in the sarco/endoplasmic reticulum mediated by ryanodine receptors.
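
    For the simplest monomolecular network, 0 -> X at rate k and X -> 0 at rate g per molecule, the exact CME marginal starting from zero molecules is Poisson with mean (k/g)(1 - exp(-g t)); this textbook special case (not the paper's general solution) can be checked against a Gillespie simulation, as in the sketch below.

      # Hedged sketch: Gillespie simulation of the birth-death network
      # 0 -> X (rate k), X -> 0 (rate g per molecule), compared with the exact
      # Poisson marginal of the Chemical Master Equation from n(0) = 0.
      import numpy as np

      def gillespie_birth_death(k, g, t_end, rng):
          t, n = 0.0, 0
          while True:
              rates = np.array([k, g * n])
              total = rates.sum()
              t += rng.exponential(1.0 / total)   # time to next reaction
              if t > t_end:
                  return n
              n += 1 if rng.random() < rates[0] / total else -1

      rng = np.random.default_rng(3)
      k, g, t_end = 10.0, 1.0, 2.0
      samples = np.array([gillespie_birth_death(k, g, t_end, rng) for _ in range(2000)])

      exact_mean = (k / g) * (1.0 - np.exp(-g * t_end))   # Poisson mean = variance
      print("simulated mean:", samples.mean(), "exact Poisson mean:", exact_mean)
      print("simulated var :", samples.var(),  "exact Poisson var :", exact_mean)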

  12. Turbulence-Induced Relative Velocity of Dust Particles III: The Probability Distribution

    CERN Document Server

    Pan, Liubin; Scalo, John

    2014-01-01

    Motivated by its important role in the collisional growth of dust particles in protoplanetary disks, we investigate the probability distribution function (PDF) of the relative velocity of inertial particles suspended in turbulent flows. Using the simulation from our previous work, we compute the relative velocity PDF as a function of the friction timescales, tau_p1 and tau_p2, of two particles of arbitrary sizes. The friction time of particles included in the simulation ranges from 0.1 tau_eta to 54T_L, with tau_eta and T_L the Kolmogorov time and the Lagrangian correlation time of the flow, respectively. The relative velocity PDF is generically non-Gaussian, exhibiting fat tails. For a fixed value of tau_p1, the PDF is the fattest for equal-size particles (tau_p2~tau_p1), and becomes thinner at both tau_p2 < tau_p1 and tau_p2 > tau_p1. Defining f as the friction time ratio of the smaller particle to the larger one, we find that, at a given f in 1/2>T_L). These features are successfully explained by the Pan & Padoan model. Usin...

  13. Probability Distribution Function of a Forced Passive Tracer in the Lower Stratosphere

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The probability distribution function (PDF) of a passive tracer, forced by a "mean gradient", is studied. First, we take two theoretical approaches, the Lagrangian and the conditional closure formalisms, to study the PDFs of such an externally forced passive tracer. Then, we carry out numerical simulations for an idealized random flow on a sphere and for European Center for Medium-Range Weather Forecasts (ECMWF) stratospheric winds to test whether the mean-gradient model can be applied to studying stratospheric tracer mixing in midlatitude surf zones, in which a weak and poleward zonal-mean gradient is maintained by tracer leakage through polar and tropical mixing barriers, and whether the PDFs of tracer fluctuations in midlatitudes are consistent with the theoretical predictions. The numerical simulations show that when diffusive dissipation is balanced by the mean-gradient forcing, the PDF in the random flow and the Southern-Hemisphere PDFs in ECMWF winds show time-invariant exponential tails, consistent with theoretical predictions. In the Northern Hemisphere, the PDFs exhibit non-Gaussian tails. However, the PDF tails are not consistent with theoretical expectations. The long-term behavior of the PDF tails of the forced tracer is compared to that of a decaying tracer. It is found that the PDF tails of the decaying tracer are time-dependent, and evolve toward flatter than exponential.

  14. Understanding star formation in molecular clouds I. A universal probability distribution of column densities ?

    CERN Document Server

    Schneider, N; Csengeri, T; Klessen, R; Federrath, C; Tremblin, P; Girichidis, P; Bontemps, S; Andre, Ph

    2014-01-01

    Column density maps of molecular clouds are one of the most important observables in the context of molecular cloud- and star-formation (SF) studies. With Herschel it is now possible to reveal rather precisely the column density of dust, which is the best tracer of the bulk of material in molecular clouds. However, line-of-sight (LOS) contamination from fore- or background clouds can lead to an overestimation of the dust emission of molecular clouds, in particular for distant clouds. This implies too high values for column density and mass, and a misleading interpretation of probability distribution functions (PDFs) of the column density. In this paper, we demonstrate by using observations and simulations how LOS contamination affects the PDF. We apply a first-order approximation (removing a constant level) to the molecular clouds of Auriga and Maddalena (low-mass star-forming), and Carina and NGC3603(both high-mass SF regions). In perfect agreement with the simulations, we find that the PDFs become broader, ...

  15. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    Science.gov (United States)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.

  16. Large field distributed aperture laser semiactive angle measurement system design with imaging fiber bundles.

    Science.gov (United States)

    Xu, Chunyun; Cheng, Haobo; Feng, Yunpeng; Jing, Xiaoli

    2016-09-01

    A type of laser semiactive angle measurement system is designed for target detecting and tracking. Only one detector is used to detect target location from four distributed aperture optical systems through a 4×1 imaging fiber bundle. A telecentric optical system in image space is designed to increase the efficiency of imaging fiber bundles. According to the working principle of a four-quadrant (4Q) detector, fiber diamond alignment is adopted between an optical system and a 4Q detector. The structure of the laser semiactive angle measurement system is, we believe, novel. Tolerance analysis is carried out to determine tolerance limits of manufacture and installation errors of the optical system. The performance of the proposed method is identified by computer simulations and experiments. It is demonstrated that the linear region of the system is ±12°, with measurement error of better than 0.2°. In general, this new system can be used with large field of view and high accuracy, providing an efficient, stable, and fast method for angle measurement in practical situations.

  17. Differential Evolution with Adaptive Mutation and Parameter Control Using Lévy Probability Distribution

    Institute of Scientific and Technical Information of China (English)

    Ren-Jie He; Zhen-Yu Yang

    2012-01-01

    Differential evolution (DE) has become a very popular and effective global optimization algorithm in the area of evolutionary computation. In spite of many advantages such as conceptual simplicity, high efficiency and ease of use, DE has two main components, i.e., mutation scheme and parameter control, which significantly influence its performance. In this paper we intend to improve the performance of DE by using carefully considered strategies for both of the two components. We first design an adaptive mutation scheme, which adaptively makes use of the bias of superior individuals when generating new solutions. Although introducing such a bias is not a new idea, existing methods often use heuristic rules to control the bias. They can hardly maintain the appropriate balance between exploration and exploitation during the search process, because the preferred bias is often problem and evolution-stage dependent. Instead of using any fixed rule, a novel strategy is adopted in the new adaptive mutation scheme to adjust the bias dynamically based on the identified local fitness landscape captured by the current population. As for the other component, i.e., parameter control, we propose a mechanism by using the Lévy probability distribution to adaptively control the scale factor F of DE. For every mutation in each generation, an Fi is produced from one of four different Lévy distributions according to their historical performance. With the adaptive mutation scheme and parameter control using the Lévy distribution as the main components, we present a new DE variant called Lévy DE (LDE). Experimental studies were carried out on a broad range of benchmark functions in global numerical optimization. The results show that LDE is very competitive, and both of the two main components have contributed to its overall performance. The scalability of LDE is also discussed by conducting experiments on some selected benchmark functions with dimensions from 30 to 200.
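
    A sketch of the core idea only (the full LDE adaptation among four Lévy distributions based on historical performance is omitted): a DE/rand/1 mutation whose scale factor F is drawn from a Lévy alpha-stable distribution rather than being fixed. Parameter values are illustrative.

      # Hedged sketch: DE/rand/1 mutation with a Levy-stable scale factor.
      import numpy as np
      from scipy.stats import levy_stable

      def de_rand_1_levy_mutation(population, alpha=1.4, rng=None):
          """Return mutants v_i = x_r1 + F_i * (x_r2 - x_r3), with F_i ~ stable."""
          rng = rng or np.random.default_rng()
          n, dim = population.shape
          mutants = np.empty_like(population)
          for i in range(n):
              r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
              F_i = abs(levy_stable.rvs(alpha, beta=0.0, random_state=rng))
              F_i = min(F_i, 2.0)   # clip occasional huge stable draws
              mutants[i] = population[r1] + F_i * (population[r2] - population[r3])
          return mutants

      pop = np.random.default_rng(4).uniform(-5, 5, size=(20, 10))
      print(de_rand_1_levy_mutation(pop).shape)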

  18. Optimized chord and twist angle distributions of wind turbine blade considering Reynolds number effects

    Energy Technology Data Exchange (ETDEWEB)

    Wang, L.; Tang, X. [Univ. of Central Lancashire. Engineering and Physical Sciences, Preston (United Kingdom); Liu, X. [Univ. of Cumbria. Sustainable Engineering, Workington (United Kingdom)

    2012-07-01

    The aerodynamic performance of a wind turbine depends very much on its blade geometric design, typically based on the blade element momentum (BEM) theory, which divides the blade into several blade elements. In current blade design practices based on Schmitz rotor design theory, the blade geometric parameters including chord and twist angle distributions are determined based on airfoil aerodynamic data at a specific Reynolds number. However, rotating wind turbine blade elements operate at different Reynolds numbers due to variable wind speed and different blade span locations. Therefore, the blade design through Schmitz rotor theory at a specific Reynolds number does not necessarily provide the best power performance under operational conditions. This paper aims to provide an optimal blade design strategy for horizontal-axis wind turbines operating at different Reynolds numbers. A fixed-pitch variable-speed (FPVS) wind turbine with S809 airfoil is chosen as a case study and a Matlab program which considers Reynolds number effects is developed to determine the optimized chord and twist angle distributions of the blade. The performance of the optimized blade is compared with that of the preliminary blade which is designed based on Schmitz rotor design theory at a specific Reynolds number. The results demonstrate that the proposed blade design optimization strategy can improve the power performance of the wind turbine. This approach can be further developed for any practice of horizontal axis wind turbine blade design. (Author)
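
    As a rough sketch, the commonly quoted Schmitz relations give chord and twist at a single design point; the paper's strategy would additionally re-evaluate the design lift coefficient and angle of attack at the local Reynolds number of each blade element. The rotor dimensions and airfoil values below are illustrative, not the S809 case study.

      # Hedged sketch of the Schmitz blade-design relations (as usually stated):
      #   phi = (2/3) * arctan(1 / lambda_r),  c = 16*pi*r / (B*C_L) * sin^2(phi/2)
      # All numerical inputs below are toy values, not the paper's design.
      import numpy as np

      def schmitz_chord_twist(r, R, tsr, B, C_L_design, alpha_design_deg):
          """Chord c(r) [m] and twist [deg] for blade stations r of a rotor of
          radius R, tip-speed ratio tsr, blade number B, design lift C_L and
          design angle of attack alpha_design_deg."""
          lambda_r = tsr * r / R                       # local tip-speed ratio
          phi = (2.0 / 3.0) * np.arctan(1.0 / lambda_r)
          chord = 16.0 * np.pi * r / (B * C_L_design) * np.sin(phi / 2.0) ** 2
          twist_deg = np.degrees(phi) - alpha_design_deg
          return chord, twist_deg

      r = np.linspace(0.2, 5.0, 10)                    # blade stations [m]
      chord, twist = schmitz_chord_twist(r, R=5.0, tsr=7.0, B=3,
                                         C_L_design=0.8, alpha_design_deg=6.0)
      print(np.round(chord, 3), np.round(twist, 1))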

  19. Investigating the magnetic inclination angle distribution of $\\gamma$-ray-loud radio pulsars

    CERN Document Server

    Rookyard, S C; Johnston, S

    2014-01-01

    Several studies have shown the distribution of pulsars' magnetic inclination angles to be skewed towards low values compared with the distribution expected if the rotation and magnetic axes are placed randomly on the star. Here we focus on a sample of 28 $\\gamma$-ray-detected pulsars using data taken as part of the Parkes telescope's \\emph{FERMI} timing program. In doing so we find a preference in the sample for low magnetic inclination angles, $\\alpha$, in stark contrast to both the expectation that the magnetic and rotation axes are orientated randomly at the birth of the pulsar and to $\\gamma$-ray-emission-model-based expected biases. In this paper, after exploring potential explanations, we conclude that there are two possible causes of this preference, namely that low $\\alpha$ values are intrinsic to the sample, or that the emission regions extend outside what is traditionally thought to be the open-field-line region in a way which is dependent on the magnetic inclination. Each possibility is expected to...

  20. Analysis of the Multi-dimensional Beta Probability Distribution Function

    Institute of Scientific and Technical Information of China (English)

    潘高田; 梁帆; 郭齐胜; 黄一斌

    2011-01-01

    Based on quantitative truncated sequential test theory, multi-dimensional Beta probability distribution functions arise in hit-accuracy tests of weapon systems against aerial targets. This paper analyses the properties of the multi-dimensional Beta probability distribution function and tabulates part of the two-dimensional Beta probability distribution function values. This research plays an important role in the field of weapon system hit-accuracy tests.

  1. Emergence of visual saliency from natural scenes via context-mediated probability distributions coding.

    Directory of Open Access Journals (Sweden)

    Jinhua Xu

    Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs) of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA) and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.

  2. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks, which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that differ between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate their applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region, the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model considers that information about parallel events that uses their maximum values only is incomplete because some of the realisations are overlaid. A statistical method resulting in an amendment of statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.

  3. Simulating Tail Probabilities in GI/GI/1 Queues and Insurance Risk Processes with Subexponential Distributions

    NARCIS (Netherlands)

    Boots, Nam Kyoo; Shahabuddin, Perwez

    2001-01-01

    This paper deals with estimating small tail probabilities of the steady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into th

  4. Understanding star formation in molecular clouds. III. Probability distribution functions of molecular lines in Cygnus X

    Science.gov (United States)

    Schneider, N.; Bontemps, S.; Motte, F.; Ossenkopf, V.; Klessen, R. S.; Simon, R.; Fechtenbaum, S.; Herpin, F.; Tremblin, P.; Csengeri, T.; Myers, P. C.; Hill, T.; Cunningham, M.; Federrath, C.

    2016-03-01

    The probability distribution function of column density (N-PDF) serves as a powerful tool to characterise the various physical processes that influence the structure of molecular clouds. Studies that use extinction maps or H2 column-density maps (N) that are derived from dust show that star-forming clouds can best be characterised by lognormal PDFs for the lower N range and a power-law tail for higher N, which is commonly attributed to turbulence and self-gravity and/or pressure, respectively. While PDFs from dust cover a large dynamic range (typically N ~ 1020-24 cm-2 or Av~ 0.1-1000), PDFs obtained from molecular lines - converted into H2 column density - potentially trace more selectively different regimes of (column) densities and temperatures. They also enable us to distinguish different clouds along the line of sight through using the velocity information. We report here on PDFs that were obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region, and make a comparison to a PDF that was derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av ~ 1-30, but is cut for higher Av because of optical depth effects. The PDFs of C18O and 13CO are mostly lognormal up to Av ~ 1-15, followed by excess up to Av ~ 40. Above that value, all CO PDFs drop, which is most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av ~ 15 and 400, respectively. The PDF from dust is lognormal for Av ~ 3-15 and has a power-law tail up to Av ~ 500. Absolute values for the molecular line column densities are, however, rather uncertain because of abundance and excitation temperature variations. If we take the dust PDF at face value, we "calibrate" the molecular line PDF of CS to that of the dust and determine an abundance [CS]/[H2] of 10-9. The slopes of the power-law tails of the CS, N2H+, and dust PDFs are -1.6, -1.4, and -2.3, respectively, and are thus consistent

  5. Conical pitch angle distributions of very low-energy ion fluxes observed by ISEE 1

    Science.gov (United States)

    Horwitz, J. L.; Baugher, C. R.; Chappell, C. R.; Shelley, E. G.; Young, D. T.

    1982-04-01

    Observations are presented of conical distributions of low-energy ion fluxes from throughout the magnetosphere. The data were provided by the plasma composition experiment (PCE) on ISEE 1. ISEE 1 was launched in October 1977 into a highly elliptical orbit with a 30 deg inclination to the equator and 22.5 earth radii apogee. Particular attention is given to data taken when the instrument was in its thermal plasma mode, sampling ions in the energy per charge range 0-100 eV/e. Attention is given to examples of conical distributions in 0- to 100-eV/e ions, the occurrence of conical distributions of 0- to 100-eV ions in local time-geocentric distance and latitude-geocentric distance coordinates, the cone angles in 0- to 100-eV ion conics, Kp distributions of 0- to 100-eV ion conics, and some compositional aspects of 0- to 100-eV ion conics.

  6. The VIMOS Public Extragalactic Redshift Survey (VIPERS). On the recovery of the count-in-cell probability distribution function

    Science.gov (United States)

    Bel, J.; Branchini, E.; Di Porto, C.; Cucciati, O.; Granett, B. R.; Iovino, A.; de la Torre, S.; Marinoni, C.; Guzzo, L.; Moscardini, L.; Cappi, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bolzonella, M.; Bottini, D.; Coupon, J.; Davidzon, I.; De Lucia, G.; Fritz, A.; Franzetti, P.; Fumana, M.; Garilli, B.; Ilbert, O.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Scodeggio, M.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Burden, A.; Marchetti, A.; Mellier, Y.; Nichol, R. C.; Peacock, J. A.; Percival, W. J.; Phleps, S.; Wolk, M.

    2016-04-01

    We compare three methods to measure the count-in-cell probability density function of galaxies in a spectroscopic redshift survey. From this comparison we found that, when the sampling is low (the average number of object per cell is around unity), it is necessary to use a parametric method to model the galaxy distribution. We used a set of mock catalogues of VIPERS to verify if we were able to reconstruct the cell-count probability distribution once the observational strategy is applied. We find that, in the simulated catalogues, the probability distribution of galaxies is better represented by a Gamma expansion than a skewed log-normal distribution. Finally, we correct the cell-count probability distribution function from the angular selection effect of the VIMOS instrument and study the redshift and absolute magnitude dependency of the underlying galaxy density function in VIPERS from redshift 0.5 to 1.1. We found a very weak evolution of the probability density distribution function and that it is well approximated by a Gamma distribution, independently of the chosen tracers.

  7. Region-based approximation of probability distributions (for visibility between imprecise points among obstacles)

    OpenAIRE

    Buchin, Kevin; Kostitsyna, Irina; Löffler, M.; Silveira, R. I.

    2014-01-01

    Let $p$ and $q$ be two imprecise points, given as probability density functions on $\mathbb R^2$, and let $\cal R$ be a set of $n$ line segments (obstacles) in $\mathbb R^2$. We study the problem of approximating the probability that $p$ and $q$ can see each other; that is, that the segment connecting $p$ and $q$ does not cross any segment of $\cal R$. To solve this problem, we approximate each density function by a weighted set of polygons; a novel approach to dealing with probability densit...

  8. Projectile Two-dimensional Coordinate Measurement Method Based on Optical Fiber Coding Fire and its Coordinate Distribution Probability

    Science.gov (United States)

    Li, Hanshan; Lei, Zhiyong

    2013-01-01

    To improve the precision of projectile coordinate measurement in a fire measurement system, this paper introduces the optical fiber coding fire measurement method and its principle, sets up the corresponding measurement model, and analyzes coordinate errors using the differential method. To study the distribution of projectile coordinate positions, the hypothesis-testing method of mathematical statistics was used to analyze their distribution law, firing dispersion, and the probability of a projectile hitting the object center. The results show that, at the given significance level, an exponential distribution is a relatively reasonable description of the projectile position distribution. Through experimentation and calculation, the optical fiber coding fire measurement method is shown to be scientific and feasible, and can yield accurate projectile coordinate positions.

  9. Effects of Turbulent Aberrations on Probability Distribution of Orbital Angular Momentum for Optical Communication

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yi-Xin; CANG Ji

    2009-01-01

    Effects of atmospheric turbulence tilt, defocus, astigmatism and coma aberrations on the orbital angular momentum measurement probability of photons propagating in the weak turbulent regime are modeled with the Rytov approximation. By considering the resulting wave as a superposition of angular momentum eigenstates, the orbital angular momentum measurement probabilities of the transmitted digit are presented. Our results show that the effect of turbulent tilt aberration on the orbital angular momentum measurement probabilities of photons is the maximum among these four kinds of aberrations. As the aberration order increases, the effects of turbulence aberrations on the measurement probabilities of orbital angular momentum generally decrease, whereas the effect of turbulence defocus can be ignored. For tilt aberration, as the difference between the measured orbital angular momentum and the original orbital angular momentum increases, the orbital angular momentum measurement probability decreases.

  10. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximat...

  11. Fitting a distribution to censored contamination data using Markov Chain Monte Carlo methods and samples selected with unequal probabilities.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2014-11-18

    The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the
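
    A minimal sketch of the general weighted-bootstrap idea (censoring, the paper's main complication, is omitted for brevity): resample observations with probability inversely proportional to their selection probability, refit the distribution to each resample, and use the spread of the fits as the sampling distribution of the quantity of interest. Data and weights are synthetic.

      # Hedged sketch: weighted bootstrap under unequal selection probabilities.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      x = rng.lognormal(mean=0.0, sigma=1.0, size=200)   # synthetic concentrations
      pi = rng.uniform(0.2, 1.0, size=200)               # selection probabilities
      w = (1.0 / pi) / np.sum(1.0 / pi)                  # normalised resampling weights

      def weighted_bootstrap_fit(x, w, n_boot=500):
          """Sampling distribution of the fitted-lognormal population mean."""
          means = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.choice(len(x), size=len(x), replace=True, p=w)
              shape, loc, scale = stats.lognorm.fit(x[idx], floc=0.0)
              means[b] = stats.lognorm.mean(shape, loc=loc, scale=scale)
          return means

      boot_means = weighted_bootstrap_fit(x, w)
      print("population-mean estimate:", boot_means.mean(),
            "95% interval:", np.percentile(boot_means, [2.5, 97.5]))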

  12. Angle resolved photoelectron distribution of the 1{pi} resonance of CO/Pt(111)

    Energy Technology Data Exchange (ETDEWEB)

    Haarlammert, Thorben; Wegner, Sebastian; Tsilimis, Grigorius; Zacharias, Helmut [Physikalisches Institut, Westfaelische Wilhelms Universitaet, Muenster (Germany); Golovin, Alexander [Institute of Physics, St. Petersburg State University (Russian Federation)

    2009-07-01

    The CO 1{pi} level of a c(4 x 2)-2CO/Pt(111) reconstruction shows a significant resonance when varying the photon energy between h{nu}=23 eV and h{nu}=48 e V. This resonance has not been observed in gas phase measurements or on the Pt(1 10) surface. To investigate the photoelectron distribution of the 1{pi} level high harmonic radiaton has been used. By conversion in rare gases like argon, neon, or helium photon energies of up to 100 eV have been generated at repetition r ates of up to 10 kHz. The single harmonics have been separated and focused by a toroidal grating and directed to the sample surface. A time-of-flight detector with multiple anodes registers the kinetic energies of the emitted photoelectrons and enables the simultaneous detection of multiple emission angles. The angular distributions of photoelectrons emitted from the CO 1{pi} level have been measured for a variety of initial photon energies. Further the angular distributions of the CO 1{pi} level photoelectrons emitted from a CO-Pt{sub 7} cluster have been calculated using the MSX{alpha}-Method which shows good agreement with the ex perimental data.

  13. A Neural Network Approach for Identifying Relativistic Electron Pitch Angle Distributions in Van Allen Probes Data

    Science.gov (United States)

    Souza, V. M. C. E. S.; Vieira, L.; Alves, L. R.; Da Silva, L. A.; Koga, D.; Sibeck, D. G.; Walsh, B.; Kanekal, S. G.; Silveira, M. D.; Medeiros, C.; Mendes, O., Jr.; Marchezi, J.; Rockenbach, M.; Jauer, P. R.; Gonzalez, W.; Baker, D. N.

    2015-12-01

    A myriad of physical phenomena occur in the inner magnetosphere, in particular at the Earth's radiation belts, which can be a result of the combination of both internal and external processes. However, the connection between physical processes occurring deep within the magnetosphere and external interplanetary drivers is not yet well understood. In this work we investigate whether a selected set of interplanetary structures affects the local time distribution of three different classes of high energy electron pitch angle distributions (PADs), namely normal, isotropic, and butterfly. We split this work into two parts: initially we focus on the methodology used, which employs a Self-Organized Feature Map (SOFM) neural network for identifying different classes of electron PAD shapes in the Van Allen Probes' Relativistic Electron Proton Telescope (REPT) data. The algorithm can categorize the input data into an arbitrary number of classes, of which three appear most often: normal, isotropic and butterfly. Other classes related to these three also emerge and deserve to be addressed in detail in future works. We also discuss the uncertainties of the algorithm. Then, we move to the second part, where we describe in detail the criteria used for selecting the interplanetary events, and also try to investigate the relation between key parameters characterizing such interplanetary structures and the local time distributions of electron PAD shapes.
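
    A minimal self-organizing feature map can be sketched in plain numpy, trained here on synthetic normalised pitch-angle distributions (peaked, butterfly and isotropic shapes); the operational REPT processing, class labelling and uncertainty analysis are beyond this illustration.

      # Hedged sketch: a tiny Self-Organizing Feature Map on synthetic PADs.
      import numpy as np

      rng = np.random.default_rng(6)
      pa = np.linspace(0, np.pi, 18)                       # pitch-angle bins
      normal    = np.sin(pa) ** 2                          # peaked at 90 degrees
      butterfly = np.sin(2 * pa) ** 2                      # minimum at 90 degrees
      isotropic = np.ones_like(pa)
      data = np.vstack([f / f.max() + 0.05 * rng.standard_normal(pa.size)
                        for f in (normal, butterfly, isotropic) for _ in range(100)])

      # 3x3 map of weight vectors, trained with the classic SOM update rule.
      grid = np.array([(i, j) for i in range(3) for j in range(3)], dtype=float)
      W = rng.uniform(0, 1, size=(9, pa.size))
      for step, x in enumerate(rng.permutation(data)):
          lr = 0.5 * np.exp(-step / len(data))             # decaying learning rate
          sig = 1.5 * np.exp(-step / len(data))            # decaying neighbourhood
          bmu = np.argmin(np.linalg.norm(W - x, axis=1))   # best-matching unit
          h = np.exp(-np.sum((grid - grid[bmu]) ** 2, axis=1) / (2 * sig ** 2))
          W += lr * h[:, None] * (x - W)

      # Each input PAD is then assigned to the map node (class) it matches best.
      labels = np.argmin(np.linalg.norm(data[:, None, :] - W[None, :, :], axis=2), axis=1)
      print(np.bincount(labels, minlength=9))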

  14. Fault Line Selection Method Considering Grounding Fault Angle for Distribution Network

    Institute of Scientific and Technical Information of China (English)

    Li Si-bo; Zhao Yu-lin; Li Ji-chang; Sui Tao

    2015-01-01

    In a distribution network whose neutral point is grounded via an arc suppression coil, existing line selection methods are often inaccurate when a single-phase grounding fault occurs near the zero-crossing point of the phase voltage. Exploiting the fact that the transient current differs between the fault feeder and the other, faultless feeders, wavelet transformation was performed on the transient current data within one power-frequency cycle after the fault occurred. For different fault angles, the wavelet energy in the corresponding frequency band was chosen for comparison. The wavelet energy of the fault feeder was found to be the largest of all, and larger than the sum of those of the other faultless feeders; when the bus itself broke down, the disparity between the wavelet energies was not significant. The fault line can be selected by the criterion above. The results of MATLAB/Simulink simulation experiments indicated that this method has anti-interference capacity and is feasible.
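
    The comparison step can be sketched with a discrete wavelet transform (using PyWavelets), assuming synthetic feeder currents and a fixed detail band in place of the fault-angle-dependent band selection described above.

      # Hedged sketch: compare detail-band wavelet energies of feeder transients.
      import numpy as np
      import pywt

      fs, f0 = 10_000, 50                      # sampling rate [Hz], power frequency [Hz]
      t = np.arange(int(fs / f0)) / fs         # one power-frequency cycle
      rng = np.random.default_rng(7)

      def feeder_current(transient_amp):
          """Steady 50 Hz component plus a damped high-frequency transient."""
          return (np.sin(2 * np.pi * f0 * t)
                  + transient_amp * np.exp(-t / 2e-3) * np.sin(2 * np.pi * 2000 * t)
                  + 0.02 * rng.standard_normal(t.size))

      feeders = {"faulted": feeder_current(3.0),
                 "sound_1": feeder_current(0.3),
                 "sound_2": feeder_current(0.2)}

      def band_energy(signal, wavelet="db4", level=3, band=2):
          """Energy of one detail band of the wavelet decomposition."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          return float(np.sum(coeffs[band] ** 2))   # coeffs[0] is the approximation

      energies = {name: band_energy(sig) for name, sig in feeders.items()}
      print(energies)          # the faulted feeder should dominate the band energy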

  15. Analysis of the distribution of pitch angles in model galactic disks - Numerical methods and algorithms

    Science.gov (United States)

    Russell, William S.; Roberts, William W., Jr.

    1993-01-01

    An automated mathematical method capable of successfully isolating the many different features in prototype and observed spiral galaxies and of accurately measuring the pitch angles and lengths of these individual features is developed. The method is applied to analyze the evolution of specific features in a prototype galaxy exhibiting flocculent spiral structure. The mathematical-computational method was separated into two components. Initially, the galaxy was partitioned into dense regions constituting features using two different methods. The results obtained using these two partitioning algorithms were very similar, from which it is inferred that no numerical biasing was evident and that capturing of the features was consistent. Standard least-squares methods underestimated the true slope of the cloud distribution and were incapable of approximating an orientation of 45 deg. The problems were overcome by introducing a superior fit least-squares method, developed with the intention of calculating true orientation rather than a regression line.
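
    The point about standard least squares failing near 45 deg can be illustrated with a short Python sketch comparing ordinary least squares with an orthogonal (total) least-squares fit on a synthetic point cloud; this only illustrates the general effect and is not the authors' own fitting algorithm.

        # With isotropic scatter on both coordinates, regressing y on x attenuates the
        # slope ("regression dilution"); the principal axis of the cloud does not.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        t = rng.uniform(-1, 1, n)
        # points scattered around a line oriented at 45 degrees
        x = t / np.sqrt(2) + 0.1 * rng.standard_normal(n)
        y = t / np.sqrt(2) + 0.1 * rng.standard_normal(n)

        # ordinary least squares: biased toward shallow slopes
        slope_ols = np.polyfit(x, y, 1)[0]

        # orthogonal least squares: first principal axis of the centered cloud
        X = np.column_stack([x - x.mean(), y - y.mean()])
        _, _, vt = np.linalg.svd(X, full_matrices=False)
        slope_tls = vt[0, 1] / vt[0, 0]

        print(np.degrees(np.arctan(slope_ols)), np.degrees(np.arctan(slope_tls)))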

  16. Remark about Transition Probabilities Calculation for Single Server Queues with Lognormal Inter-Arrival or Service Time Distributions

    Science.gov (United States)

    Lee, Moon Ho; Dudin, Alexander; Shaban, Alexy; Pokhrel, Subash Shree; Ma, Wen Ping

    Formulae required for accurate approximate calculation of transition probabilities of embedded Markov chain for single-server queues of the GI/M/1, GI/M/1/K, M/G/1, M/G/1/K type with heavy-tail lognormal distribution of inter-arrival or service time are given.

  17. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    Energy Technology Data Exchange (ETDEWEB)

    O' Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-27

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  18. Predicting the side-chain dihedral angle distributions of nonpolar, aromatic, and polar amino acids using hard sphere models.

    Science.gov (United States)

    Zhou, Alice Qinhua; O'Hern, Corey S; Regan, Lynne

    2014-10-01

    The side-chain dihedral angle distributions of all amino acids have been measured from myriad high-resolution protein crystal structures. However, we do not yet know the dominant interactions that determine these distributions. Here, we explore to what extent the defining features of the side-chain dihedral angle distributions of different amino acids can be captured by a simple physical model. We find that a hard-sphere model for a dipeptide mimetic that includes only steric interactions plus stereochemical constraints is able to recapitulate the key features of the observed backbone-dependent amino acid side-chain dihedral angle distributions of Ser, Cys, Thr, Val, Ile, Leu, Phe, Tyr, and Trp. We find that for certain amino acids, performing the calculations with the amino acid of interest in the central position of a short α-helical segment improves the match between the predicted and observed distributions. We also identify the atomic interactions that give rise to the differences between the predicted distributions for the hard-sphere model of the dipeptide and that of the α-helical segment. Finally, we point out a case where the hard-sphere plus stereochemical constraint model is insufficient to recapitulate the observed side-chain dihedral angle distribution, namely the distribution P(χ₃) for Met.

  19. RUNS TEST FOR A CIRCULAR DISTRIBUTION AND A TABLE OF PROBABILITIES

    Science.gov (United States)

    The proposed test is an adaptation of the well-known Wald-Wolfowitz runs test for a distribution on a straight line. The primary advantage of the proposed test is that it minimizes the number of assumptions on the theoretical distribution.

  20. Deduction of compound nucleus formation probability from the fragment angular distributions in heavy-ion reactions

    Science.gov (United States)

    Yadav, C.; Thomas, R. G.; Mohanty, A. K.; Kapoor, S. S.

    2015-07-01

    The presence of various fissionlike reactions in heavy-ion induced reactions is a major hurdle on the path to laboratory synthesis of heavy and super-heavy nuclei. It is known that the cross section for forming a heavy evaporation residue in fusion reactions depends on three factors: the capture cross section, the probability of compound nucleus formation PCN, and the survival probability of the compound nucleus against fission. Because PCN is difficult to estimate theoretically, owing to its complex dependence on several parameters, attempts have been made in the past to deduce it from fission fragment anisotropy data. In the present work, the fragment anisotropy data for a number of heavy-ion reactions are analyzed, and it is found that deducing PCN from the anisotropy data also requires knowledge of the ratio of the relaxation time of the K degree of freedom to the pre-equilibrium fission time.

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  2. Comparison of the diagnostic ability of Moorfield's regression analysis and glaucoma probability score using Heidelberg retinal tomograph III in eyes with primary open angle glaucoma

    Directory of Open Access Journals (Sweden)

    Jindal Shveta

    2010-01-01

    Full Text Available Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfield's regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted k) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119 - 0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific criteria (borderline results included as test positives). The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was a poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs.
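
    For reference, the figures of merit quoted above follow from a 2x2 outcome table; the small Python helper below shows the arithmetic, with illustrative counts rather than the study's actual data.

        # Sensitivity, specificity, and likelihood ratios from true/false positives and negatives.
        def diagnostic_metrics(tp, fn, tn, fp):
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            lr_pos = sens / (1 - spec) if spec < 1 else float("inf")   # positive likelihood ratio
            lr_neg = (1 - sens) / spec                                  # negative likelihood ratio
            return sens, spec, lr_pos, lr_neg

        # e.g. 50 glaucoma eyes of which 15 test positive, 50 normal eyes of which 49 test negative
        print(diagnostic_metrics(tp=15, fn=35, tn=49, fp=1))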

  3. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize the chaotic map like Logistic map to generate the pseudo-random numbers mapped as the design variables for global optimization. Many existing researches indicated that COA can more easily escape from the local minima than classical stochastic optimization algorithms. This paper reveals the inherent mechanism of high efficiency and superior performance of COA, from a new perspective of both the probability distribution property and search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDF and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that, the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve the high efficiency of COA, it is recommended to adopt the appropriate chaotic map generating the desired chaotic sequences with uniform or nearly uniform probability distribution and large Lyapunov exponent.
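
    The two properties the paper compares across maps, the invariant probability density and the Lyapunov exponent, can be estimated numerically as in the Python sketch below; the Logistic map at r = 4 is used purely as an example and does not summarize the paper's eight-map comparison.

        # Estimate the invariant density (histogram) and the Lyapunov exponent
        # (average log-derivative along the orbit) of the Logistic map.
        import numpy as np

        def logistic_orbit(x0=0.3, n=100_000, r=4.0):
            x = np.empty(n)
            x[0] = x0
            for i in range(1, n):
                x[i] = r * x[i - 1] * (1 - x[i - 1])
            return x

        x = logistic_orbit()
        pdf, edges = np.histogram(x, bins=100, range=(0, 1), density=True)  # density peaks near 0 and 1
        lyap = np.mean(np.log(np.abs(4.0 * (1 - 2 * x))))                   # ~ ln 2 for r = 4
        print(lyap)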

  4. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide the total classification accuracy of 86.67% and the area (Az) of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.

  5. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    Science.gov (United States)

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  6. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  7. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple...... this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems....
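
    The record refers to Metropolis-type sampling of posterior densities; the Python sketch below shows a minimal random-walk Metropolis loop for a toy Gaussian target, not the sequential-simulation-assisted Gibbs sampler the authors develop.

        # Random-walk Metropolis: propose a Gaussian step, accept with probability
        # min(1, target ratio), otherwise stay at the current state.
        import numpy as np

        def metropolis(log_target, x0, n_steps=10_000, step=0.5, seed=0):
            rng = np.random.default_rng(seed)
            x = np.asarray(x0, dtype=float)
            samples = np.empty((n_steps, x.size))
            logp = log_target(x)
            for i in range(n_steps):
                prop = x + step * rng.standard_normal(x.size)
                logp_prop = log_target(prop)
                if np.log(rng.uniform()) < logp_prop - logp:   # accept/reject
                    x, logp = prop, logp_prop
                samples[i] = x
            return samples

        # toy target: standard bivariate Gaussian (stand-in for a posterior density)
        samples = metropolis(lambda x: -0.5 * np.sum(x**2), x0=[3.0, -3.0])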

  8. Twenty-four hour predictions of the solar wind speed peaks by the probability distribution function model

    Science.gov (United States)

    Bussy-Virat, C. D.; Ridley, A. J.

    2016-10-01

    Abrupt transitions from slow to fast solar wind represent a concern for the space weather forecasting community. They may cause geomagnetic storms that can eventually affect systems in orbit and on the ground. Therefore, the probability distribution function (PDF) model was improved to predict enhancements in the solar wind speed. New probability distribution functions allow for the prediction of the peak amplitude and the time to the peak while providing an interval of uncertainty on the prediction. It was found that 60% of the positive predictions were correct, while 91% of the negative predictions were correct, and 20% to 33% of the peaks in the speed were found by the model. This represents a considerable improvement upon the first version of the PDF model. A direct comparison with the Wang-Sheeley-Arge model shows that the PDF model is quite similar, except that it leads to fewer false positive predictions and misses fewer events, especially when the peak reaches very high speeds.

  9. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
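
    For readers unfamiliar with the maximum-entropy argument, the following short derivation (a sketch, not taken from the paper) shows how an exponential distribution emerges when Shannon entropy is maximized subject only to normalization and a fixed mean, the situation the authors describe for particle velocities and travel times.

        \begin{align*}
          \max_{f}\; -\!\int_0^\infty f(u)\,\ln f(u)\,du
          \quad\text{subject to}\quad
          \int_0^\infty f(u)\,du = 1,
          \qquad
          \int_0^\infty u\,f(u)\,du = \bar{u}.
        \end{align*}

    Stationarity of the Lagrangian, $-\ln f(u) - 1 - \lambda_0 - \lambda_1 u = 0$, forces $f(u) \propto e^{-\lambda_1 u}$, and the two constraints fix $f(u) = \bar{u}^{-1} e^{-u/\bar{u}}$, i.e. the exponential distribution. Constraining instead the mean magnitude of a zero-mean variable yields the Laplace form quoted above for accelerations.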

  10. Implications of unusual pitch-angle distributions observed by ISEE-1 and 2

    Directory of Open Access Journals (Sweden)

    C. A. Zuluaga

    2006-11-01

    Full Text Available Unusual energetic particle pitch angle distributions (PADs) were observed by the ISEE-1 and 2 satellites at 3 h MLT and a radial distance of about 10–15 RE during the time period of 07:00-14:00 UT on 3 March 1979. The ISEE-1 satellite obtained complete 3-D distributions of energetic proton and electron fluxes as a function of energy, while ISEE-2 was configured to provide higher time resolution but less angular resolution than ISEE-1. The ISEE-1 observed a butterfly PAD (a minimum in the 90° PA particle flux) for a period of about 2 h (10:00–12:00 UT) for the electrons, and 3 h (09:00–12:00 UT) for the protons over an energy range of 22.5–189 keV (E1–E4) for the electrons and 24–142 keV (P1–P4) for the protons. The small pitch angle (15°, 30°) charged particles (electrons and protons) are seen to behave collectively in all four energy ranges. The relative differences in electron fluxes between 15° PA and 90° PA are more significant for higher energy channels during the butterfly PAD period. Three different types of electron PADs (butterfly, isotropic, and peaked-at-90°) were observed at the same location and time as a function of energy for a short period of time before 10:00 UT. Electron butterfly distributions were also observed by the ISEE-2 for about 1.5 h over 28–62 keV (E2–E4), although less well resolved than ISEE-1. Unlike the ISEE-1, no butterfly distributions were resolved in the ISEE-2 proton PADs due to less angular resolution. The measured drift effects by ISEE-1 suggest that the detected protons were much closer to the particle source than the electrons along their trajectories, and thus ruled out a nightside source within 18:00 MLT to 03:00 MLT. Compared to 07:30 UT, the charged particle fluxes measured by ISEE-1 were enhanced by up to three orders of magnitude during the period 08:30–12:00 UT. From 09:10:00 UT to 11:50 UT, the geomagnetic conditions were quiet (AE<100 nT), the LANL geosynchronous

  11. High-energy spectrum and zenith-angle distribution of atmospheric neutrinos

    CERN Document Server

    Sinegovsky, S I; Sinegovskaya, T S

    2011-01-01

    High-energy neutrinos, arising from decays of mesons produced in collisions of cosmic ray particles with air nuclei, form the background in the astrophysical neutrino detection problem. An ambiguity in the high-energy behavior of pion and especially kaon production cross sections for nucleon-nucleus collisions may substantially affect the calculated neutrino flux. We present results of the calculation of the energy spectrum and zenith-angle distribution of muon and electron atmospheric neutrinos in the energy range 10 GeV to 10 PeV. The calculation was performed using known hadronic models (QGSJET-II-03, SIBYLL 2.1, Kimel & Mokhov) for two primary spectrum parametrizations, by Gaisser & Honda and by Zatsepin & Sokolskaya. The comparison of the calculated muon neutrino spectrum with the IceCube40 experiment data makes it clear that even at energies above 100 TeV the prompt neutrino contribution is not so apparent because of tangled uncertainties of the strange (kaons) and charm...

  12. Using hybrid angle/distance information for distributed topology control in vehicular sensor networks.

    Science.gov (United States)

    Huang, Chao-Chi; Chiu, Yang-Hung; Wen, Chih-Yu

    2014-10-27

    In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered as a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control method focuses on reducing the rate of group member change and avoiding the unnecessary information exchange. Two major phases are sequentially applied to choose the group members of each vehicle using hybrid angle/distance information. The operation of Phase I is based on the concept of the cone-based method, which can select the desired vehicles quickly. Afterwards, the proposed time-slot method is further applied to stabilize the network topology. Given the network structure in Phase I, a routing scheme is presented in Phase II. The network behaviors are explored through simulation and analysis in a variety of scenarios. The results show that the proposed mechanism is a scalable and effective control framework for VSNs.

  13. A Probability Distribution of Surface Elevation for Wind Waves in Terms of the Gram-Charlier Series

    Institute of Scientific and Technical Information of China (English)

    黄传江; 戴德君; 王伟; 钱成春

    2003-01-01

    Laboratory experiments are conducted to study the probability distribution of surface elevation for wind waves, and the convergence of the Gram-Charlier series in describing the surface elevation distribution is discussed. Results show that the agreement between the Gram-Charlier series and the observed distribution improves as the truncation order of the series increases within a certain range, which is contrary to the phenomenon observed by Huang and Long (1980). It is also shown that the Gram-Charlier series is sensitive to anomalies in the data set, which worsen the agreement if they are not preprocessed appropriately. Negative values of the probability distribution expressed by the Gram-Charlier series in some ranges of surface elevation are discussed; both the absolute values of these negative values and the ranges of their occurrence become gradually smaller as more terms are included. Therefore the negative values have no evident effect on the form of the whole surface elevation distribution when the series is truncated at higher orders. Furthermore, a simple recurrence formula is obtained for the coefficients of the Gram-Charlier series, so that the series can conveniently be extended to high orders.
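
    A compact way to experiment with such expansions is sketched below in Python: a standardized Gram-Charlier type-A series evaluated with probabilists' Hermite polynomials from sample skewness and excess kurtosis. The truncation at the fourth Hermite term and the synthetic skewed sample are illustrative choices, not the recurrence scheme or the wave records of the paper; note that, exactly as discussed above, the truncated series can go slightly negative in the tails.

        # Gram-Charlier type-A density on standardized data, truncated after He_4.
        import numpy as np
        from numpy.polynomial.hermite_e import hermeval
        from scipy import stats

        def gram_charlier_pdf(z, skew, kurt_excess):
            phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
            # coefficients of He_0..He_4: 1 + (g1/6) He_3 + (g2/24) He_4
            c = [1.0, 0.0, 0.0, skew / 6.0, kurt_excess / 24.0]
            return phi * hermeval(z, c)

        eta = stats.skewnorm.rvs(a=3, size=50_000, random_state=0)   # stand-in for a wave record
        z = (eta - eta.mean()) / eta.std()
        pdf = gram_charlier_pdf(np.linspace(-4, 4, 200), stats.skew(z), stats.kurtosis(z))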

  14. Void probability as a function of the void's shape and scale-invariant models. [in studies of spacial galactic distribution

    Science.gov (United States)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  15. Classical probability density distributions with uncertainty relations for ground states of simple non-relativistic quantum-mechanical systems

    Science.gov (United States)

    Radożycki, Tomasz

    2016-11-01

    The probability density distributions for the ground states of certain model systems in quantum mechanics and for their classical counterparts are considered. It is shown that classical distributions are remarkably improved by incorporating into them the Heisenberg uncertainty relation between position and momentum. Even a crude form of this incorporation makes the agreement between classical and quantum distributions unexpectedly good, except for a small region where the classical momenta are large. It is demonstrated that a slight refinement of this form makes the classical distribution very similar to the quantum one over the whole space. The results obtained are much better than those from the WKB method. The paper is devoted to ground states, but the method applies to excited states too.

  16. Numerical Analysis on the Effect of Boom Sprayer Collecting Plate Angle to the Distribution of Granular Fertilizers

    Science.gov (United States)

    Pei Ying, Eng; Ngali, Zamani; Tukiman, Rosman

    2017-01-01

    Optimization of the boom sprayer collecting plate angle is a tedious procedure if it is done fully experimentally. This paper demonstrates that the optimization process is more practical when done by simulation analysis validated through logical reflection of particles. The study is carried out by simulating the distribution parts of the boom sprayer with the commercial software ANSYS, whose multiphysics capabilities make this simulation possible. The simulation varies the angle of the collecting plate of the boom sprayer (32°, 60°, 90° and 120°) to find the range of angles that produces a good distribution for different granular fertilizer sizes and blower air velocities. The constant variables in the simulation are an atmospheric pressure of 1 atm and a potassium (K) particle size of 1 mm. Sixty per cent of the images produced by ANSYS, judged by the number of streamlines and the angle of distribution, show that the optimum angle lies between 32° and 60°. For further study, and to increase accuracy, the simulation will be validated through experiment; it is preferable to carry out the experiment in the laboratory on a scaled-down model, without making any changes to the current design.

  17. Research on the behavior of fiber orientation probability distribution function in the planar flows

    Institute of Scientific and Technical Information of China (English)

    ZHOU Kun; LIN Jian-zhong

    2005-01-01

    The equation of the two-dimensional fiber direction vector was solved theoretically to give the fiber orientation distribution in simple shear flow, flow with shear in two directions, extensional flow, and arbitrary planar incompressible flow. The Fokker-Planck equation was solved numerically to validate the theoretical solutions. The stable orientation and the orientation period of the fibers were obtained. The results showed that the fiber orientation distribution depends on the relative, not the absolute, magnitude of the rate of strain of the flow. The effect of fiber aspect ratio on the orientation distribution is insignificant in most conditions except the simple shear case. It was proved that the results for planar flow can be generalized to the case of a 3-D fiber direction vector.

  18. Low-energy (less than 100 eV) ion pitch angle distributions in the magnetosphere by ISEE 1

    Science.gov (United States)

    Johnson, J. F. E.; Chappell, C. R.; Nagai, T.

    1983-09-01

    Attention is given to isotropic distribution, bidirectional field alignment distribution, unidirectional field alignment distribution, and low flux, in a statistical examination of low energy ion data from the ISEE 1 plasma composition experiment whose aim was the study of pitch angle distributions in all local times of the magnetosphere. The isotropic distribution consisting of less than 10 eV ions is a persistent inner region feature, while the bidirectional field-aligned distribution consisting of warm ions is a persistent feature of the outer dayside and is seen just outside the isotropic distribution region of the nightside. On the outer nightside, the unidirectional field-aligned distribution consisting of warm ions is the dominant signature. The 'sources' of ions in various regions are discussed in view of the present and other results.

  19. Multiparameter probability distributions for heavy rainfall modeling in extreme southern Brazil

    Directory of Open Access Journals (Sweden)

    Samuel Beskow

    2015-09-01

    New hydrological insights for the region: The Anderson–Darling and Filliben tests were the most restrictive in this study. Based on the Anderson–Darling test, it was found that the Kappa distribution presented the best performance, followed by the GEV. This finding provides evidence that these multiparameter distributions result, for the region of study, in greater accuracy for the generation of intensity–duration–frequency curves and the prediction of peak streamflows and design hydrographs. As a result, this finding can support the design of hydraulic structures and flood management in river basins.

  20. Effect of Shouldering Angle on Distribution of Thermal Stress in Sapphire Single Crystal Growth Using Improved Kyropoulos

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A two-dimensional model was established in rectangular co-ordinates to study the thermal stress in sapphire single crystals grown by the improved Kyropoulos method. In the simulation, the distribution and the maximum and minimum values of the thermal stress were calculated. In addition, a relationship between the thermal stress and the shouldering angle was obtained: for lower shouldering angles, the maximum thermal stress is lower and the minimum is higher. This indicates that the distribution of the thermal stress can be improved by decreasing the shouldering angle of the crystal during the growth process. To evaluate the model, an experiment was carried out, and the results are in good agreement with the calculation.

  1. Is extrapair mating random? On the probability distribution of extrapair young in avian broods

    NARCIS (Netherlands)

    Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan

    2007-01-01

    A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review

  2. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate...

  3. Optimization of beam angles for intensity modulated radiation therapy treatment planning using genetic algorithm on a distributed computing platform.

    Science.gov (United States)

    Nazareth, Daryl P; Brunner, Stephen; Jones, Matthew D; Malhotra, Harish K; Bakhtiari, Mohammad

    2009-07-01

    Planning intensity modulated radiation therapy (IMRT) treatment involves selection of several angle parameters as well as specification of structures and constraints employed in the optimization process. Including these parameters in the combinatorial search space vastly increases the computational burden, and therefore the parameter selection is normally performed manually by a clinician, based on clinical experience. We have investigated the use of a genetic algorithm (GA) and distributed-computing platform to optimize the gantry angle parameters and provide insight into additional structures, which may be necessary, in the dose optimization process to produce optimal IMRT treatment plans. For an IMRT prostate patient, we produced the first generation of 40 samples, each of five gantry angles, by selecting from a uniform random distribution, subject to certain adjacency and opposition constraints. Dose optimization was performed by distributing the 40-plan workload over several machines running a commercial treatment planning system. A score was assigned to each resulting plan, based on how well it satisfied clinically-relevant constraints. The second generation of 40 samples was produced by combining the highest-scoring samples using techniques of crossover and mutation. The process was repeated until the sixth generation, and the results compared with a clinical (equally-spaced) gantry angle configuration. In the sixth generation, 34 of the 40 GA samples achieved better scores than the clinical plan, with the best plan showing an improvement of 84%. Moreover, the resulting configuration of beam angles tended to cluster toward the patient's sides, indicating where the inclusion of additional structures in the dose optimization process may avoid dose hot spots. Additional parameter selection in IMRT leads to a large-scale computational problem. We have demonstrated that the GA combined with a distributed-computing platform can be applied to optimize gantry angle
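
    A toy Python sketch of the genetic-algorithm loop described above is given below. Candidate plans are sets of five gantry angles subject to a simple adjacency constraint, and the scoring function is a placeholder standing in for the distributed dose-optimization and clinical scoring step; population size, mutation rate, and the number of generations mirror the description only loosely.

        # Genetic algorithm over gantry-angle sets: selection of the best-scoring half,
        # single-point crossover, and occasional mutation of one angle.
        import numpy as np

        rng = np.random.default_rng(0)
        POP, N_ANGLES, MIN_SEP = 40, 5, 20          # population size, beams per plan, adjacency limit (deg)

        def random_plan():
            while True:
                plan = np.sort(rng.integers(0, 360, N_ANGLES))
                gaps = np.diff(np.concatenate([plan, [plan[0] + 360]]))
                if np.all(gaps >= MIN_SEP):          # enforce the adjacency constraint
                    return plan

        def score(plan):                              # placeholder for the clinical plan score
            return -np.var(np.diff(np.sort(plan)))    # here: reward evenly spread beams

        def next_generation(pop, scores, mut_prob=0.1):
            order = np.argsort(scores)[::-1]
            parents = [pop[i] for i in order[:POP // 2]]
            children = []
            while len(children) < POP:
                a, b = rng.choice(len(parents), 2, replace=False)
                cut = rng.integers(1, N_ANGLES)                       # single-point crossover
                child = np.concatenate([parents[a][:cut], parents[b][cut:]])
                if rng.random() < mut_prob:                           # mutation: perturb one angle
                    child[rng.integers(N_ANGLES)] = rng.integers(0, 360)
                children.append(np.sort(child))
            return children

        pop = [random_plan() for _ in range(POP)]
        for _ in range(6):                                            # six generations, as in the study
            pop = next_generation(pop, [score(p) for p in pop])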

  4. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    Science.gov (United States)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software

  5. Luminosity distance in Swiss cheese cosmology with randomized voids. II. Magnification probability distributions

    CERN Document Server

    Flanagan, Éanna É; Wasserman, Ira; Vanderveld, R Ali

    2011-01-01

    We study the fluctuations in luminosity distances due to gravitational lensing by large scale (> 35 Mpc) structures, specifically voids and sheets. We use a simplified "Swiss cheese" model consisting of a \\Lambda -CDM Friedman-Robertson-Walker background in which a number of randomly distributed non-overlapping spherical regions are replaced by mass compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz & Wald (1998), which includes the effect of lensing shear. The standard deviation of this distribution is ~ 0.027 magnitudes and the mean is ~ 0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s=1.0, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thic...

  6. Low-energy (<100 eV) ion pitch angle distributions in the magnetosphere by ISEE 1

    Science.gov (United States)

    Nagai, T.; Johnson, J. F. E.; Chappell, C. R.

    1983-09-01

    Low-energy (<100 eV) ion data from the plasma composition experiment on ISEE 1 are examined statistically to study pitch angle distributions in all local times of the magnetosphere (L=3-10). The pitch angle distributions in the data set used here can be classified into seven types; however, there are four major types, i.e., isotropic distribution, bi-directional field-aligned distribution, unidirectional field-aligned distribution, and low flux. The isotropic distribution, which consists of very low energy (typically below 10 eV) ions, is a persistent feature of the inner region, while the bi-directional field-aligned distribution, which consists of warm ions, is a persistent feature on the outer dayside and is seen just outside the isotropic distribution region of the nightside. It is noted that the loss cone-like structure is also a common feature of this type of distribution in the noon sector. On the outer nightside the unidirectional field-aligned distribution consisting of warm ions is the dominant signature, but in some cases only the low flux (no appreciable flux) is observed. The `sources' of ions in various regions are discussed on the basis of these results and others.

  7. A near-infrared SETI experiment: probability distribution of false coincidences

    Science.gov (United States)

    Maire, Jérôme; Wright, Shelley A.; Werthimer, Dan; Treffers, Richard R.; Marcy, Geoffrey W.; Stone, Remington P. S.; Drake, Frank; Siemion, Andrew

    2014-07-01

    A Search for Extraterrestrial Intelligence (SETI), based on the possibility of interstellar communication via laser signals, is being designed to extend the search into the near-infrared spectral region (Wright et al, this conference). The dedicated near-infrared (900 to 1700 nm) instrument takes advantage of a new generation of avalanche photodiodes (APD), based on internal discrete amplification. These discrete APD (DAPD) detectors have a high speed response (laser light pulse detection in our experiment. These criteria are defined to optimize the trade between high detection efficiency and low false positive coincident signals, which can be produced by detector dark noise, background light, cosmic rays, and astronomical sources. We investigate experimentally how false coincidence rates depend on the number of detectors in parallel, and on the signal pulse height and width. We also look into the corresponding threshold to each of the signals to optimize the sensitivity while also reducing the false coincidence rates. Lastly, we discuss the analytical solution used to predict the probability of laser pulse detection with multiple detectors.

  8. Image denoising in bidimensional empirical mode decomposition domain: the role of Student's probability distribution function.

    Science.gov (United States)

    Lahmiri, Salim

    2016-03-01

    Hybridisation of the bi-dimensional empirical mode decomposition (BEMD) with denoising techniques has been proposed in the literature as an effective approach for image denoising. In this Letter, the Student's probability density function is introduced in the computation of the mean envelope of the data during the BEMD sifting process to make it robust to values that are far from the mean. The resulting BEMD is denoted tBEMD. In order to show the effectiveness of the tBEMD, several image denoising techniques in tBEMD domain are employed; namely, fourth order partial differential equation (PDE), linear complex diffusion process (LCDP), non-linear complex diffusion process (NLCDP), and the discrete wavelet transform (DWT). Two biomedical images and a standard digital image were considered for experiments. The original images were corrupted with additive Gaussian noise with three different levels. Based on peak-signal-to-noise ratio, the experimental results show that PDE, LCDP, NLCDP, and DWT all perform better in the tBEMD than in the classical BEMD domain. It is also found that tBEMD is faster than classical BEMD when the noise level is low. When it is high, the computational cost in terms of processing time is similar. The effectiveness of the presented approach makes it promising for clinical applications.

  9. How to use MATLAB to fit the ex-Gaussian and other probability functions to a distribution of response times

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2008-03-01

    Full Text Available This article discusses how to characterize response time (RT) frequency distributions in terms of probability functions and how to implement the necessary analysis tools using MATLAB. The first part of the paper discusses the general principles of maximum likelihood estimation. A detailed implementation that allows fitting the popular ex-Gaussian function is then presented followed by the results of a Monte Carlo study that shows the validity of the proposed approach. Although the main focus is the ex-Gaussian function, the general procedure described here can be used to estimate best fitting parameters of various probability functions. The proposed computational tools, written in MATLAB source code, are available through the Internet.
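
    The article's implementation is in MATLAB; for comparison, the Python sketch below fits the same ex-Gaussian model by maximum likelihood using scipy.stats.exponnorm (shape parameter K = tau/sigma). The simulated response times and parameter values are placeholders.

        # Maximum-likelihood fit of an ex-Gaussian (Gaussian + exponential) RT model.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        mu, sigma, tau = 400.0, 40.0, 100.0                       # ms, hypothetical values
        rts = rng.normal(mu, sigma, 500) + rng.exponential(tau, 500)

        K, loc, scale = stats.exponnorm.fit(rts)                  # MLE fit
        mu_hat, sigma_hat, tau_hat = loc, scale, K * scale        # back to ex-Gaussian parameters
        print(mu_hat, sigma_hat, tau_hat)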

  10. Development of probability distributions for regional climate change from uncertain global mean warming and an uncertain scaling relationship

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available To produce probability distributions for regional climate change in surface temperature and precipitation, a probability distribution for global mean temperature increase has been combined with the probability distributions for the appropriate scaling variables, i.e. the changes in regional temperature/precipitation per degree global mean warming. Each scaling variable is assumed to be normally distributed. The uncertainty of the scaling relationship arises from systematic differences between the regional changes from global and regional climate model simulations and from natural variability. The contributions of these sources of uncertainty to the total variance of the scaling variable are estimated from simulated temperature and precipitation data in a suite of regional climate model experiments conducted within the framework of the EU-funded project PRUDENCE, using an Analysis Of Variance (ANOVA). For the area covered in the 2001–2004 EU-funded project SWURVE, five case study regions (CSRs) are considered: NW England, the Rhine basin, Iberia, Jura lakes (Switzerland) and Mauvoisin dam (Switzerland). The resulting regional climate changes for 2070–2099 vary quite significantly between CSRs, between seasons and between meteorological variables. For all CSRs, the expected warming in summer is higher than that expected for the other seasons. This summer warming is accompanied by a large decrease in precipitation. The uncertainty of the scaling ratios for temperature and precipitation is relatively large in summer because of the differences between regional climate models. Differences between the spatial climate-change patterns of global climate model simulations make significant contributions to the uncertainty of the scaling ratio for temperature. However, no meaningful contribution could be found for the scaling ratio for precipitation due to the small number of global climate models in the PRUDENCE project and natural variability, which is

  11. Confidence limits with multiple channels and arbitrary probability distributions for sensitivity and expected background

    CERN Document Server

    Perrotta, A

    2002-01-01

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and expected background content are not Gaussian distributed or not small enough to apply usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branching, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron- positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example. (6 refs).

  12. Probability density functions for description of diameter distribution in thinned stands of Tectona grandis

    Directory of Open Access Journals (Sweden)

    Julianne de Castro Oliveira

    2012-06-01

    Full Text Available The objective of this study was to evaluate the effectiveness of fatigue life, Frechet, Gamma, Generalized Gamma, Generalized Logistic, Log-logistic, Nakagami, Beta, Burr, Dagum, Weibull and Hyperbolic distributions in describing diameter distribution in teak stands subjected to thinning at different ages. Data used in this study originated from 238 rectangular permanent plots 490 m2 in size, installed in stands of Tectona grandis L. f. in Mato Grosso state, Brazil. The plots were measured at ages 34, 43, 55, 68, 81, 82, 92, 104, 105, 120, 134 and 145 months on average. Thinning was done on two occasions: the first was systematic at age 81 months, with a basal area intensity of 36%, while the second was selective at age 104 months on average and removed poorer trees, reducing basal area by 30%. Fittings were assessed by the Kolmogorov-Smirnov goodness-of-fit test. The Log-logistic (3P), Burr (3P), Hyperbolic (3P), Burr (4P), Weibull (3P), Hyperbolic (2P), Fatigue Life (3P) and Nakagami functions provided more satisfactory values for the K-S test than the more commonly used Weibull function.
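
    The fit-and-test step used in the study can be sketched in Python with SciPy's three-parameter Weibull and a Kolmogorov-Smirnov check, as below; the simulated diameter sample stands in for the teak inventory data, and using parameters estimated from the same sample makes the K-S p-value only approximate.

        # Fit a three-parameter Weibull to diameters and test the fit with K-S.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        dbh = stats.weibull_min.rvs(c=2.5, loc=5.0, scale=12.0, size=400, random_state=rng)

        params = stats.weibull_min.fit(dbh)                  # (shape c, loc, scale) by MLE
        ks_stat, p_value = stats.kstest(dbh, "weibull_min", args=params)
        print(params, ks_stat, p_value)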

  13. Codon information value and codon transition-probability distributions in short-term evolution

    Science.gov (United States)

    Jiménez-Montaño, M. A.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Ramos-Fernández, A.

    2016-07-01

    To understand the way the Genetic Code and the physical-chemical properties of coded amino acids affect accepted amino acid substitutions in short-term protein evolution, taking into account only overall amino acid conservation, we consider an underlying codon-level model. This model employs codon pair-substitution frequencies from an empirical matrix in the literature, modified for single-base mutations only. Ordering the degenerated codons according to their codon information value (Volkenstein, 1979), we found that three-fold and most of four-fold degenerated codons, which have low codon values, were best fitted to rank-frequency distributions with constant failure rate (exponentials). In contrast, almost all two-fold degenerated codons, which have high codon values, were best fitted to rank-frequency distributions with variable failure rate (inverse power-laws). Six-fold degenerated codons are considered to be doubly assigned. The exceptional behavior of some codons, including non-degenerate codons, is discussed.

  14. Confidence Limits with Multiple Channels and Arbitrary Probability Distributions for Sensitivity and Expected Background

    Science.gov (United States)

    Perrotta, Andrea

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and with the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance, when different decay branchings, luminosities, or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate the systematics into the calculation of the cross-section upper limits. One of these searches will be described as an example.

  15. A Method for Justification of the View of Observables in Quantum Mechanics and Probability Distributions in Phase Space

    CERN Document Server

    Beniaminov, E M

    2001-01-01

    Some corollaries of certain hypotheses on the observation process of microphenomena are considered. We show that an enlargement of the phase space and of its motion group, together with an account of the diffusive motions of microsystems in the enlarged space, motions which act by small random translations along the enlarged group, leads to observable quantum effects. This approach makes it possible to recover probability distributions in phase space for wave functions. The parameters of the model considered here are estimated on the basis of the Lamb shift in the spectrum of the hydrogen atom.

  16. The Homotopic Probability Distribution and the Partition Function for the Entangled System Around a Ribbon Segment Chain

    Institute of Scientific and Technical Information of China (English)

    QIAN Shang-Wu; GU Zhi-Yu

    2001-01-01

    Using the Feynman path integral with topological constraints arising from the presence of one singular line, we find the homotopic probability distribution PnL for the winding number n and the partition function PL of the entangled system around a ribbon segment chain. We find that as the width 2a of the ribbon segment chain increases, the partition function decreases exponentially, whereas the free energy increases by an amount proportional to the square of the width. When the width tends to zero we obtain the same results as those of a single chain with one singular point.

  17. Exact valence bond entanglement entropy and probability distribution in the XXX spin chain and the potts model.

    Science.gov (United States)

    Jacobsen, J L; Saleur, H

    2008-02-29

    We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L >> 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2 / pi^2) ln L, disproving a recent conjecture that this should be related to the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.

  18. Families of Fokker-Planck equations and the associated entropic form for a distinct steady-state probability distribution with a known external force field.

    Science.gov (United States)

    Asgarani, Somayeh

    2015-02-01

    A method of finding entropic form for a given stationary probability distribution and specified potential field is discussed, using the steady-state Fokker-Planck equation. As examples, starting with the Boltzmann and Tsallis distribution and knowing the force field, we obtain the Boltzmann-Gibbs and Tsallis entropies. Also, the associated entropy for the gamma probability distribution is found, which seems to be in the form of the gamma function. Moreover, the related Fokker-Planck equations are given for the Boltzmann, Tsallis, and gamma probability distributions.

  19. Nonuniversal power law scaling in the probability distribution of scientific citations.

    Science.gov (United States)

    Peterson, George J; Pressé, Steve; Dill, Ken A

    2010-09-14

    We develop a model for the distribution of scientific citations. The model involves a dual mechanism: in the direct mechanism, the author of a new paper finds an old paper A and cites it. In the indirect mechanism, the author of a new paper finds an old paper A only via the reference list of a newer intermediary paper B, which has previously cited A. By comparison to citation databases, we find that papers having few citations are cited mainly by the direct mechanism. Papers already having many citations ("classics") are cited mainly by the indirect mechanism. The indirect mechanism gives a power-law tail. The "tipping point" at which a paper becomes a classic is about 25 citations for papers published in the Institute for Scientific Information (ISI) Web of Science database in 1981, 31 for Physical Review D papers published from 1975-1994, and 37 for all publications from a list of high h-index chemists assembled in 2007. The power-law exponent is not universal. Individuals who are highly cited have a systematically smaller exponent than individuals who are less cited.

  20. Structured Coupling of Probability Loss Distributions: Assessing Joint Flood Risk in Multiple River Basins.

    Science.gov (United States)

    Timonina, Anna; Hochrainer-Stigler, Stefan; Pflug, Georg; Jongman, Brenden; Rojas, Rodrigo

    2015-11-01

    Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
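
    The coupling idea can be sketched in Python as below with a Gaussian copula and lognormal marginal losses; both choices are purely illustrative, and a Gaussian copula in particular has no upper-tail dependence, so a copula family such as Gumbel would be the natural choice for the tail-dependent coupling the authors emphasize.

        # Couple two basin loss distributions through a copula: correlated uniforms
        # from a Gaussian copula are mapped through the marginal inverse CDFs.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        rho, n = 0.7, 100_000                                    # assumed inter-basin dependence

        z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
        u = stats.norm.cdf(z)                                    # copula sample on the unit square

        loss_a = stats.lognorm.ppf(u[:, 0], s=1.0, scale=50.0)   # basin A marginal losses
        loss_b = stats.lognorm.ppf(u[:, 1], s=1.2, scale=30.0)   # basin B marginal losses
        total = loss_a + loss_b

        print(np.quantile(total, 0.99))                          # joint 1-in-100 loss estimate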

  1. Nonuniversal power law scaling in the probability distribution of scientific citations

    CERN Document Server

    Peterson, G J; Dill, K A; 10.1073/pnas.1010757107

    2010-01-01

    We develop a model for the distribution of scientific citations. The model involves a dual mechanism: in the direct mechanism, the author of a new paper finds an old paper A and cites it. In the indirect mechanism, the author of a new paper finds an old paper A only via the reference list of a newer intermediary paper B, which has previously cited A. By comparison to citation databases, we find that papers having few citations are cited mainly by the direct mechanism. Papers already having many citations (`classics') are cited mainly by the indirect mechanism. The indirect mechanism gives a power-law tail. The `tipping point' at which a paper becomes a classic is about 25 citations for papers published in the Institute for Scientific Information (ISI) Web of Science database in 1981, 31 for Physical Review D papers published from 1975-1994, and 37 for all publications from a list of high h-index chemists assembled in 2007. The power-law exponent is not universal. Individuals who are highly cited have a system...

  2. Understanding star formation in molecular clouds III. Probability distribution functions of molecular lines in Cygnus X

    CERN Document Server

    Schneider, N; Motte, F; Ossenkopf, V; Klessen, R S; Simon, R; Fechtenbaum, S; Herpin, F; Tremblin, P; Csengeri, T; Myers, P C; Hill, T; Cunningham, M; Federrath, C

    2015-01-01

    Column density (N) PDFs serve as a powerful tool to characterize the physical processes that influence the structure of molecular clouds. Star-forming clouds can best be characterized by lognormal PDFs for the lower N range and a power-law tail for higher N, commonly attributed to turbulence and self-gravity and/or pressure, respectively. We report here on PDFs obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region and compare to a PDF derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av~1-30, but is cut for higher Av due to optical depth effects. The PDFs of C18O and 13CO are mostly lognormal for Av~1-15, followed by an excess up to Av~40. Above that value, all CO PDFs drop, most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av~15 and 400, respectively. The PDF from dust is lognormal for Av~2-15 and has a power-law tail up to Av~500. Absolute values for the molecular lin...

  3. Computation of steady-state probability distributions in stochastic models of cellular networks.

    Directory of Open Access Journals (Sweden)

    Mark Hallen

    2011-10-01

    Full Text Available Cellular processes are "noisy". In each cell, concentrations of molecules are subject to random fluctuations due to the small numbers of these molecules and to environmental perturbations. While noise varies with time, it is often measured at steady state, for example by flow cytometry. When interrogating aspects of a cellular network by such steady-state measurements of network components, a key need is to develop efficient methods to simulate and compute these distributions. We describe innovations in stochastic modeling coupled with approaches to this computational challenge: first, an approach to modeling intrinsic noise via solution of the chemical master equation, and second, a convolution technique to account for contributions of extrinsic noise. We show how these techniques can be combined in a streamlined procedure for evaluation of different sources of variability in a biochemical network. Evaluation and illustrations are given in analysis of two well-characterized synthetic gene circuits, as well as a signaling network underlying the mammalian cell cycle entry.
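    As a minimal sketch of the two computational ingredients mentioned above (a steady-state solution of a chemical master equation for intrinsic noise, followed by a mixture/convolution step for extrinsic noise), the example below treats a simple birth-death gene-expression model; the rate constants and the lognormal spread of the production rate are illustrative assumptions, not parameters of the synthetic circuits analyzed in the paper.

      import numpy as np

      def cme_steady_state(k, g, N=300):
          """Steady state of the birth-death master equation
             dp_n/dt = k p_{n-1} + g (n+1) p_{n+1} - (k + g n) p_n
             on the truncated state space {0, ..., N}."""
          A = np.zeros((N + 1, N + 1))
          for n in range(N + 1):
              if n > 0:
                  A[n, n - 1] = k                 # production jump n-1 -> n
              if n < N:
                  A[n, n + 1] = g * (n + 1)       # degradation jump n+1 -> n
              A[n, n] = -(k * (n < N) + g * n)    # total outflow from state n
          M = np.vstack([A, np.ones(N + 1)])      # append normalization sum(p) = 1
          b = np.zeros(N + 2); b[-1] = 1.0
          p, *_ = np.linalg.lstsq(M, b, rcond=None)
          return np.clip(p, 0, None) / p.sum()

      # Intrinsic-only distribution at nominal (illustrative) rates
      p_intrinsic = cme_steady_state(k=20.0, g=1.0)

      # Extrinsic noise: mix steady states over an assumed lognormal spread of the
      # production rate (a simple stand-in for the convolution step described above).
      k_samples = np.random.default_rng(0).lognormal(mean=np.log(20.0), sigma=0.3, size=50)
      p_total = np.mean([cme_steady_state(k=ki, g=1.0) for ki in k_samples], axis=0)

      n = np.arange(p_total.size)
      print("intrinsic Fano factor :", (n**2 @ p_intrinsic - (n @ p_intrinsic)**2) / (n @ p_intrinsic))
      print("with extrinsic noise  :", (n**2 @ p_total - (n @ p_total)**2) / (n @ p_total))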

  4. Systematic Study of Rogue Wave Probability Distributions in a Fourth-Order Nonlinear Schrödinger Equation

    CERN Document Server

    Ying, L H

    2012-01-01

    Nonlinear instability and refraction by ocean currents are both important mechanisms that go beyond the Rayleigh approximation and may be responsible for the formation of freak waves. In this paper, we quantitatively study nonlinear effects on the evolution of surface gravity waves on the ocean, to explore systematically the effects of various input parameters on the probability of freak wave formation. The fourth-order current-modified nonlinear Schrödinger equation (CNLS4) is employed to describe the wave evolution. By solving CNLS4 numerically, we are able to obtain quantitative predictions for the wave height distribution as a function of key environmental conditions such as average steepness, angular spread, and frequency spread of the local sea state. Additionally, we explore the spatial dependence of the wave height distribution, associated with the buildup of nonlinear development.

  5. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome.

  6. Molecular theory for the phase equilibria and cluster distribution of associating fluids with small bond angles.

    Science.gov (United States)

    Marshall, Bennett D; Chapman, Walter G

    2013-08-07

    We develop a new theory for associating fluids with multiple association sites. The theory accounts for small bond angle effects such as steric hindrance, ring formation, and double bonding. The theory is validated against Monte Carlo simulations for the case of a fluid of patchy colloid particles with three patches and is found to be very accurate. Once validated, the theory is applied to study the phase diagram of a fluid composed of three patch colloids. It is found that bond angle has a significant effect on the phase diagram and the very existence of a liquid-vapor transition.

  7. Large Quantum Probability Backflow and the Azimuthal Angle-Angular Momentum Uncertainty Relation for an Electron in a Constant Magnetic Field

    Science.gov (United States)

    Strange, P.

    2012-01-01

    In this paper we demonstrate a surprising aspect of quantum mechanics that is accessible to an undergraduate student. We discuss probability backflow for an electron in a constant magnetic field. It is shown that even for a wavepacket composed entirely of states with negative angular momentum the effective angular momentum can take on positive…

  8. Signatures of the various regions of the outer magnetosphere in the pitch angle distributions of energetic particles

    Energy Technology Data Exchange (ETDEWEB)

    West, H.I. Jr.

    1978-12-11

    An account is given of the observations of the pitch angle distributions of energetic particles in the near equatorial regions of the Earth's magnetosphere. The emphasis is on relating the observed distributions to the field configuration responsible for the observed effects. The observed effects relate to drift-shell splitting, to the breakdown of adiabatic guiding center motion in regions of sharp field curvature relative to particle gyro radii, to wave-particle interactions, and to moving field configurations. 39 references.

  9. Fast Hadamard transforms for compressive sensing of joint systems: measurement of a 3.2 million-dimensional bi-photon probability distribution.

    Science.gov (United States)

    Lum, Daniel J; Knarr, Samuel H; Howell, John C

    2015-10-19

    We demonstrate how to efficiently implement extremely high-dimensional compressive imaging of a bi-photon probability distribution. Our method uses fast-Hadamard-transform Kronecker-based compressive sensing to acquire the joint space distribution. We list, in detail, the operations necessary to enable fast-transform-based matrix-vector operations in the joint space to reconstruct a 16.8 million-dimensional image in less than 10 minutes. Within a subspace of that image exists a 3.2 million-dimensional bi-photon probability distribution. In addition, we demonstrate how the marginal distributions can aid in the accuracy of joint space distribution reconstructions.
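    A minimal sketch of the matrix-free trick the abstract relies on: applying a Kronecker product of Hadamard matrices, (H_m ⊗ H_n) x, by reshaping x into an m x n array and running a fast Walsh-Hadamard transform along each axis, so the huge joint-space matrix is never formed. The sizes below are tiny and purely illustrative; the joint space in the paper has millions of dimensions.

      import numpy as np
      from scipy.linalg import hadamard

      def fwht(x):
          """Fast Walsh-Hadamard transform along the last axis (length a power of two);
             O(n log n) instead of the O(n^2) of an explicit matrix multiply."""
          x = np.asarray(x, dtype=float).copy()
          n = x.shape[-1]
          h = 1
          while h < n:
              for i in range(0, n, 2 * h):
                  a = x[..., i:i + h].copy()
                  b = x[..., i + h:i + 2 * h].copy()
                  x[..., i:i + h] = a + b
                  x[..., i + h:i + 2 * h] = a - b
              h *= 2
          return x

      def kron_hadamard_apply(X):
          """Apply (H_m ⊗ H_n) to the row-major flattening of an m x n array X
             without ever forming the (m*n) x (m*n) Kronecker matrix."""
          return fwht(fwht(X).T).T      # right-multiply by H_n, then left-multiply by H_m

      # Tiny consistency check against the explicit Kronecker product.
      m, n = 4, 8
      X = np.random.default_rng(0).normal(size=(m, n))
      explicit = np.kron(hadamard(m), hadamard(n)) @ X.ravel()
      fast = kron_hadamard_apply(X).ravel()
      print("fast Kronecker-Hadamard matches explicit:", np.allclose(explicit, fast))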

  10. Magnetization curves and probability angular distribution of the magnetization vector in Er2Fe14Si3

    Science.gov (United States)

    Sobh, Hala A.; Aly, Samy H.; Shabara, Reham M.; Yehia, Sherif

    2016-01-01

    Specific magnetic and magneto-thermal properties of Er2Fe14Si3, in the temperature range of 80-300 K, have been investigated using basic laws of classical statistical mechanics in a simple model. In this model, the constructed partition function was used to derive, and then calculate, the temperature and/or field dependence of a host of physical properties. Examples of these properties are: the magnetization, magnetic heat capacity, magnetic susceptibility, probability angular distribution of the magnetization vector, and the associated angular dependence of energy. We highlight a correlation between the energy of the system, its magnetization behavior and the angular location of the magnetization vector. Our results show that Er2Fe14Si3 is an easy-axis system in the temperature range 80-114 K, but switches to an easy-plane system at T≥114 K. This transition is also supported by both the temperature dependence of the magnetic heat capacity, which develops a peak at ~114 K, and the probability landscape, which shows, in zero magnetic field, a prominent peak in the basal plane at T=113.5 K.
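    A minimal numerical sketch of the classical-statistics recipe summarized above: build a partition function over the polar angle of the magnetization vector for an assumed anisotropy-plus-Zeeman energy and obtain from it the angular probability distribution and the thermal average of the magnetization. The energy expression E(θ) = K1 sin²θ − μH cosθ and the constants below are generic placeholders, not the Er2Fe14Si3 parameters of the study.

      import numpy as np

      kB = 1.380649e-23                      # Boltzmann constant, J/K

      def angular_distribution(T, K1, muH, n=2000):
          """Angular probability P(theta) of the magnetization vector for an assumed
             energy E(theta) = K1 sin^2(theta) - muH cos(theta)  (both in joules),
             built from the classical partition function Z = sum_theta sin(theta) e^{-E/kT}."""
          theta = np.linspace(0.0, np.pi, n)
          dtheta = theta[1] - theta[0]
          E = K1 * np.sin(theta) ** 2 - muH * np.cos(theta)
          w = np.sin(theta) * np.exp(-(E - E.min()) / (kB * T))   # Boltzmann weight x solid-angle factor
          Z = (w * dtheta).sum()                                   # partition function
          return theta, w / Z

      # Illustrative (not Er2Fe14Si3) anisotropy and Zeeman energy scales
      theta, P = angular_distribution(T=100.0, K1=3e-21, muH=1e-21)
      dtheta = theta[1] - theta[0]
      mz = (np.cos(theta) * P * dtheta).sum()          # reduced magnetization <cos theta>
      print("most probable angle (deg):", round(np.degrees(theta[np.argmax(P)]), 1))
      print("<cos theta> =", round(mz, 3))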

  11. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
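    A minimal sketch of binomial-model scoring of a peptide-spectrum match in the spirit of the record above: if each of n predicted fragment peaks matches an experimental peak by chance with probability p, the significance of observing k matches is the binomial survival probability. The numbers are illustrative, and the actual ProVerB scoring function additionally exploits peak intensities.

      import math
      from scipy.stats import binom

      def match_score(n_predicted, k_matched, p_background):
          """-log10 p-value of observing at least k_matched fragment-peak matches
             out of n_predicted under a binomial null with match probability p_background."""
          p_value = binom.sf(k_matched - 1, n_predicted, p_background)   # P(X >= k_matched)
          return -math.log10(max(p_value, 1e-300))

      # Illustrative numbers: 20 predicted fragment ions, 12 matched, 10% random-match rate
      print(round(match_score(20, 12, 0.10), 2))   # large score -> unlikely to be chance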

  12. Comparison between the probability distribution of returns in the Heston model and empirical data for stock indexes

    Science.gov (United States)

    Silva, A. Christian; Yakovenko, Victor M.

    2003-06-01

    We compare the probability distribution of returns for the three major stock-market indexes (Nasdaq, S&P500, and Dow-Jones) with an analytical formula recently derived by Drăgulescu and Yakovenko for the Heston model with stochastic variance. For the period of 1982-1999, we find very good agreement between the theory and the data for a wide range of time lags from 1 to 250 days. On the other hand, deviations start to appear when the data for 2000-2002 are included. We interpret this as statistical evidence of the major change in the market from a positive growth rate in the 1980s and 1990s to a negative rate in the 2000s.

  13. Influence of Coloured Correlated Noises on Probability Distribution and Mean of Tumour Cell Number in the Logistic Growth Model

    Institute of Scientific and Technical Information of China (English)

    HAN Li-Bo; GONG Xiao-Long; CAO Li; WU Da-Jin

    2007-01-01

    An approximate Fokker-Planck equation for the logistic growth model driven by coloured correlated noises is derived by applying the Novikov theorem and the Fox approximation. The steady-state probability distribution (SPD) and the mean of the tumour cell number are analysed. It is found that the SPD has a single-extremum configuration when the degree of correlation between the multiplicative and additive noises, λ, lies in −1 < λ ≤ 0, and can have a double-extrema configuration for 0 < λ < 1. A configuration transition occurs because of the variation of the noise parameters. A minimum appears in the curve of the mean of the steady-state tumour cell number, 〈x〉, versus λ. The position and the value of the minimum are controlled by the noise correlation times.

  14. Autoregressive processes with exponentially decaying probability distribution functions: applications to daily variations of a stock market index.

    Science.gov (United States)

    Porto, Markus; Roman, H Eduardo

    2002-04-01

    We consider autoregressive conditional heteroskedasticity (ARCH) processes in which the variance σ²(y) depends linearly on the absolute value of the random variable y, i.e. σ²(y) = a + b|y|. While for the standard model, where σ²(y) = a + b y², the corresponding probability distribution function (PDF) P(y) decays as a power law for |y| → ∞, in the linear case it decays exponentially, P(y) ≈ exp(-α|y|), with α = 2/b. We extend these results to the more general case σ²(y) = a + b|y|^q, with 0 < q [...]. When the history of the ARCH process is taken into account, the resulting PDF becomes a stretched exponential even for q = 1, with a stretched exponent β = 2/3, in much better agreement with the empirical data.
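    A minimal simulation sketch of the linear-variance ARCH process discussed above, σ²(y) = a + b|y|, checking that the tail of the stationary distribution falls off roughly exponentially at the quoted rate α = 2/b; the values of a and b are arbitrary illustrative choices.

      import numpy as np

      rng = np.random.default_rng(0)
      a, b, n = 1.0, 2.0, 500_000          # illustrative parameters; alpha = 2/b = 1 predicted

      y = np.empty(n)
      y[0] = 0.0
      eps = rng.standard_normal(n)
      for t in range(1, n):
          sigma2 = a + b * abs(y[t - 1])   # linear (not quadratic) dependence on |y|
          y[t] = np.sqrt(sigma2) * eps[t]

      # The log-histogram of |y| should fall off roughly linearly, i.e. P(y) ~ exp(-alpha |y|),
      # in contrast to the power-law tail of the standard quadratic ARCH model.
      counts, edges = np.histogram(np.abs(y), bins=50, range=(0.0, 12.0), density=True)
      centers = 0.5 * (edges[1:] + edges[:-1])
      tail = (centers > 3.0) & (counts > 0)
      slope = np.polyfit(centers[tail], np.log(counts[tail]), 1)[0]
      print("fitted tail slope:", round(slope, 2), "   predicted -2/b =", -2.0 / b)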

  15. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  16. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life...

  17. A forward-angle-scattering method for the determination of optical constants and particle size distribution by collimated laser irradiation

    Science.gov (United States)

    Ren, Yatao; Qi, Hong; Yu, Xiaoying; Ruan, Liming

    2017-04-01

    This study examined the feasibility of using a secondary optimization technique and forward-angle-scattering method to retrieve optical constants (or complex refractive indices) and particle size distribution (PSD) simultaneously. In this work, two continuous wave lasers of different wavelengths were applied to irradiate the participating samples, and the scattered light of samples with different acceptance angles was obtained. First, the scattered signals within different acceptance angles were calculated by solving the radiative transfer equation. Then, the complex refractive index and PSD were retrieved simultaneously by applying quantum particle swarm optimization. However, the estimated results of PSD were inaccurate. Thus, a secondary optimization, using the directional radiative intensity as input, was performed to improve the accuracy of PSD based on the first optimization process. Four commonly used kinds of monomodal PSD functions, i.e., the Rosin-Rammler, standard Normal, Logarithmic Normal, and Junge distribution, were retrieved. All results showed that the proposed technique can estimate the complex refractive index and PSD accurately.

  18. Critical Finite Size Scaling Relation of the Order-Parameter Probability Distribution for the Three-Dimensional Ising Model on the Creutz Cellular Automaton

    Institute of Scientific and Technical Information of China (English)

    B. Kutlu; M. Civi

    2006-01-01

    We study the order parameter probability distribution at the critical point for the three-dimensional spin-1/2 and spin-1 Ising models on the simple cubic lattice under periodic boundary conditions.

  19. Finite Element Analysis of the Effect of Superstructure Materials and Loading Angle on Stress Distribution around the Implant

    Directory of Open Access Journals (Sweden)

    Jafari K

    2014-12-01

    Full Text Available Statement of Problem: A general process in implant design is to determine the cause of possible problems and to find the relevant solutions. The success of the implant depends on the control technique of implant biomechanical conditions. Objectives: The goal of this study was to evaluate the influence of both abutment and framework materials on the stress of the bone around the implant by using three-dimensional finite element analysis. Materials and Methods: A three-dimensional model of a patient’s premaxillary bone was fabricated using Cone Beam Computed Tomography (CBCT). Then, three types of abutment from gold, nickel-chromium and zirconia and also three types of crown frame from silver-palladium, nickel-chromium and zirconia were designed. Finally, a 178 N force at angles of zero, 30 and 45 degrees was exerted on the implant axis and the maximum stress and strain in the trabecular, cortical bones and cement was calculated. Results: With changes of the materials and mechanical properties of abutment and frame, little difference was observed in the level and distribution pattern of stress. The stress level increased as the angle of force application increased. The highest stress concentration was related to the force at the angle of 45 degrees. The results of the cement analysis showed an inverse relationship between the elastic modulus of the frame material and the maximum stress in the cement. Conclusions: The impact of the angle at which the force was applied was more significant in stress distribution than that of abutment and framework core materials.

  20. Distribution of Sulfur in Carbon/Sulfur Nanocomposites Analyzed by Small-Angle X-ray Scattering.

    Science.gov (United States)

    Petzold, Albrecht; Juhl, Anika; Scholz, Jonas; Ufer, Boris; Goerigk, Günter; Fröba, Michael; Ballauff, Matthias; Mascotto, Simone

    2016-03-22

    The analysis of sulfur distribution in porous carbon/sulfur nanocomposites using small-angle X-ray scattering (SAXS) is presented. Ordered porous CMK-8 carbon was used as the host matrix and gradually filled with sulfur (20-50 wt %) via melt impregnation. Owing to the almost complete match between the electron densities of carbon and sulfur, the porous nanocomposites present in essence a two-phase system and the filling of the host material can be precisely followed by this method. The absolute scattering intensities normalized per unit of mass were corrected accounting for the scattering contribution of the turbostratic microstructure of carbon and amorphous sulfur. The analysis using the Porod parameter and the chord-length distribution (CLD) approach determined the specific surface areas and filling mechanism of the nanocomposite materials, respectively. Thus, SAXS provides comprehensive characterization of the sulfur distribution in porous carbon and valuable information for a deeper understanding of cathode materials of lithium-sulfur batteries.

  1. IGM Constraints from the SDSS-III/BOSS DR9 Ly-alpha Forest Flux Probability Distribution Function

    CERN Document Server

    Lee, Khee-Gan; Spergel, David N; Weinberg, David H; Hogg, David W; Viel, Matteo; Bolton, James S; Bailey, Stephen; Pieri, Matthew M; Carithers, William; Schlegel, David J; Lundgren, Britt; Palanque-Delabrouille, Nathalie; Suzuki, Nao; Schneider, Donald P; Yeche, Christophe

    2014-01-01

    The Ly$\\alpha$ forest flux probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the flux PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from SDSS Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS flux PDFs, measured at $\\langle z \\rangle = [2.3,2.6,3.0]$, are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, $\\gamma$, and temperature at mean-density, $T_0$, where $T(\\Delta) = T_0 \\Delta^{\\gamma-1}$. We find that a significant population of partial Lyman-limit systems with a column-density distribution slope of $\\beta_\\mathrm{pLLS} \\sim -2$ are required to explain the data at the low-flux end of flux PDF, while uncertainties in the mean Ly$\\alpha$ forest transmission affect the...

  2. Energy distributions of plume ions from silver at different angles ablated in vacuum

    DEFF Research Database (Denmark)

    Christensen, Bo Toftmann; Schou, Jørgen; Canulescu, Stela

    be comparatively difficult to measure the energy and angular distribution of neutrals, measurements of the ionic fraction will be valuable for any modeling of PLD. We have irradiated silver in a vacuum chamber (~ 10-7 mbar) with a Nd:YAG laser at a wavelength of 355 nm and made detailed measurements of the time...

  3. Angle-resolved energy distributions of laser ablated silver ions in vacuum

    DEFF Research Database (Denmark)

    Hansen, T.N.; Schou, Jørgen; Lunney, J.G.

    1998-01-01

    The energy distributions of ions ablated from silver in vacuum have been measured in situ for pulsed laser irradiation at 355 nm. We have determined the energy spectra for directions ranging from 5 degrees to 75 degrees with respect to the normal in the intensity range from 100 to 400 MW/cm²...

  4. IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Khee-Gan; Hennawi, Joseph F. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Spergel, David N. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Weinberg, David H. [Department of Astronomy and Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, Meyer Hall of Physics, New York, NY 10003 (United States); Viel, Matteo [INAF, Osservatorio Astronomico di Trieste, Via G. B. Tiepolo 11, I-34131 Trieste (Italy); Bolton, James S. [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Bailey, Stephen; Carithers, William; Schlegel, David J. [E.O. Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Pieri, Matthew M. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth PO1 3FX (United Kingdom); Lundgren, Britt [Department of Astronomy, University of Wisconsin, Madison, WI 53706 (United States); Palanque-Delabrouille, Nathalie; Yèche, Christophe [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); Suzuki, Nao [Kavli Institute for the Physics and Mathematics of the Universe (IPMU), The University of Tokyo, Kashiwano-ha 5-1-5, Kashiwa-shi, Chiba (Japan); Schneider, Donald P., E-mail: lee@mpia.de [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2015-02-01

    The Lyα forest transmission probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T₀, where T(Δ) = T₀Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ∼ −2 are required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T₀ are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.

  5. CMB lensing beyond the power spectrum: Cosmological constraints from the one-point probability distribution function and peak counts

    Science.gov (United States)

    Liu, Jia; Hill, J. Colin; Sherwin, Blake D.; Petri, Andrea; Böhm, Vanessa; Haiman, Zoltán

    2016-11-01

    Unprecedentedly precise cosmic microwave background (CMB) data are expected from ongoing and near-future CMB stage III and IV surveys, which will yield reconstructed CMB lensing maps with effective resolution approaching several arcminutes. The small-scale CMB lensing fluctuations receive non-negligible contributions from nonlinear structure in the late-time density field. These fluctuations are not fully characterized by traditional two-point statistics, such as the power spectrum. Here, we use N -body ray-tracing simulations of CMB lensing maps to examine two higher-order statistics: the lensing convergence one-point probability distribution function (PDF) and peak counts. We show that these statistics contain significant information not captured by the two-point function and provide specific forecasts for the ongoing stage III Advanced Atacama Cosmology Telescope (AdvACT) experiment. Considering only the temperature-based reconstruction estimator, we forecast 9 σ (PDF) and 6 σ (peaks) detections of these statistics with AdvACT. Our simulation pipeline fully accounts for the non-Gaussianity of the lensing reconstruction noise, which is significant and cannot be neglected. Combining the power spectrum, PDF, and peak counts for AdvACT will tighten cosmological constraints in the Ωm-σ8 plane by ≈30 %, compared to using the power spectrum alone.
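    A minimal sketch of the two higher-order statistics used above, computed from any pixelized convergence map: the one-point PDF is a normalized histogram of the map values, and peak counts are local maxima binned by height. The Gaussian random map generated here is only a stand-in for the ray-traced lensing reconstructions of the study.

      import numpy as np
      from scipy.ndimage import maximum_filter

      rng = np.random.default_rng(0)
      kappa = rng.normal(0.0, 0.02, size=(512, 512))     # stand-in convergence map

      # One-point PDF of the convergence
      bins = np.linspace(-0.1, 0.1, 41)
      pdf, _ = np.histogram(kappa, bins=bins, density=True)

      # Peak counts: pixels that are local maxima in a 3x3 neighbourhood, binned by height
      is_peak = kappa == maximum_filter(kappa, size=3)
      peak_heights = kappa[is_peak]
      peak_counts, _ = np.histogram(peak_heights, bins=bins)

      print("number of peaks:", int(is_peak.sum()))
      print("highest peak   :", float(peak_heights.max()))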

  6. Reconstruction of energy and angle distribution function of surface-emitted negative ions in hydrogen plasmas using mass spectrometry

    Science.gov (United States)

    Kogut, D.; Achkasov, K.; Dubois, J. P. J.; Moussaoui, R.; Faure, J. B.; Layet, J. M.; Simonin, A.; Cartry, G.

    2017-04-01

    A new method involving mass spectrometry and modeling is described in this work, which may highlight the production mechanisms of negative ions (NIs) on surfaces in low-pressure plasmas. Positive hydrogen ions from the plasma impact a sample which is biased negatively with respect to the plasma potential. NIs are produced on the surface through the ionization of sputtered and backscattered particles and detected according to their energy and mass by a mass spectrometer (MS) placed in front of the sample. The shape of the measured negative-ion energy distribution function (NIEDF) strongly differs from the NIEDF of the ions emitted by the sample because of the limited acceptance angle of the MS. The reconstruction method proposed here allows computing the distribution function in energy and angle (NIEADF) of the NIs emitted by the sample based on the NIEDF measurements at different tilt angles of the sample. The reconstruction algorithm does not depend on the NI surface production mechanism, so it can be applied to any type of surface and/or NI. The NIEADFs for highly oriented pyrolytic graphite (HOPG) and gadolinium (low work-function metal) are presented and compared with the SRIM modeling. HOPG and Gd show comparable integrated NI yields; however, key differences in the mechanisms of NI production can be identified. While for Gd the major process is backscattering of ions with the peak of the NIEDF at 36 eV, in the case of HOPG the sputtering contribution due to adsorbed H on the surface is also important and the NIEDF peak is found at 5 eV.

  7. Remote Sensing of Spatial Distributions of Greenhouse Gases in the Los Angeles Basin

    Science.gov (United States)

    Fu, Dejian; Pongetti, Thomas J.; Sander, Stanley P.; Cheung, Ross; Stutz, Jochen; Park, Chang Hyoun; Li, Qinbin

    2011-01-01

    The Los Angeles air basin is a significant anthropogenic source of greenhouse gases and pollutants including CO2, CH4, N2O, and CO, contributing significantly to regional and global climate change. Recent legislation in California, the California Global Warming Solutions Act (AB32), established a statewide cap for greenhouse gas emissions for 2020 based on 1990 emissions. Verifying the effectiveness of regional greenhouse gas emissions controls requires high-precision, regional-scale measurement methods combined with models that capture the principal anthropogenic and biogenic sources and sinks. We present a novel approach for monitoring the spatial distributions of greenhouse gases in the Los Angeles basin using high resolution remote sensing spectroscopy. We participated in the CalNex 2010 campaign to provide greenhouse gas distributions for comparison between top-down and bottom-up emission estimates.

  8. Dependence of Spiral Galaxy Distribution on Viewing Angle in RC3

    Institute of Scientific and Technical Information of China (English)

    MA Jun; SONG Guo-Xuan; SHU Cheng-Gang

    2000-01-01

    The normalized inclination distributions are presented for the spiral galaxies in RC3. The results show that, except for the bin of 81°-90°, in which the apparent minor isophotal diameters used to obtain the inclinations are affected by the central bulges, the distributions for Sa, Sab, Scd and Sd are consistent with the Monte-Carlo simulation of random inclinations within 3σ; Sb and Sbc are nearly consistent, but Sc is not. One reason for the difference between the observed distribution and the Monte-Carlo simulation for Sc may be that some highly inclined spirals, whose arms are inherently loosely wound on the galactic plane and should be classified as Sc galaxies, have been incorrectly assigned to earlier types, because the tightness of the spiral arms, one of the criteria of the Hubble classification in RC3, differs between the galactic plane and the tangent plane of the celestial sphere. Our result also implies that there might exist biases in the luminosity functions of individual Hubble types if spiral galaxies are classified only visually.

  9. Counterion distribution surrounding spherical nucleic acid-Au nanoparticle conjugates probed by small-angle x-ray scattering.

    Science.gov (United States)

    Kewalramani, Sumit; Zwanikken, Jos W; Macfarlane, Robert J; Leung, Cheuk-Yui; Olvera de la Cruz, Monica; Mirkin, Chad A; Bedzyk, Michael J

    2013-12-23

    The radial distribution of monovalent cations surrounding spherical nucleic acid-Au nanoparticle conjugates (SNA-AuNPs) is determined by in situ small-angle x-ray scattering (SAXS) and classical density functional theory (DFT) calculations. Small differences in SAXS intensity profiles from SNA-AuNPs dispersed in a series of solutions containing different monovalent ions (Na(+), K(+), Rb(+), or Cs(+)) are measured. Using the "heavy ion replacement" SAXS (HIRSAXS) approach, we extract the cation-distribution-dependent contribution to the SAXS intensity and show that it agrees with DFT predictions. The experiment-theory comparisons reveal the radial distribution of cations as well as the conformation of the DNA in the SNA shell. The analysis shows an enhancement to the average cation concentration in the SNA shell that can be up to 15-fold, depending on the bulk solution ionic concentration. The study demonstrates the feasibility of HIRSAXS in probing the distribution of monovalent cations surrounding nanoparticles with an electron dense core (e.g., metals).

  10. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  11. [Measurement of atmosphere NO2 amounts and angle spacial distribution using zenith-light spectra and sky-light spectra].

    Science.gov (United States)

    Zhao, Xiao-Yan; Yang, Jing-Guo; Gong, Min; He, Jie; Cao, Ting-Ting; Liang, Hui-Min; Sun, Peng

    2009-07-01

    A novel approach to retrieving the atmospheric NO2 slant column density is described, in which the sunlight scattered in the zenith direction and the skylight are used as the light sources. The slant column densities for the same azimuth but different obliquities, which are between 0.5×10¹⁶ and 11×10¹⁶ molecule cm⁻² for angles from 85 degrees to 10 degrees, as well as those for the same obliquity but different azimuths, which are between 10¹⁶ and 10¹⁷ molecule cm⁻², were calculated. The study indicates that the results correlate well with the real atmospheric conditions. The angular spatial distribution is reflected by the differences of the NO2 slant column density between azimuths and obliquities. The reference spectrum and the sample spectrum were collected with the same instrument at the same time, so the measurement accuracy is improved. The method is suited not only to real-time monitoring of the NO2 content in an arbitrary direction, especially for near-ground NO2 pollution emergencies, but also to overcast and rainy areas where it is difficult to collect a good direct solar spectrum.

  12. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    (*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  13. Asymmetric distribution of cone-shaped lipids in a highly curved bilayer revealed by a small angle neutron scattering technique

    Energy Technology Data Exchange (ETDEWEB)

    Sakuma, Y; Imai, M [Department of Physics, Ochanomizu University, Bunkyo, Tokyo 112-8610 (Japan); Urakami, N [Department of Physics and Information Sciences, Yamaguchi University, Yamaguchi 753-8512 (Japan); Taniguchi, T, E-mail: imai@phys.ocha.ac.jp [Department of Chemical Engineering, Kyoto University, Kyoto 606-8510 (Japan)

    2011-07-20

    We have investigated the lipid sorting in a binary small unilamellar vesicle (SUV) composed of cone-shaped (1,2-dihexanoyl-sn-glycero-3-phosphocholine: DHPC) and cylinder-shaped (1,2-dipalmitoyl-sn-glycero-3-phosphocholine: DPPC) lipids. In order to reveal the lipid sorting we adopted a contrast matching technique of small angle neutron scattering (SANS), which extracts the distribution of deuterated lipids in the bilayer quantitatively without steric modification of lipids as in fluorescence probe techniques. First the SANS profile of protonated SUVs at a film contrast condition showed that SUVs have a spherical shape with an inner radius of 190 Å and a bilayer thickness of 40 Å. The SANS profile of deuterated SUVs at a contrast matching condition showed a characteristic scattering profile, indicating an asymmetric distribution of cone-shaped lipids in the bilayer. The characteristic profile was described well by a spherical bilayer model. The fitting revealed that most DHPC molecules are localized in the outer leaflet. Thus the shape of the lipid is strongly coupled with the membrane curvature. We compared the obtained asymmetric distribution of the cone-shaped lipids in the bilayer with the theoretical prediction based on the curvature energy model.

  14. Asymmetric distribution of cone-shaped lipids in a highly curved bilayer revealed by a small angle neutron scattering technique

    Science.gov (United States)

    Sakuma, Y.; Urakami, N.; Taniguchi, T.; Imai, M.

    2011-07-01

    We have investigated the lipid sorting in a binary small unilamellar vesicle (SUV) composed of cone-shaped (1,2-dihexanoyl-sn-glycero-3-phosphocholine: DHPC) and cylinder-shaped (1,2-dipalmitoyl-sn-glycero-3-phosphocholine: DPPC) lipids. In order to reveal the lipid sorting we adopted a contrast matching technique of small angle neutron scattering (SANS), which extracts the distribution of deuterated lipids in the bilayer quantitatively without steric modification of lipids as in fluorescence probe techniques. First the SANS profile of protonated SUVs at a film contrast condition showed that SUVs have a spherical shape with an inner radius of 190 Å and a bilayer thickness of 40 Å. The SANS profile of deuterated SUVs at a contrast matching condition showed a characteristic scattering profile, indicating an asymmetric distribution of cone-shaped lipids in the bilayer. The characteristic profile was described well by a spherical bilayer model. The fitting revealed that most DHPC molecules are localized in the outer leaflet. Thus the shape of the lipid is strongly coupled with the membrane curvature. We compared the obtained asymmetric distribution of the cone-shaped lipids in the bilayer with the theoretical prediction based on the curvature energy model.

  15. Control of the Diameter and Chiral Angle Distributions during Production of Single-Wall Carbon Nanotubes

    Science.gov (United States)

    Nikolaev, Pavel

    2009-01-01

    Many applications of single wall carbon nanotubes (SWCNT), especially in microelectronics, will benefit from use of certain (n,m) nanotube types (metallic, small-gap semiconductor, etc.). Especially fascinating is the possibility of quantum conductors that require metallic armchair nanotubes. However, as-produced SWCNT samples are polydisperse, with many (n,m) types present and a typical approx. 1:2 metal/semiconductor ratio. Nanotube nucleation models predict that armchair nuclei are energetically preferential due to formation of partial triple bonds along the armchair edge. However, nuclei cannot reach any meaningful thermal equilibrium in a rapidly expanding and cooling plume of carbon clusters, leading to polydispersity. In the present work, SWCNTs were produced by a pulsed laser vaporization (PLV) technique. The carbon vapor plume cooling rate was either increased by change in the oven temperature (expansion into colder gas), or decreased via "warm-up" with a laser pulse at the moment of nucleation. The effect of oven temperature and "warm-up" on nanotube type population was studied via photoluminescence, UV-Vis-NIR absorption and Raman spectroscopy. It was found that reduced temperatures lead to smaller average diameters, progressively narrower diameter distributions, and some preference toward armchair structures. "Warm-up" shifts the nanotube population towards armchair structures as well, but the effect is small. Possible improvement of the "warm-up" approach to produce armchair SWCNTs will be discussed. These results demonstrate that the PLV production technique can provide at least partial control over the nanotube (n,m) population. In addition, these results have implications for understanding the nanotube nucleation mechanism in the laser oven.

  16. A new in-situ technique for the determination of small scale spatial distribution of contact angles

    Science.gov (United States)

    Lamparter, Axel; Bachmann, Jörg; Woche, Susanne K.

    2010-05-01

    Water repellency is a common phenomenon in soils around the world. Its hydraulic impact reaches from decreased infiltration rates to preferential flow of water through the soil. The contact angle (CA), which forms at the three-phase solid-liquid-gas boundary, is an established measure of water repellency in soils. However, this CA is generally determined on a small amount of dry soil originating from homogenized samples. Thus, its spatial information is dependent on the size of the homogeneous sample. Information about the small-scale spatial distribution of soil water repellency (SWR) cannot be obtained with this kind of sample preparation, and thus the hydraulic relevance of the measured CA is questionable. Therefore we suggest a new sample preparation technique for measuring the spatial distribution of SWR of natural soils using the sessile drop method (SDM). Two horizontal transects and one vertical transect of about 1.2 m length have been measured on a sandy forest soil in northern Germany. The litter layer and vegetation present at the site were removed prior to sampling. One side of a double-sided adhesive tape was pressed against the soil surface. This results in a mono-layer of sand grains attached to the tape that reflects the wetting properties in their original spatial surroundings. Using the SDM, CAs were measured along a straight-line transect every 0.5 cm (drop size 0.005 mL) in the laboratory with a contact angle microscope. Spatial differences in SWR can be measured at the research site. Results have been analyzed using spectral analysis to reveal spatial correlations in SWR. Different spatial dependencies can be found at different depths of the soil. Results show that the new sampling technique is capable of detecting the spatial variability in natural soils. Thus, it might improve the hydraulic relevance of the small-scale CA.

  17. Determining the Probability Distribution of Hillslope Peak Discharge Using an Analytical Solution of Kinematic Wave Time of Concentration

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2016-04-01

    extended to the case of pervious hillslopes, accounting for infiltration. In particular, an analytical solution for the time of concentration for overland flow on a rectangular plane surface was derived using the kinematic wave equation under the Green-Ampt infiltration (Baiamonte and Singh, 2015). The objective of this work is to apply the latter solution to determine the probability distribution of hillslope peak discharge by combining it with the familiar rainfall duration-intensity-frequency approach. References Agnese, C., Baiamonte, G., and Corrao, C. (2001). "A simple model of hillslope response for overland flow generation". Hydrol. Process., 15, 3225-3238, ISSN: 0885-6087, doi: 10.1002/hyp.182. Baiamonte, G., and Agnese, C. (2010). "An analytical solution of kinematic wave equations for overland flow under Green-Ampt infiltration". J. Agr. Eng., vol. 1, p. 41-49, ISSN: 1974-7071. Baiamonte, G., and Singh, V.P., (2015). "Analytical solution of kinematic wave time of concentration for overland flow under Green-Ampt Infiltration." J Hydrol E - ASCE, DOI: 10.1061/(ASCE)HE.1943-5584.0001266. Robinson, J.S., and Sivapalan, M. (1996). "Instantaneous response functions of overland flow and subsurface stormflow for catchment models". Hydrol. Process., 10, 845-862. Singh, V.P. (1976). "Derivation of time of concentration". J. of Hydrol., 30, 147-165. Singh, V.P., (1996). Kinematic-Wave Modeling in Water Resources: Surface-Water Hydrology. John Wiley & Sons, Inc., New York, 1399 pp.

  18. Numerical renormalization group study of probability distributions for local fluctuations in the Anderson-Holstein and Holstein-Hubbard models.

    Science.gov (United States)

    Hewson, Alex C; Bauer, Johannes

    2010-03-24

    We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density ρ(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.

  19. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  20. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
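    A minimal sketch of the combination step (v) described above: given the pixel-based release probability, the impact probability and the zonal release probability, the integrated spatial landslide probability is the maximum of the release probability and the product of the other two. The random rasters below merely stand in for the layers derived in steps (i)-(iv).

      import numpy as np

      rng = np.random.default_rng(0)
      shape = (100, 100)
      p_release = rng.uniform(0.0, 0.3, shape)   # step (ii): pixel-based release probability
      p_impact  = rng.uniform(0.0, 1.0, shape)   # step (iii): impact probability from angle of reach
      p_zonal   = rng.uniform(0.0, 0.6, shape)   # step (iv): zonal release probability per pixel

      # Step (v): integrated spatial landslide probability
      p_integrated = np.maximum(p_release, p_impact * p_zonal)

      print("mean integrated probability:", p_integrated.mean())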

  1. Estimation of most probable power distribution in BWRs by least squares method using in-core measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ezure, Hideo

    1988-09-01

    Effective combination of measured data with theoretical analysis has permitted deriving a method for more accurately estimating the power distribution in BWRs. A least squares method is used to combine the relationship between the power distribution and the measured values with the model used in FLARE or in the three-dimensional two-group diffusion code. Trial application of the new method to estimating the power distribution in JPDR-1 has shown that the method provides reliable results.
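    A minimal sketch of the general idea of the record above, combining a model-predicted power distribution with a few in-core detector readings through weighted least squares; the nodal powers, detector response matrix, and uncertainties are invented illustrative numbers, not JPDR-1 or FLARE data.

      import numpy as np

      rng = np.random.default_rng(0)

      x_model = np.array([1.00, 1.10, 0.95, 0.90, 1.05])   # model-predicted nodal powers (relative)
      H = np.eye(5)[[0, 2, 4]]                              # detectors read nodes 0, 2 and 4
      y_meas = H @ np.array([1.04, 1.08, 0.99, 0.88, 1.01]) + rng.normal(0, 0.01, 3)

      # Weighted least squares: minimize ||x - x_model||^2 / s_m^2 + ||H x - y||^2 / s_d^2
      s_m, s_d = 0.05, 0.01                                 # assumed model / detector uncertainties
      A = np.vstack([np.eye(5) / s_m, H / s_d])
      b = np.concatenate([x_model / s_m, y_meas / s_d])
      x_est, *_ = np.linalg.lstsq(A, b, rcond=None)

      print("estimated nodal powers:", np.round(x_est, 3))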

  2. Probability distribution function values in mobile phones

    Directory of Open Access Journals (Sweden)

    Luis Vicente Chamorro Marcillllo

    2013-06-01

    Full Text Available Engineering, in both its academic and applied forms, as well as any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Management of those tables poses physical problems (wasteful transport and wasteful consultation) and operational problems (incomplete lists and limited accuracy). The study, “Probability distribution function values in mobile phones”, permitted determining – through a needs survey applied to students involved in statistics studies at Universidad de Nariño – that the best known and most used values correspond to the Chi-Square, Binomial, Student’s t, and Standard Normal distributions. Similarly, it showed users’ interest in having the values in question available through an alternative means that corrects, at least in part, the problems presented by “the famous tables”. To try to contribute to the solution, we built software that allows the values of the most commonly used probability distribution functions to be obtained immediately and dynamically on mobile phones.

  3. Impact of pitch angle setup error and setup error correction on dose distribution in volumetric modulated arc therapy for prostate cancer.

    Science.gov (United States)

    Takemura, Akihiro; Togawa, Kumiko; Yokoi, Tomohiro; Ueda, Shinichi; Noto, Kimiya; Kojima, Hironori; Isomura, Naoki; Kumano, Tomoyasu

    2016-07-01

    In volumetric modulated arc therapy (VMAT) for prostate cancer, a positional and rotational error correction is performed according to the position and angle of the prostate. The correction often involves body leaning, and there is concern regarding variation in the dose distribution. Our purpose in this study was to evaluate the impact of body pitch rotation on the dose distribution in VMAT. Treatment plans were obtained retrospectively from eight patients with prostate cancer. The body in the computed tomography images for the original VMAT plan was shifted to create VMAT plans with virtual pitch angle errors of ±1.5° and ±3°. Dose distributions for the tilted plans were recalculated with use of the same beam arrangement as that used for the original VMAT plan. The mean value of the maximum dose differences in the dose distributions between the original VMAT plan and the tilted plans was 2.98 ± 0.96%. The homogeneity index for the planning target volume (PTV) increased with the pitch angle error, and the values of D95 for the PTV and D2ml, V50, V60, and V70 for the rectum decreased (p < 0.05). However, there was no correlation between differences in these indexes and the maximum dose difference. The pitch angle error caused by body leaning had little effect on the dose distribution; in contrast, the pitch angle correction reduced the effects of organ displacement and improved these indexes. Thus, the pitch angle setup error in VMAT for prostate cancer should be corrected.

  4. The difference between the joint probability distributions of apparent wave heights and periods and individual wave heights and periods

    Institute of Scientific and Technical Information of China (English)

    ZHENG Guizhen; JIANG Xiulan; HAN Shuzong

    2004-01-01

    The joint distribution of wave heights and periods of individual waves is usually approximated by the joint distribution of apparent wave heights and periods. However, there is a difference between them. This difference is addressed, and the theoretical joint distributions of apparent wave heights and periods due to Longuet-Higgins and Sun are modified to give more reasonable representations of the joint distribution of wave heights and periods of individual waves. The modification overcomes an inherent drawback of these joint PDFs, namely that the mean wave period is infinite. A comparison is made between the modified formulae and the field data of Goda, which shows that the new formulae agree with the measurements better than their original counterparts.

  5. Existence, uniqueness and regularity of a time-periodic probability density distribution arising in a sedimentation-diffusion problem

    Science.gov (United States)

    Nitsche, Ludwig C.; Nitsche, Johannes M.; Brenner, Howard

    1988-01-01

    The sedimentation and diffusion of a nonneutrally buoyant Brownian particle in a vertical fluid-filled cylinder of finite length that is instantaneously inverted at regular intervals are investigated analytically. A one-dimensional convective-diffusive equation is derived to describe the temporal and spatial evolution of the probability density; a periodicity condition is formulated; the applicability of Fredholm theory is established; and the parameter-space regions are determined within which the existence and uniqueness of solutions are guaranteed. Numerical results for sample problems are presented graphically and briefly characterized.
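    The record states the governing equation only in words. As a generic illustration (not the authors' exact formulation), the probability density P(x,t) of a sedimenting Brownian particle in a closed vertical tube of length L obeys a one-dimensional convective-diffusive (Smoluchowski) equation with no-flux walls,

      \[
      \frac{\partial P}{\partial t} \;=\; D\,\frac{\partial^2 P}{\partial x^2} \;-\; U_s\,\frac{\partial P}{\partial x},
      \qquad
      \left[\, D\,\frac{\partial P}{\partial x} - U_s\,P \,\right]_{x=0,\,L} = 0,
      \]

    where D is the diffusivity and U_s the settling velocity; the periodic inversion of the cylinder amounts to flipping the sign of U_s every half period, and the time-periodic probability density is sought for this switched problem.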

  6. Diagnostics of Rovibrational Distribution of H2 in Low Temperature Plasmas by Fulcher-α band Spectroscopy - on the Reaction Rates and Transition Probabilities

    Institute of Scientific and Technical Information of China (English)

    Xiao Bingjia; Shinichiro Kado; Shin Kajita; Daisuge Yamasaki; Satoru Tanaka

    2005-01-01

    A novel fitting procedure is proposed for a better determination of the H2 rovibrational distribution from Fulcher-α band spectroscopy. We have recalculated the transition probabilities, and the results show that they deviate from the Franck-Condon approximation, especially for the non-diagonal transitions. We also calculated the complete sets of vibrationally resolved cross sections for the electron-impact d³Πu–X³Σg transition based on the semi-classical Gryzinski theory. An example of an experimental study confirms that the current approach provides a tool for better diagnostics of the H2 rovibrational distribution in the electronic ground state.

  7. Large Scale Distribution of Ultra High Energy Cosmic Rays Detected at the Pierre Auger Observatory with Zenith Angles up to 80°

    NARCIS (Netherlands)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, V. M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D’Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Freire, M. M.; Fröhlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; García, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. 
J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Müller, S.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pȩkala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.

    2015-01-01

    We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in

  8. LARGE SCALE DISTRIBUTION OF ULTRA HIGH ENERGY COSMIC RAYS DETECTED AT THE PIERRE AUGER OBSERVATORY WITH ZENITH ANGLES UP TO 80 degrees

    NARCIS (Netherlands)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Al Samarai, I.; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Batista, R. Alves; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Baeuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Bluemer, H.; Bohacova, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceicao, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Diaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Hasankiadeh, Q. Dorosti; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Luis, P. Facal San; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipcic, A.; Fox, B. D.; Fratu, O.; Freire, M. M.; Froehlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; Garcia, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gomez Berisso, M.; Gomez Vitale, P. F.; Goncalves, P.; Gonzalez, J. G.; Gonzalez, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Horandel, J. R.; Horvath, P.; Hrabovsky, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kaeaepae, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kegl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Kroemer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leao, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopez, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Maris, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martinez Bravo, O.; Martraire, D.; Masias Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. 
J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Micanovic, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Ragaigne, D. Monnier; Montanet, F.; Morello, C.; Mostafa, M.; Moura, C. A.; Muller, M. A.; Mueller, G.; Mueller, S.; Muenchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nozka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pekala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Fernandez, G. Rodriguez; Rodriguez Rojo, J.; Rodriguez-Frias, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Greus, F. Salesa; Salina, G.; Sanchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovanek, P.; Schroeder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Smialkowski, A.; Smida, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijaervi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tome, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdes Galicia, J. F.; Valino, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cardenas, B.; Varner, G.; Vazquez, J. R.; Vazquez, R. A.; Veberic, D.; Verzi, V.; Vicha, J.; Videla, M.; Villasenor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczynska, B.; Wilczynski, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.

    2015-01-01

    We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60 degrees and 80 degrees. We perform two Rayleigh ana

  9. Energetic Particle Pitch Angle Distributions Observed At Widely-Spaced Longitudes in the 23 July 2012 and Other Large Solar Particle Events

    Science.gov (United States)

    Leske, R. A.; Cummings, A. C.; Cohen, C. M.; Mewaldt, R. A.; Labrador, A. W.; Stone, E. C.; Wiedenbeck, M. E.; Christian, E. R.; von Rosenvinge, T. T.

    2015-12-01

    Solar energetic particle (SEP) pitch angle distributions arise from the competing effects of magnetic focusing and scattering as the particles travel through the interplanetary medium, and can therefore indicate interplanetary conditions far from the observer. The STEREO Low Energy Telescopes measure SEP pitch angle distributions for protons, helium, and heavier ions with energies of about 2-12 MeV/nucleon. A wide variety of particle anisotropies was observed in the extreme SEP event of 23 July 2012. At the STEREO-Ahead spacecraft, the solar source of the activity was near central meridian and the pitch angle distribution was initially an outward-flowing beam. High time resolution (1-minute) observations revealed peculiar oscillations in beam width on a timescale of several minutes; such behavior does not seem to have been previously reported in other events. Particle flow became bidirectional while inside a magnetic cloud following a tremendous shock. Particle intensities at the Behind spacecraft, from which the event occurred over the east limb of the Sun, were about 1000 times lower than at Ahead. Pitch angle distributions during the peak of the event show inward-flowing particles that underwent partial mirroring closer to the Sun and formed a distinctive loss-cone distribution (indicating that the magnetic field strength at the mirror point was too small to turn around particles with the smallest pitch angles). We present the observations of this rich variety of anisotropies within a single event, compare with observations in other events, and discuss the implications for SEP transport in the inner heliosphere.

  10. Investigating anthelmintic efficacy against gastrointestinal nematodes in cattle by considering appropriate probability distributions for faecal egg count data

    Directory of Open Access Journals (Sweden)

    J.W. Love

    2017-04-01

    Where FEC data were obtained with less sensitive counting techniques (i.e. McMaster 30 or 15 epg), zero-inflated distributions and their associated central tendency were the most appropriate and would be recommended for use, i.e. the arithmetic group mean divided by the proportion of non-zero counts present; otherwise apparent anthelmintic efficacy could be misrepresented.
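
    As a rough illustration of the central tendency this record recommends for low-sensitivity counting techniques, the sketch below divides the arithmetic group mean by the proportion of non-zero counts. The faecal egg counts and the 30 epg spacing are invented for the example, not taken from the study.

```python
import numpy as np

# Hypothetical McMaster counts (eggs per gram) for one treatment group;
# zeros are common with low-sensitivity techniques such as McMaster 30 epg.
fec = np.array([0, 0, 30, 0, 60, 0, 0, 90, 0, 30], dtype=float)

arithmetic_mean = fec.mean()                      # plain group mean
prop_nonzero = np.count_nonzero(fec) / fec.size   # proportion of non-zero counts

# Central tendency suggested for zero-inflated data:
# arithmetic group mean divided by the proportion of non-zero counts.
zero_inflated_mean = arithmetic_mean / prop_nonzero

print(f"arithmetic mean     : {arithmetic_mean:.1f} epg")
print(f"proportion non-zero : {prop_nonzero:.2f}")
print(f"zero-inflated mean  : {zero_inflated_mean:.1f} epg")
```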

  11. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    Science.gov (United States)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance r_n between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…

  12. Predicting probability of occurrence and factors affecting distribution and abundance of three Ozark endemic crayfish species at multiple spatial scales

    Science.gov (United States)

    Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.

    2014-01-01

    Crayfishes and other freshwater aquatic fauna are particularly at risk globally due to anthropogenic demand, manipulation and exploitation of freshwater resources, and yet are often understudied. The Ozark faunal region of Missouri and Arkansas harbours a high level of aquatic biological diversity, especially in regard to endemic crayfishes. Three such endemics, Orconectes eupunctus, Orconectes marchandi and Cambarus hubbsi, are threatened by limited natural distribution and the invasions of Orconectes neglectus.

  13. Probability distribution of the number of distinct sites visited by a random walk on the finite-size fully-connected lattice

    CERN Document Server

    Turban, L

    2016-01-01

    The probability distribution of the number $s$ of distinct sites visited up to time $t$ by a random walk on the fully-connected lattice with $N$ sites is first obtained by solving the eigenvalue problem associated with the discrete master equation. Then, using generating function techniques, we compute the joint probability distribution of $s$ and $r$, where $r$ is the number of sites visited only once up to time $t$. Mean values, variances and covariance are deduced from the generating functions and their finite-size-scaling behaviour is studied. Introducing properly centered and scaled variables $u$ and $v$ for $r$ and $s$ and working in the scaling limit ($t\\to\\infty$, $N\\to\\infty$ with $w=t/N$ fixed) the joint probability density of $u$ and $v$ is shown to be a bivariate Gaussian density. It follows that the fluctuations of $r$ and $s$ around their mean values in a finite-size system are Gaussian in the scaling limit. The same type of finite-size scaling is expected to hold on periodic lattices above the ...
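
    A small Monte Carlo sketch of the quantities discussed in this record (not the authors' analytical solution): it simulates random walks on a fully-connected lattice with N sites, records the number s of distinct sites visited and the number r of sites visited exactly once up to time t, and reports their means, variances and covariance in the scaling regime w = t/N. The lattice size, w and sample count are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def walk_counts(N, t):
    """One random walk of t steps on the fully-connected lattice with N sites:
    every step jumps to a uniformly chosen site (the starting site is included).
    Returns (s, r): the number of distinct sites visited and the number of
    sites visited exactly once."""
    visits = np.bincount(rng.integers(N, size=t + 1), minlength=N)
    return np.count_nonzero(visits), np.count_nonzero(visits == 1)

N = 200
w = 1.0                      # scaling variable w = t / N
t = int(w * N)
samples = np.array([walk_counts(N, t) for _ in range(5000)])
s, r = samples[:, 0], samples[:, 1]

print(f"s: mean {s.mean():.2f}, var {s.var():.2f}")
print(f"r: mean {r.mean():.2f}, var {r.var():.2f}")
print(f"cov(s, r) = {np.cov(s, r)[0, 1]:.2f}")
```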

  14. On the Efficient Simulation of the Distribution of the Sum of Gamma-Gamma Variates with Application to the Outage Probability Evaluation Over Fading Channels

    KAUST Repository

    Chaouki Ben Issaid

    2016-06-01

    The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a desirable property characterizing the robustness of importance sampling schemes, for both identically and non-identically independently distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via selected numerical simulations.
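
    For orientation only, here is a naive Monte Carlo baseline (not the paper's mean-shift importance sampling estimator) for the outage probability of L-branch maximum ratio combining over Gamma-Gamma fading; the turbulence parameters, branch count and threshold are illustrative assumptions. It makes the abstract's point concrete: for very small outage probabilities the relative error of the naive estimator blows up unless the sample size grows prohibitively large.

```python
import numpy as np

rng = np.random.default_rng(42)

def gamma_gamma(alpha, beta, size):
    """Gamma-Gamma variate as the product of two independent unit-mean Gamma RVs."""
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=size)
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=size)
    return x * y

L = 4                    # number of MRC diversity branches (assumed)
alpha, beta = 2.0, 1.5   # illustrative turbulence parameters
gamma_th = 0.05          # outage threshold on the combined gain (assumed)
n = 10**6

# Outage event: the sum of the L branch gains falls below the threshold.
combined = sum(gamma_gamma(alpha, beta, n) for _ in range(L))
p_out = np.mean(combined < gamma_th)

rel_err = np.sqrt((1 - p_out) / (n * p_out)) if p_out > 0 else np.inf
print(f"naive MC outage estimate: {p_out:.3e} (relative error ~ {rel_err:.2f})")
```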

  15. Properties of the Maximum Probability Value of the Negative Binomial Distribution

    Institute of Scientific and Technical Information of China (English)

    丁勇

    2016-01-01

    The properties of the probability maximum value of the negative binomial distribution were explored. The probability maximum value of the negative binomial distribution is a function of p and r, where p is the probability of success in each trial and r is the required number of successes. For given r, it is a monotonically increasing continuous function of p, and its derivative fails to exist only when (r-1)/p is an integer; for given p, it is a monotonically decreasing function of r.
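
    A quick numerical check of the quantity discussed here, using SciPy's negative binomial (which counts failures before the r-th success, a slightly different parameterization than the trial-count convention in the abstract): it evaluates the maximum of the pmf over its support for given p and r, illustrating that the maximum grows with p for fixed r and shrinks with r for fixed p.

```python
import numpy as np
from scipy.stats import nbinom

def max_probability(r, p, k_max=10_000):
    """Largest pmf value of the negative binomial with r required successes and
    success probability p (SciPy convention: k = number of failures)."""
    k = np.arange(k_max)
    return nbinom.pmf(k, r, p).max()

for r in (1, 2, 5):
    for p in (0.2, 0.5, 0.8):
        print(f"r={r}, p={p}: max pmf = {max_probability(r, p):.4f}")
```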

  16. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  17. Low-Order Probability-Weighted Moments Method for Wind Speed Probability Distribution Parameter Estimation

    Institute of Scientific and Technical Information of China (English)

    潘晓春

    2012-01-01

    For offshore wind energy resource assessment and utilization, it is necessary to describe the statistical properties of wind speed with a three-parameter Weibull distribution. Based on the functional relation between the distribution parameters and probability-weighted moments (PWMs), a logistic curve is used to fit the relation between the shape parameter and the PWMs, and two parameter-estimation formulae are derived from low-order non-exceedance and exceedance PWMs. Accuracy tests show that these formulae retain high precision over a wide parameter range. A comparison with a high-order PWM method on a worked example suggests that the low-order PWM methods proposed here are worth popularizing.
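
    The paper's fitted logistic-curve formulae are not given in the abstract, but the low-order sample probability-weighted moments such methods start from are standard. The sketch below (synthetic wind speeds; the unbiased estimators b0, b1, b2 for beta_r = E[X F(X)^r]) shows the quantities a PWM-based Weibull fit would take as input, not the paper's own estimation formulae.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_pwms(x, orders=(0, 1, 2)):
    """Unbiased sample probability-weighted moments b_r estimating
    beta_r = E[X * F(X)^r], computed from the order statistics of x."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b = []
    for r in orders:
        if r == 0:
            w = np.ones(n)
        else:
            # weights (i-1)(i-2)...(i-r) / ((n-1)(n-2)...(n-r))
            num = np.prod([i - j for j in range(1, r + 1)], axis=0)
            den = np.prod([n - j for j in range(1, r + 1)])
            w = num / den
        b.append(np.mean(w * x))
    return b

# Synthetic hourly wind speeds (m/s) standing in for an offshore record.
wind = 2.0 + rng.weibull(2.0, size=5000) * 7.0
b0, b1, b2 = sample_pwms(wind)
print(f"b0={b0:.3f}, b1={b1:.3f}, b2={b2:.3f}")
```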

  18. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  19. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  20. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  1. A Summary of Background Knowledge of the Most Probable Distribution Theory

    Institute of Scientific and Technical Information of China (English)

    周昱; 魏蔚; 张艳燕; 马晓栋

    2011-01-01

    By summarizing the basic concepts and conclusions needed in deriving the most probable distribution theory, this paper offers some comments and insights on the expression of the basic concepts, the understanding of the basic conclusions, and the connections among all the related knowledge points.

  2. An investigation of the dose distribution effect related with collimator angle in volumetric arc therapy of prostate cancer

    Directory of Open Access Journals (Sweden)

    Bora Tas

    2016-01-01

    Full Text Available To investigate the dose-volume variations of the planning target volume (PTV) and organs at risk (OARs) in eleven prostate cancer patients planned with single- and double-arc volumetric modulated arc therapy (VMAT) when the collimator angle is varied. Single- and double-arc VMAT treatment plans were created using Monaco 5.0® with the collimator angle set to 0°. All plans were normalized so that 7600 cGy was delivered to 95% of the clinical target volume (CTV). The single-arc VMAT plans were reoptimized with different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, and 90°), and the double-arc VMAT plans with collimator angle pairs (0°–0°, 15°–345°, 30°–330°, 45°–315°, 60°–300°, 75°–285°, 90°–270°), using the same optimization parameters. For the comparison, the homogeneity index (HI), dose-volume histogram, and minimum dose to 95% of the PTV volume (D95 PTV) were calculated and analyzed. The best plans were verified using the two-dimensional ion chamber array IBA Matrixx® and the three-dimensional IBA Compass® program. The comparison between calculation and measurement was made by γ-index (3%/3 mm) analysis. A higher D95 (PTV) was found for single-arc VMAT with a 15° collimator angle and, for double-arc VMAT, with 60°–300° and 75°–285° collimator angles. However, lower rectum doses were obtained for 75°–285° collimator angles. There was no significant dose difference for the other OARs, the bladder and femoral heads. When we compared the D95 (PTV) of single- and double-arc VMAT, we found 2.44% higher coverage and a lower HI with double-arc VMAT. All plans passed the γ-index (3%/3 mm) analysis with more than 97% of the points, and with double-arc VMAT we obtained an average γ-index of 0.36 for the CTV and 0.32 for the PTV. These results were statistically significant by the Wilcoxon signed rank test. The results show that the dose coverage of the target and the OAR doses depend significantly on the collimator angles due to the geometry of the target and OARs. Based on the results we have decided to plan prostate

  3. Towards a Global Water Scarcity Risk Assessment Framework: Incorporation of Probability Distributions and Hydro-Climatic Variability

    Science.gov (United States)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-01-01

    Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to more than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.
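
    As a toy illustration of using a Gamma distribution for water scarcity (the study's exact indicator, units and threshold are not given in the abstract, so everything below is an assumption), one can fit a Gamma distribution to a series of annual water availability values and read off the probability of falling below a fixed scarcity threshold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical annual per-capita water availability (m^3/yr) for one region.
availability = rng.gamma(shape=4.0, scale=450.0, size=40)

# Fit a two-parameter Gamma distribution (location fixed at zero).
shape, loc, scale = stats.gamma.fit(availability, floc=0)

# Probability that availability falls below an assumed scarcity threshold.
threshold = 1000.0   # illustrative stress level only, not the study's value
p_scarce = stats.gamma.cdf(threshold, shape, loc=loc, scale=scale)

print(f"fitted shape={shape:.2f}, scale={scale:.1f}")
print(f"P(availability < {threshold:.0f} m^3/yr) = {p_scarce:.3f}")
```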

  4. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  5. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Directory of Open Access Journals (Sweden)

    Gian Paolo Beretta

    2008-08-01

    Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager reciprocity theorem and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.

  6. Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution%Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution

    Institute of Scientific and Technical Information of China (English)

    SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan

    2011-01-01

    For the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. First, the probability distribution of the von Mises stress of a thin-walled structure under random loadings was examined; the analysis suggested that the probability density function of the von Mises stress process accords approximately with a two-parameter Weibull distribution, and formulae for calculating the Weibull parameters were given. Based on the Miner linear damage theory, a method for predicting random sonic fatigue life from the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated using a coupled FEM/BEM (finite element method/boundary element method) model, the fatigue life was estimated with the constructed model, and the results were corrected for the influence of the wide frequency band. Comparative analysis shows that the sonic fatigue estimates for the combustor liner structure obtained using the Weibull distribution of the von Mises stress are somewhat more conservative than those obtained using the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.

  7. Probability Distribution of Airport Capacity Affected by Weather%天气影响的机场容量概率分布

    Institute of Scientific and Technical Information of China (English)

    张静; 徐肖豪; 王飞

    2011-01-01

    Weather is a major factor causing airport capacity reduction. To reflect the impact of weather on capacity, an n-phase arrival capacity distribution model is established. Historical weather data are translated into a capacity probability distribution for each weather type through a weather-type decision tree. According to the capacity probability distribution of each weather type, probabilistic weather forecasts are translated into probabilistic capacity forecasts by using the total probability formula. Weather forecasts for one day are simulated from five years of hourly airport weather data, and a set of n-phase arrival capacity distributions based on the weather forecasts is obtained. Simulation results indicate that inclement weather forecasts at different times can be translated into a set of stochastic capacity forecasts, thus meeting the needs of real-time traffic flow management.
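
    A minimal sketch of the total-probability step described here (the weather categories, their forecast probabilities, and the per-weather capacity distributions are invented for illustration): it combines a probabilistic weather forecast with per-weather-type capacity distributions into a single arrival capacity probability distribution for one time period.

```python
from collections import defaultdict

# Hypothetical probabilistic weather forecast for one hour.
weather_forecast = {"VFR": 0.6, "MVFR": 0.3, "IFR": 0.1}

# Hypothetical arrival-capacity distribution (arrivals/hour) for each weather
# type, e.g. as derived from historical data via a weather-type decision tree.
capacity_given_weather = {
    "VFR":  {60: 0.7, 50: 0.3},
    "MVFR": {50: 0.5, 40: 0.5},
    "IFR":  {40: 0.4, 30: 0.6},
}

# Total probability formula: P(C = c) = sum_w P(C = c | W = w) * P(W = w)
capacity_forecast = defaultdict(float)
for weather, p_w in weather_forecast.items():
    for capacity, p_c in capacity_given_weather[weather].items():
        capacity_forecast[capacity] += p_c * p_w

for capacity in sorted(capacity_forecast, reverse=True):
    print(f"P(capacity = {capacity}/h) = {capacity_forecast[capacity]:.3f}")
```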

  8. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  9. Characterization of the energy distribution of neutrons generated by 5 MeV protons on a thick beryllium target at different emission angles

    Energy Technology Data Exchange (ETDEWEB)

    Agosteo, S. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy); Colautti, P., E-mail: paolo.colautti@lnl.infn.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Esposito, J., E-mail: juan.esposito@tin.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Fazzi, A.; Introini, M.V.; Pola, A. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy)

    2011-12-15

    Neutron energy spectra at different emission angles, between 0° and 120°, from the Be(p,xn) reaction generated by a thick beryllium target bombarded with 5 MeV protons have been measured at the Legnaro Laboratories (LNL) of the Italian National Institute for Nuclear Physics research (INFN). A new and quite compact recoil-proton spectrometer, based on a monolithic silicon telescope coupled to a polyethylene converter, was used efficiently in place of the traditional Time-of-Flight (TOF) technique. The measured recoil-proton distributions were processed through an iterative unfolding algorithm in order to determine the neutron energy spectra at all the angles considered. The neutron energy spectrum measured at 0° was found to be in good agreement with the only one so far available at this energy, measured years ago with the TOF technique. Moreover, the results obtained at different emission angles were consistent with detailed past measurements performed with 4 MeV protons at the same angles by TOF techniques.

  10. Intracranial cerebrospinal fluid spaces imaging using a pulse-triggered three-dimensional turbo spin echo MR sequence with variable flip-angle distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hodel, Jerome [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Hopital Henri Mondor, Creteil (France); Silvera, Jonathan [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Bekaert, Olivier; Decq, Philippe [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neurosurgery, Creteil (France); Rahmouni, Alain [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Radiology, Creteil (France); Bastuji-Garin, Sylvie [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Public Health, Creteil (France); Vignaud, Alexandre [Siemens Healthcare, Saint Denis (France); Petit, Eric; Durning, Bruno [Laboratoire Images Signaux et Systemes Intelligents, UPEC, Creteil (France)

    2011-02-15

    To assess the three-dimensional turbo spin echo with variable flip-angle distribution magnetic resonance sequence (SPACE: Sampling Perfection with Application-optimised Contrast using different flip-angle Evolution) for the imaging of intracranial cerebrospinal fluid (CSF) spaces. We prospectively investigated 18 healthy volunteers and 25 patients, 20 with communicating hydrocephalus (CH) and five with non-communicating hydrocephalus (NCH), using the SPACE sequence at 1.5 T. Volume rendering views of both intracranial and ventricular CSF were obtained for all patients and volunteers. The subarachnoid CSF distribution was qualitatively evaluated on volume rendering views using a four-point scale. The CSF volumes within the total, ventricular and subarachnoid spaces were calculated, as well as the ratio between ventricular and subarachnoid CSF volumes. Three different patterns of subarachnoid CSF distribution were observed. In healthy volunteers we found narrowed CSF spaces within the occipital area. A diffuse narrowing of the subarachnoid CSF spaces was observed in patients with NCH, whereas patients with CH exhibited narrowed CSF spaces within the high midline convexity. The ratios between ventricular and subarachnoid CSF volumes were significantly different among the volunteers, patients with CH, and patients with NCH. The assessment of CSF space volume and distribution may help to characterise hydrocephalus. (orig.)

  11. A maximum entropy approach to the study of residue-specific backbone angle distributions in α-synuclein, an intrinsically disordered protein

    Science.gov (United States)

    Mantsyzov, Alexey B; Maltsev, Alexander S; Ying, Jinfa; Shen, Yang; Hummer, Gerhard; Bax, Ad

    2014-01-01

    α-Synuclein is an intrinsically disordered protein of 140 residues that switches to an α-helical conformation upon binding phospholipid membranes. We characterize its residue-specific backbone structure in free solution with a novel maximum entropy procedure that integrates an extensive set of NMR data. These data include intraresidue and sequential HN–Hα and HN–HN NOEs, values for 3JHNHα, 1JHαCα, 2JCαN, and 1JCαN, as well as chemical shifts of 15N, 13Cα, and 13C′ nuclei, which are sensitive to backbone torsion angles. Distributions of these torsion angles were identified that yield best agreement to the experimental data, while using an entropy term to minimize the deviation from statistical distributions seen in a large protein coil library. Results indicate that although at the individual residue level considerable deviations from the coil library distribution are seen, on average the fitted distributions agree fairly well with this library, yielding a moderate population (20–30%) of the PPII region and a somewhat higher population of the potentially aggregation-prone β region (20–40%) than seen in the database. A generally lower population of the αR region (10–20%) is found. Analysis of 1H–1H NOE data required consideration of the considerable backbone diffusion anisotropy of a disordered protein. PMID:24976112

  12. A maximum entropy approach to the study of residue-specific backbone angle distributions in α-synuclein, an intrinsically disordered protein.

    Science.gov (United States)

    Mantsyzov, Alexey B; Maltsev, Alexander S; Ying, Jinfa; Shen, Yang; Hummer, Gerhard; Bax, Ad

    2014-09-01

    α-Synuclein is an intrinsically disordered protein of 140 residues that switches to an α-helical conformation upon binding phospholipid membranes. We characterize its residue-specific backbone structure in free solution with a novel maximum entropy procedure that integrates an extensive set of NMR data. These data include intraresidue and sequential H(N) − H(α) and H(N) − H(N) NOEs, values for (3) JHNHα, (1) JHαCα, (2) JCαN, and (1) JCαN, as well as chemical shifts of (15)N, (13)C(α), and (13)C' nuclei, which are sensitive to backbone torsion angles. Distributions of these torsion angles were identified that yield best agreement to the experimental data, while using an entropy term to minimize the deviation from statistical distributions seen in a large protein coil library. Results indicate that although at the individual residue level considerable deviations from the coil library distribution are seen, on average the fitted distributions agree fairly well with this library, yielding a moderate population (20-30%) of the PPII region and a somewhat higher population of the potentially aggregation-prone β region (20-40%) than seen in the database. A generally lower population of the αR region (10-20%) is found. Analysis of (1)H − (1)H NOE data required consideration of the considerable backbone diffusion anisotropy of a disordered protein.

  13. Probability Distribution Functions OF 12CO(J = 1-0) Brightness and Integrated Intensity in M51: The PAWS View

    CERN Document Server

    Hughes, Annie; Schinnerer, Eva; Colombo, Dario; Pety, Jerome; Leroy, Adam K; Dobbs, Clare L; Garcia-Burillo, Santiago; Thompson, Todd A; Dumas, Gaelle; Schuster, Karl F; Kramer, Carsten

    2013-01-01

    We analyse the distribution of CO brightness temperature and integrated intensity in M51 at ~40 pc resolution using new CO data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field, which covers the inner 11 x 7 kpc of M51. We find variations in the shape of CO PDFs within different M51 environments, and between M51 and M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover 1 to 2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities. However, the CO PDFs for different dynamical environments within the PAWS field depart from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO...

  14. Ionization compression impact on dense gas distribution and star formation, Probability density functions around H ii regions as seen by Herschel

    CERN Document Server

    Tremblin, P; Minier, V; Didelon, P; Hill, T; Anderson, L D; Motte, F; Zavagno, A; André, Ph; Arzoumanian, D; Audit, E; Benedettini, M; Bontemps, S; Csengeri, T; Di Francesco, J; Giannini, T; Hennemann, M; Luong, Q Nguyen; Marston, A P; Peretto, N; Rivera-Ingraham, A; Russeil, D; Rygl, K L J; Spinoglio, L; White, G J

    2014-01-01

    Ionization feedback should impact the probability distribution function (PDF) of the column density around the ionized gas. We aim to quantify this effect and discuss its potential link to the Core and Initial Mass Function (CMF/IMF). We used in a systematic way Herschel column density maps of several regions observed within the HOBYS key program: M16, the Rosette and Vela C molecular cloud, and the RCW 120 H ii region. We fitted the column density PDFs of all clouds with two lognormal distributions, since they present a double-peak or enlarged shape in the PDF. Our interpretation is that the lowest part of the column density distribution describes the turbulent molecular gas while the second peak corresponds to a compression zone induced by the expansion of the ionized gas into the turbulent molecular cloud. The condensations at the edge of the ionized gas have a steep compressed radial profile, sometimes recognizable in the flattening of the power-law tail. This could lead to an unambiguous criterion able t...

  15. Probability Distributions over Cryptographic Protocols

    Science.gov (United States)

    2009-06-01

    exception. Cryptyc integrates use of pattern-matching in the spi calculus framework, which in turn allows the specification of nested cryptographic... programs too: the metaheuristic search for security protocols," Information and Software Technology, vol. 43, pp. 891–904, December 2001.

  16. CP violation and the CKM angle $\\gamma$ from angular distributions of untagged B$_{s}$ decays governed by $\\overline{b}$ --> $\\overline{c}$u$\\overline{s}$

    CERN Document Server

    Fleischer, Robert; Fleischer, Robert; Dunietz, Isard

    1996-01-01

    We demonstrate that time-dependent studies of angular distributions for B_s decays caused by \\bar b\\to\\bar cu\\bar s quark-level transitions extract cleanly and model-independently the CKM angle \\gamma. This CKM angle could be cleanly determined from untagged B_s decays alone, if the lifetime difference between the B_s mass eigenstates B_s^L and B_s^H is sizable. The time-dependences for the relevant tagged and untagged observables are given both in a general notation and in terms of linear polarization states and should exhibit large CP-violating effects. These observables may furthermore provide insights into the hadronization dynamics of the corresponding exclusive B_s decays thereby allowing tests of the factorization hypothesis.

  17. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the 380 V bus voltage of an industrial consumer sensitive to such sags was monitored. Different levels of DG were inserted near the consumer, and the short-circuit simulations with monitoring of the 380 V bus were repeated. A Monte Carlo stochastic simulation (SMC) study was performed to obtain, for each DG level, the sag probability curves and the probability density per voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.

  18. Effect of rib angle on local heat/mass transfer distribution in a two-pass rib-roughened channel

    Science.gov (United States)

    Chandra, P. R.; Han, J. C.; Lau, S. C.

    1987-01-01

    The naphthalene sublimation technique is used to investigate the heat transfer characteristics of turbulent air flow in a two-pass channel. A test section that resembles the internal cooling passages of gas turbine airfoils is employed. The local Sherwood numbers on the ribbed walls were found to be 1.5-6.5 times those for a fully developed flow in a smooth square duct. Depending on the rib angle-of-attack and the Reynolds number, the average ribbed-wall Sherwood numbers were 2.5-3.5 times higher than the fully developed values.

  19. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    The mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and the fuzzy resolution theorem, a feasibility condition for a set of probability fuzzy numbers was given; going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closure of fuzzy probability operations, was stated and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All the mathematical descriptions of the RVFP are closed under fuzzy probability operations; as a result, a foundation is laid for perfecting fuzzy probability operation methods.

  20. Water distributions in polystyrene-block-poly[styrene-g-poly(ethylene oxide)] block grafted copolymer system in aqueous solutions revealed by contrast variation small angle neutron scattering study

    Science.gov (United States)

    Li, Xin; Hong, Kunlun; Liu, Yun; Shew, Chwen-Yang; Liu, Emily; Herwig, Kenneth W.; Smith, Gregory S.; Zhao, Junpeng; Zhang, Guangzhao; Pispas, Stergios; Chen, Wei-Ren

    2010-10-01

    We develop an experimental approach to analyze the water distribution around a core-shell micelle formed by polystyrene-block-poly[styrene-g-poly(ethylene oxide) (PEO)] block copolymers in aqueous media at a fixed polymer concentration of 10 mg/ml through a contrast variation small angle neutron scattering (SANS) study. By varying the D2O/H2O ratio, the scattering contributions from the water molecules and the micellar constituent components can be determined. Based on the commonly used core-shell model, a theoretical coherent scattering cross section incorporating the effect of water penetration is developed and used to analyze the SANS I(Q). We have successfully quantified the intramicellar water distribution and found that the overall micellar hydration level increases with the molecular weight of the hydrophilic PEO side chains. Our work presents a practical experimental means for evaluating the intramacromolecular solvent distributions of general soft matter systems.

  1. Azimuthal Angle Distribution in $B \\to K^* (\\to K \\pi) \\ell^+ \\ell^- $ at low invariant $m_{\\ell^+ \\ell^-}$ Region

    CERN Document Server

    Kim, C S; Lü, C D; Morozumi, T; Kim, Yeong Gyun; Lu, Cai-Dian; Morozumi, Takuya

    2000-01-01

    We present the angular distribution of the rare B decay, $B \\to K^* (\\to K \\pi) \\ell^+ \\ell^-$. In the low invariant mass region of the dileptons, we can probe new physics effects efficiently. In particular, this distribution is found to be quite sensitive to the ratio of the contributions from two independent magnetic moment operators, which also contribute to $B \\to K^* \\gamma$. Therefore, our method can be very useful when new physics is introduced without changing the total decay rate of $b \\to s \\gamma$. The angular distributions are compared with the predictions of the standard model, and are shown for the cases when the afore-mentioned ratio is different from the standard model prediction.

  2. Simultaneous estimation of the dip angles and slip distribution on the faults of the 2016 Kumamoto earthquake through a weak nonlinear inversion of InSAR data

    Science.gov (United States)

    Fukahata, Yukitoshi; Hashimoto, Manabu

    2016-12-01

    During the 2016 Kumamoto earthquake, surface ruptures were observed not only along the Futagawa fault, where the main ruptures occurred, but also along the Hinagu fault. To estimate the slip distribution on these faults, we extend a method of nonlinear inversion analysis (Fukahata and Wright in Geophys J Int 173:353-364, 2008) to a two-fault system. With the method of Fukahata and Wright (2008), we can simultaneously determine the optimal dip angle of a fault and the slip distribution on it, based on Akaike's Bayesian information criterion, by regarding the dip angle as a hyperparameter. By inverting the InSAR data with the developed method, we obtain dip angles of the Futagawa and Hinagu faults of 61° ± 6° and 74° ± 12°, respectively. The slip on the Futagawa fault is mainly strike slip. The largest slip on it is over 5 m around the center of the model fault (130.9° in longitude), with a significant normal slip component. The slip on the Futagawa fault quickly decreases to zero beyond the intersection with the Hinagu fault. On the other hand, the slip has a local peak just inside Aso caldera, which would be a cause of severe damage in this area. A relatively larger reverse-fault slip component on a deeper part around the intersection with Aso caldera suggests that something complicated happened there. The slip on the Hinagu fault is almost pure strike slip with a peak of about 2.4 m. The developed method is useful in clarifying the slip distribution when a complicated rupture like the Kumamoto earthquake happens in a remote area.

  3. Measurement of the flux and zenith-angle distribution of upward through-going muons by Super-Kamiokande

    CERN Document Server

    Fukuda, Y; Ichihara, E; Inoue, K; Ishihara, K; Ishino, H; Itow, Y; Kajita, T; Kameda, J; Kasuga, S; Kobayashi, K; Kobayashi, Y; Koshio, Y; Miura, M; Nakahata, M; Nakayama, S; Okada, A; Okumura, K; Sakurai, N; Shiozawa, M; Suzuki, Y; Takeuchi, Y; Totsuka, Y; Yamada, S; Earl, M; Habig, A; Kearns, E; Messier, M D; Scholberg, K; Stone, J L; Sulak, L R; Walter, C W; Goldhaber, M; Barszczak, T; Casper, D; Gajewski, W; Kropp, W R; Price, L R; Reines, F; Smy, M B; Sobel, H W; Vagins, M R; Ganezer, K S; Keig, W E; Ellsworth, R W; Tasaka, S; Flanagan, J W; Kibayashi, A; Learned, J G; Matsuno, S; Stenger, V J; Takemori, D; Ishii, T; Kanzaki, J; Kobayashi, T; Mine, S; Nakamura, K; Nishikawa, K; Oyama, Y; Sakai, A; Sakuda, M; Sasaki, O; Echigo, S; Kohama, M; Suzuki, A T; Haines, T J; Blaufuss, E; Kim, B K; Sanford, R; Svoboda, R; Chen, M L; Goodman, J A; Sullivan, G W; Hill, J; Jung, C K; Martens, K; Mauger, C; McGrew, C; Sharkey, E; Viren, B; Yanagisawa, C; Doki, W; Miyano, K; Okazawa, H; Saji, C; Takahata, M; Nagashima, Y; Takita, M; Yamaguchi, T; Yoshida, M; Kim, S B; Etoh, M; Fujita, K; Hasegawa, A; Hasegawa, T; Hatakeyama, S; Iwamoto, T; Koga, M; Maruyama, T; Ogawa, H; Shirai, J; Suzuki, A; Tsushima, F; Koshiba, M; Nemoto, M; Nishijima, K; Futagami, T; Hayato, Y; Kanaya, Y; Kaneyuki, K; Watanabe, Y; Kielczewska, D; Doyle, R A; George, J S; Stachyra, A L; Wai, L L; Wilkes, R J; Young, K K

    1999-01-01

    A total of 614 upward through-going muons of minimum energy 1.6 GeV are observed by Super-Kamiokande during 537 detector live days. The measured muon flux is 1.74+/-0.07(stat.)+/-0.02(sys.)x10^{-13}cm^{-2}s^{-1}sr^{-1} compared to an expected flux of 1.97+/-0.44(theo.)x10^{-13}cm^{-2}s^{-1}sr^{-1}. The absolute measured flux is in agreement with the prediction within the errors. However, the zenith angle dependence of the observed upward through-going muon flux does not agree with no-oscillation predictions. The observed distortion in shape is consistent with the neutrino oscillation hypothesis.

  4. Characteristics of the spatiotemporal distribution of daily extreme temperature events in China: Minimum temperature records in different climate states against the background of the most probable temperature

    Institute of Scientific and Technical Information of China (English)

    Qian Zhong-Hua; Hu Jing-Guo; Feng Guo-Lin; Cao Yong-Zhong

    2012-01-01

    Based on the skewed function, the most probable temperature is defined and the spatiotemporal distributions of the frequencies and strengths of extreme temperature events in different climate states over China are investigated, where the climate states are referred to as State I, State II and State III, i.e., the daily minimum temperature records of 1961-1990, 1971-2000, and 1981-2009. The results show that, in space, the frequency of high temperature events in summer decreases clearly in the lower and middle reaches of the Yellow River in State I and that low temperature events decrease in northern China in State II. In the present state, the frequency of high temperature events increases significantly in most areas over China except the northeast, while the frequency of low temperature events decreases mainly in north China and the regions between the Yangtze River and the Yellow River. The distributions of frequencies and strengths of extreme temperature events are consistent in space. The analysis of the time evolution of extreme events shows that the occurrence of high temperature events becomes more frequent with the change in state, while that of low temperature events decreases. High temperature events are also becoming stronger and deserve special attention.

  5. MERA: a webserver for evaluating backbone torsion angle distributions in dynamic and disordered proteins from NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Mantsyzov, Alexey B. [M.V. Lomonosov Moscow State University, Faculty of Fundamental Medicine (Russian Federation); Shen, Yang; Lee, Jung Ho [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Hummer, Gerhard [Max Planck Institute of Biophysics (Germany); Bax, Ad, E-mail: bax@nih.gov [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)

    2015-09-15

    MERA (Maximum Entropy Ramachandran map Analysis from NMR data) is a new webserver that generates residue-by-residue Ramachandran map distributions for disordered proteins or disordered regions in proteins on the basis of experimental NMR parameters. As input data, the program currently utilizes up to 12 different parameters. These include three different types of short-range NOEs, three types of backbone chemical shifts ({sup 15}N, {sup 13}C{sup α}, and {sup 13}C′), six types of J couplings ({sup 3}J{sub HNHα}, {sup 3}J{sub C′C′}, {sup 3}J{sub C′Hα}, {sup 1}J{sub HαCα}, {sup 2}J{sub CαN} and {sup 1}J{sub CαN}), as well as the {sup 15}N-relaxation derived J(0) spectral density. The Ramachandran map distributions are reported in terms of populations of their 15° × 15° voxels, and an adjustable maximum entropy weight factor is available to ensure that the obtained distributions will not deviate more from a newly derived coil library distribution than required to account for the experimental data. MERA output includes the agreement between each input parameter and its distribution-derived value. As an application, we demonstrate performance of the program for several residues in the intrinsically disordered protein α-synuclein, as well as for several static and dynamic residues in the folded protein GB3.

  6. Three paths toward the quantum angle operator

    Science.gov (United States)

    Gazeau, Jean Pierre; Szafraniec, Franciszek Hugon

    2016-12-01

    We examine mathematical questions around the angle (or phase) operator associated with a number operator through a short list of basic requirements. We implement three methods of construction of the quantum angle. The first one is based on operator theory and parallels the definition of the angle for the upper half-circle through its cosine, completed by a sign inversion. The two other methods are integral quantizations generalizing, in a certain sense, the Berezin-Klauder approaches. One method pertains to Weyl-Heisenberg integral quantization of the plane viewed as the phase space of the motion on the line. It depends on a family of "weight" functions on the plane. The third method rests upon coherent state quantization of the cylinder viewed as the phase space of the motion on the circle. The construction of these coherent states depends on a family of probability distributions on the line.

  7. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  8. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control.

    Science.gov (United States)

    Buffa, F M; Nahum, A E

    2000-10-01

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, sigma(d); whilst the quantities d and sigma(d) depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing the biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10^8 from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error
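
    As a toy illustration of the noise effect discussed above, the sketch below applies a Poisson tumour-control model with a simple exponential cell-survival curve to a uniform dose grid and then adds zero-mean Gaussian noise of increasing relative size. All radiobiological parameters (alpha, clonogen density, prescription dose) are assumed for illustration and are not the paper's values; the point is only that the TCP computed from the noisy dose is systematically lower than the noise-free value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper).
alpha = 0.3                    # 1/Gy, radiosensitivity of the clonogens
clonogens_per_voxel = 1e4
n_voxels = 10_000
true_dose = np.full(n_voxels, 70.0)   # Gy, perfectly uniform "noise-free" dose

def poisson_tcp(dose):
    """Poisson TCP with exponential cell survival SF(D) = exp(-alpha*D)."""
    surviving = clonogens_per_voxel * np.exp(-alpha * dose)
    return float(np.exp(-surviving.sum()))

print(f"TCP without statistical noise: {poisson_tcp(true_dose):.3f}")

# Zero-mean Gaussian noise mimicking the per-voxel statistical error of an
# MC dose calculation; larger noise lowers the computed TCP systematically.
for rel_sigma in (0.01, 0.02, 0.05):
    noisy = true_dose * (1.0 + rng.normal(0.0, rel_sigma, n_voxels))
    print(f"relative sigma = {rel_sigma:.0%}  ->  TCP = {poisson_tcp(noisy):.3f}")
```

    The bias arises because the surviving-fraction term is convex in dose, so symmetric dose noise inflates the expected number of surviving clonogens.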

  9. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of the model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status; the degree of membership of the water quality in the various output fuzzy sets or classes is provided with percentiles and histograms, which allows the real water condition to be classified better. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies.
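
    Two of the numerical ingredients named above, kernel density estimation of a monitoring variable and Monte Carlo resampling from the fitted density, can be sketched in a few lines. The data below are synthetic stand-ins (the variable, units, and bandwidth rule are assumptions of the sketch); the fuzzy inference step that turns sampled variables into an index is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for one year of monitoring data of a single variable
# (e.g. dissolved oxygen in mg/L); real inputs would come from river records.
observations = rng.normal(5.5, 1.2, size=120).clip(min=0.0)

# Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth.
n = observations.size
bandwidth = 1.06 * observations.std(ddof=1) * n ** (-1 / 5)

def kde_pdf(x):
    """Evaluate the Gaussian KDE at the points in x."""
    z = (x[:, None] - observations[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))

def kde_sample(size):
    """Monte Carlo sampling from the KDE: pick an observation, jitter by the kernel."""
    picks = rng.choice(observations, size=size)
    return picks + rng.normal(0.0, bandwidth, size=size)

samples = kde_sample(10_000)
print("KDE bandwidth:", round(bandwidth, 3))
print("density at 5.5 mg/L:", round(float(kde_pdf(np.array([5.5]))[0]), 3))
print("MC sample mean / 5th / 95th percentile:",
      round(samples.mean(), 2),
      round(np.percentile(samples, 5), 2),
      round(np.percentile(samples, 95), 2))
```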

  10. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  11. The influence of cathode excavation of cathodic arc evaporator on thickness uniformity and erosion products angle distribution

    Directory of Open Access Journals (Sweden)

    D. V. Duhopel'nikov

    2014-01-01

    Full Text Available Cathodic arc evaporators are used to deposit functional coatings. Either elongated or butt-end evaporators may be used for this purpose. In a butt-end evaporator the cathode spots move continuously over the cathode working surface and evaporate the cathode material. A deep erosion (excavation) profile forms on the cathode working surface during the deposition of thick coatings (tens or hundreds of microns). For discharge-stabilization systems with an axial magnetic field, the excavation profile takes the shape of a "cup" with high walls, and the cathode spots move over the bottom of the "cup". It is very likely that the high "cup" walls that build up over long operation times influence the uniformity of the deposited films. In the present work the influence of the excavation-profile wall height, produced by prolonged operation of a DC vacuum arc evaporator, on the uniformity of the deposited coating was investigated. The cathode material used for the tests was 3003 aluminum alloy. An extended substrate was placed parallel to the cathode working surface. The thickness distribution along the substrate length was obtained with a new cathode, after 6 hours, and after 12 hours of continuous operation. The thickness distributions of the deposited coating showed that cathode excavation influences the angular distribution of the material leaving the cathode; this is clearly seen from the normalized dependence of the coating thickness on the distance from the substrate center. The angular distribution of the material flow from the cathode as a function of cathode working time was also obtained. It was shown that the material flow from the cathode deviates from the Lambert-Knudsen law, and the deeper the cathode excavation, the larger this deviation. Thus the cathode excavation profile affects the uniformity of the deposited coating and must be taken into account when depositing thick films.
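
    For reference, the Lambert-Knudsen (cosine) emission law that the measured angular distribution is compared against gives a simple closed-form thickness profile on a substrate parallel to the source. The sketch below assumes an idealized point source and an arbitrary source-to-substrate distance; it is meant only as the baseline from which the abstract says the eroded cathode deviates.

```python
import numpy as np

# Idealized geometry (assumed for illustration): a small cosine (Lambert)
# emitter at the origin, substrate plane parallel to the cathode surface at
# distance h; x is the lateral distance from the point directly above the source.
h = 0.20                              # m, source-to-substrate distance (assumed)
x = np.linspace(-0.3, 0.3, 13)        # m, positions along the substrate

# For a cosine emitter, thickness ~ cos(theta_emit) * cos(theta_inc) / r^2,
# and for parallel planes both cosines equal h / r with r^2 = h^2 + x^2,
# so t(x)/t(0) = 1 / (1 + (x/h)^2)^2.
r2 = h**2 + x**2
thickness = h**2 / r2**2
normalized = thickness / thickness.max()

for xi, ti in zip(x, normalized):
    print(f"x = {xi:+.2f} m  ->  t/t0 = {ti:.3f}")
```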

  12. Anisotropic pitch angle distribution of ~100 keV microburst electrons in the loss cone: measurements from STSAT-1

    Directory of Open Access Journals (Sweden)

    J. J. Lee

    2012-11-01

    Full Text Available Electron microburst energy spectra in the range of 170 keV to 360 keV have been measured using two solid-state detectors onboard the low-altitude (680 km), polar-orbiting Korean STSAT-1 (Science and Technology SATellite-1). Applying a unique capability of the spacecraft attitude control system, microburst energy spectra have been accurately resolved into two components: perpendicular to and parallel to the geomagnetic field direction. The former measures trapped electrons and the latter those electrons with pitch angles in the loss cone that precipitate into the atmosphere. It is found that the perpendicular-component energy spectra are harder than the parallel component and that the loss cone is not completely filled by the electrons in the energy range of 170 keV to 360 keV. These results have been modeled assuming a wave-particle cyclotron resonance mechanism, where higher energy electrons travelling within a magnetic flux tube interact with whistler mode waves at higher latitudes (lower altitudes). Our results suggest that because higher energy (relativistic) microbursts do not fill the loss cone completely, only a small portion of the electrons is able to reach the low-altitude (~100 km) atmosphere. Thus, assuming that low energy microbursts and relativistic microbursts are created by cyclotron resonance with chorus elements (but at different locations), the low energy portion of the microburst spectrum will dominate at low altitudes. This explains why relativistic microbursts have not been observed by balloon experiments, which typically float at altitudes of ~30 km and measure only the X-ray flux produced by collisions between neutral atmospheric particles and precipitating electrons.

  13. Conditional Probability Statistical Distributions in Variant Measurement Simulations

    Institute of Scientific and Technical Information of China (English)

    郑智捷

    2011-01-01

    Under the conditional probability model, intrinsic wave-like statistical distributions are observed in the spatial statistics under both normal conditions and interference conditions.

  14. Evaluation of the mercury contamination in mushrooms of genus Leccinum from two different regions of the world: Accumulation, distribution and probable dietary intake.

    Science.gov (United States)

    Falandysz, Jerzy; Zhang, Ji; Wang, Yuanzhong; Krasińska, Grażyna; Kojta, Anna; Saba, Martyna; Shen, Tao; Li, Tao; Liu, Honggao

    2015-12-15

    This study focused on investigation of the accumulation and distribution of mercury (Hg) in mushrooms of the genus Leccinum that emerged on soils of totally different geochemical bedrock composition. Hg in 6 species from geographically diverse regions of the mercuriferous belt areas in Yunnan of SW China, and 8 species from the non-mercuriferous regions of Poland in Europe was measured. Also assessed was the probable dietary intake of Hg from consumption of Leccinum spp., which are traditional organic food items in SW China and Poland. The results showed that L. chromapes, L. extremiorientale, L. griseum and L. rugosicepes are good accumulators of Hg and the sequestered Hg in caps were up to 4.8, 3.5, 3.6 and 4.7 mg Hg kg^-1 dry matter respectively. Leccinum mushrooms from Poland also efficiently accumulated Hg with their average Hg content being an order of magnitude lower due to low concentrations of Hg in forest topsoil of Poland compared to the elevated contents in Yunnan. Consumption of Leccinum mushrooms with elevated Hg contents in Yunnan at rates of up to 300 g fresh product per week during the foraging season would not result in Hg intake that exceeds the provisional weekly tolerance limit of 0.004 mg kg^-1 body mass, assuming no Hg ingestion from other foods.
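
    A back-of-envelope check of the intake statement above can be written as a few lines of arithmetic. The body mass and the dry-matter fraction of fresh mushrooms are assumptions of this sketch, not values from the study.

```python
# Rough check of the dietary-exposure statement in the abstract.
# Assumptions (not from the paper): 60 kg body mass, ~10% dry matter in
# fresh mushrooms; Hg content taken as the highest cap value reported.
hg_dry = 4.8            # mg Hg per kg dry matter (highest cap value quoted)
fresh_per_week = 0.300  # kg fresh mushrooms per week
dry_fraction = 0.10     # assumed dry-matter content of fresh mushrooms
body_mass = 60.0        # kg, assumed consumer body mass

intake = hg_dry * fresh_per_week * dry_fraction   # mg Hg per week
intake_per_kg_bw = intake / body_mass
print(f"estimated intake: {intake_per_kg_bw:.4f} mg/kg bw per week "
      f"(provisional tolerable limit: 0.004 mg/kg bw)")
```

    With these assumptions the estimate comes out below the tolerance limit, consistent with the conclusion quoted in the abstract.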

  15. Summed Probability Distribution of 14C Dates Suggests Regional Divergences in the Population Dynamics of the Jomon Period in Eastern Japan.

    Directory of Open Access Journals (Sweden)

    Enrico R Crema

    Full Text Available Recent advances in the use of summed probability distributions (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlations with climatic changes.
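
    The summing step behind an SPD can be sketched with Gaussian stand-ins for the calibrated date densities; a real analysis calibrates each 14C determination against a curve such as IntCal and adds a permutation test across regions, neither of which is reproduced here. All dates below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-ins for calibrated 14C date densities: each date is represented
# by a Gaussian on the calendar axis (real analyses calibrate against IntCal).
cal_bp = np.arange(3000, 7001, 10)                  # calendar years BP grid
date_means = rng.uniform(3500, 6800, size=200)      # synthetic site dates
date_sigmas = rng.uniform(30, 80, size=200)         # synthetic calibrated widths

def summed_probability(grid, means, sigmas):
    """Sum the per-date densities and normalize to unit area on the grid."""
    dens = np.exp(-0.5 * ((grid[:, None] - means[None, :]) / sigmas) ** 2)
    dens /= sigmas * np.sqrt(2 * np.pi)
    spd = dens.sum(axis=1)
    return spd / (spd.sum() * (grid[1] - grid[0]))

spd = summed_probability(cal_bp, date_means, date_sigmas)
peak = cal_bp[np.argmax(spd)]
print(f"SPD peaks near {peak} cal BP (toy data)")
```

    A regional comparison then amounts to computing one SPD per region and permuting the region labels of the dates to obtain a null band against which the observed divergences are judged.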

  16. The Chandra COSMOS Legacy Survey: Clustering of X-ray selected AGN at 2.9 < z < 5.5 using photometric redshift Probability Distribution Functions

    CERN Document Server

    Allevato, V; Finoguenov, A; Marchesi, S; Zamorani, G; Hasinger, G; Salvato, M; Miyaji, T; Gilli, R; Cappelluti, N; Brusa, M; Suh, H; Lanzuisi, G; Trakhtenbrot, B; Griffiths, R; Vignali, C; Schawinski, K; Karim, A

    2016-01-01

    We present the measurement of the projected and redshift space 2-point correlation function (2pcf) of the new catalog of Chandra COSMOS-Legacy AGN at 2.9$\\leq$z$\\leq$5.5 ($\\langle L_{bol} \\rangle \\sim$10$^{46}$ erg/s) using the generalized clustering estimator based on phot-z probability distribution functions (Pdfs) in addition to any available spec-z. We model the projected 2pcf estimated using $\\pi_{max}$ = 200 h$^{-1}$ Mpc with the 2-halo term and we derive a bias at z$\\sim$3.4 equal to b = 6.6$^{+0.60}_{-0.55}$, which corresponds to a typical mass of the hosting halos of log M$_h$ = 12.83$^{+0.12}_{-0.11}$ h$^{-1}$ M$_{\\odot}$. A similar bias is derived using the redshift-space 2pcf, modelled including the typical phot-z error $\\sigma_z$ = 0.052 of our sample at z$\\geq$2.9. Once we integrate the projected 2pcf up to $\\pi_{max}$ = 200 h$^{-1}$ Mpc, the bias of XMM and \\textit{Chandra} COSMOS at z=2.8 used in Allevato et al. (2014) is consistent with our results at higher redshift. The results suggest only...

  17. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  18. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  19. DISTRIBUTION OF SPIRAL GALAXIES ON VIEWING ANGLES

    Institute of Scientific and Technical Information of China (English)

    马骏; 宋国玄; 束成钢

    2000-01-01

    This paper presents the normalized inclination distributions for the spiral galaxies in RC3. The results show that, except for the bin of 81°~90°, in which the apparent minor isophotal diameters used to obtain the inclinations are affected by the central bulges, the distributions for Sa, Sab, Scd and Sd are consistent within 3σ with those from a Monte-Carlo simulation of random inclinations; Sb and Sbc are nearly consistent, but Sc is different. One reason for the difference between the real distribution and the Monte-Carlo distribution for Sc may be that some rather inclined spirals, whose arms are inherently loosely wound on the galactic plane and should be classified as Sc galaxies, have been incorrectly assigned to earlier types, because the tightness of the spiral arms, which is one of the criteria of the Hubble classification in RC3, differs between the galactic plane and the tangent plane of the celestial sphere. Our result also implies that there might exist biases in the luminosity functions of individual Hubble types if spiral galaxies are classified only visually.
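
    The Monte-Carlo reference distribution mentioned above follows from the fact that, for randomly oriented thin disks, cos i is uniform and the inclination i is therefore distributed as sin i. A minimal sketch (bin width chosen arbitrarily) that generates this reference histogram:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random orientation of a thin disk: cos(i) is uniform on [0, 1],
# so the inclination i (0 deg face-on, 90 deg edge-on) has a sin(i) density.
n_galaxies = 100_000
cos_i = rng.uniform(0.0, 1.0, n_galaxies)
incl_deg = np.degrees(np.arccos(cos_i))

counts, edges = np.histogram(incl_deg, bins=np.arange(0, 91, 10))
fractions = counts / counts.sum()
for lo, hi, f in zip(edges[:-1], edges[1:], fractions):
    print(f"{lo:2.0f}-{hi:2.0f} deg: {f:.3f}")
# The 80-90 deg bin holds the largest fraction (~17%), which is exactly the
# bin the record flags as biased by bulges in the measured axis ratios.
```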

  20. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  1. Characteristics of Pitch Angle Distributions of 100s keV Electrons in the Slot Region and Inner Radiation Belt

    Science.gov (United States)

    Zhao, H.; Li, X.; Blake, J. B.; Fennell, J.; Claudepierre, S. G.; Baker, D. N.; Jaynes, A. N.; Malaspina, D.

    2014-12-01

    The pitch angle distribution (PAD) of energetic electrons in the slot region and inner radiation belt received little attention in the past decades due to the lack of quality measurements. Using the state-of-the-art pitch-angle-resolved data from the Magnetic Electron Ion Spectrometer (MagEIS) instrument onboard the Van Allen Probes, a detailed analysis of 100s of keV electron PADs below L = 4 is performed, in which the PADs are categorized into three types: normal (flux peaking at 90°), cap (exceedingly peaking narrowly around 90°) and 90°-minimum (lower flux at 90°) PADs. By examining the characteristics of the PADs of 460 keV electrons for over a year, we find that the 90°-minimum PADs are generally present in the inner belt (Lbelt and relatively constant in the inner belt but changes significantly in the slot region (2mechanism can hardly explain the formation of 90°-minimum PADs at the center of the inner belt. These new and compelling observations, made possible by the high-quality measurements of MagEIS, present a challenge for wave modelers, and future work is still needed to fully understand them.

  2. Nuclear matter distributions in the {sup 6}He and {sup 8}He nuclei from differential cross sections for small-angle proton elastic scattering at intermediate energy

    Energy Technology Data Exchange (ETDEWEB)

    Alkhazov, G.D. E-mail: alkhazov@pcfarm.pnpi.spb.ru; Dobrovolsky, A.V.; Egelhof, P.; Geissel, H.; Irnich, H.; Khanzadeev, A.V.; Korolev, G.A.; Lobodenko, A.A.; Muenzenberg, G.; Mutterer, M.; Neumaier, S.R.; Schwab, W.; Seliverstov, D.M.; Suzuki, T.; Vorobyov, A.A

    2002-12-30

    A Glauber based analysis of the experimental cross sections for small-angle elastic p {sup 6,8}He scattering at 0.7 GeV has been performed. The radii and radial shape of the {sup 6}He and {sup 8}He nuclei have been determined using phenomenological nuclear density distributions with two free parameters. The deduced shapes of the {sup 6}He and {sup 8}He nuclear matter radial distributions conform with the concept that both nuclei consist of an {alpha}-particle core and a significant neutron halo. The accuracy of the theoretical analysis of the elastic-scattering cross-section data is discussed, and possible sources of systematic uncertainty related to some basic limitations in the applied method are outlined. The experimental p {sup 6,8}He elastic-scattering cross sections have also been utilized for probing the matter density distributions resulting from various nuclear microscopic models. Besides, the sensitivity of the total p {sup 6,8}He reaction cross sections to the size of the {sup 6}He and {sup 8}He nuclei has been considered.

  3. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  4. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  5. Volcano shapes, entropies, and eruption probabilities

    Science.gov (United States)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to
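
    The two entropy measures mentioned above can be illustrated directly. The profiles below are synthetic stand-ins for binned edifice shapes (a flat field, a broad shield, a steep cone); the Gibbs-Shannon formula is applied to the normalized bins, and the differential entropy of a normal-curve approximation is evaluated from its standard deviation.

```python
import numpy as np

def shannon_entropy(weights):
    """Gibbs-Shannon entropy of a discrete distribution (natural-log units)."""
    p = np.asarray(weights, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def normal_differential_entropy(sigma):
    """Differential entropy of a normal distribution with std sigma (nats)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# Three synthetic edifice profiles binned into 20 "vent location" bins.
x = np.linspace(-3, 3, 20)
flat = np.ones_like(x)                   # flat volcanic field or zone
broad = np.exp(-0.5 * (x / 1.5) ** 2)    # broad, shield-like edifice
steep = np.exp(-0.5 * (x / 0.5) ** 2)    # narrow, steep edifice

for name, profile in [("flat field", flat), ("broad shield", broad), ("steep cone", steep)]:
    print(f"{name:12s} entropy = {shannon_entropy(profile):.3f} nats")
# The flat field has the maximum entropy (greatest uncertainty about the next
# vent location); the steeper the edifice, the lower the entropy.

print("differential entropy, sigma = 1.5:", round(normal_differential_entropy(1.5), 3))
print("differential entropy, sigma = 0.5:", round(normal_differential_entropy(0.5), 3))
```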

  6. Extracting magnetic cluster size and its distributions in advanced perpendicular recording media with shrinking grain size using small angle x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Mehta, Virat; Ikeda, Yoshihiro; Takano, Ken; Terris, Bruce D.; Hellwig, Olav [San Jose Research Center, HGST a Western Digital company, 3403 Yerba Buena Rd., San Jose, California 95135 (United States); Wang, Tianhan [Department of Materials Science and Engineering, Stanford University, Stanford, California 94035 (United States); Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Wu, Benny; Graves, Catherine [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Department of Applied Physics, Stanford University, Stanford, California 94035 (United States); Dürr, Hermann A.; Scherz, Andreas; Stöhr, Jo [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States)

    2015-05-18

    We analyze the magnetic cluster size (MCS) and magnetic cluster size distribution (MCSD) in a variety of perpendicular magnetic recording (PMR) media designs using resonant small angle x-ray scattering at the Co L{sub 3} absorption edge. The different PMR media flavors considered here vary in grain size between 7.5 and 9.5 nm as well as in lateral inter-granular exchange strength, which is controlled via the segregant amount. While for high inter-granular exchange, the MCS increases rapidly for grain sizes below 8.5 nm, we show that for increased amount of segregant with less exchange the MCS remains relatively small, even for grain sizes of 7.5 and 8 nm. However, the MCSD still increases sharply when shrinking grains from 8 to 7.5 nm. We show evidence that recording performance such as signal-to-noise-ratio on the spin stand correlates well with the product of magnetic cluster size and magnetic cluster size distribution.

  7. Assessment of evaluated (n,d) energy-angle elastic scattering distributions using MCNP simulations of critical measurements and simplified calculation benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Kozier, K.S. [Atomic Energy of Canada Limited, Chalk River Laboratories, Chalk River, Ontario (Canada)

    2008-07-01

    Different evaluated (n,d) energy-angle elastic scattering distributions produce k-effective differences in MCNP5 simulations of critical experiments involving heavy water (D{sub 2}O) of sufficient magnitude to suggest a need for new (n,d) scattering measurements and/or distributions derived from modern theoretical nuclear models, especially at neutron energies below a few MeV. The present work focuses on the small reactivity change of < 1 mk that is observed in the MCNP5 D{sub 2}O coolant-void-reactivity calculation bias for simulations of two pairs of critical experiments performed in the ZED-2 reactor at the Chalk River Laboratories when different nuclear data libraries are used for deuterium. The deuterium data libraries tested include Endf/B-VII.0, Endf/B-VI.4, JENDL-3.3 and a new evaluation, labelled Bonn-B, which is based on recent theoretical nuclear-model calculations. Comparison calculations were also performed for a simplified, two-region, spherical model having an inner, 250-cm radius, homogeneous sphere of UO{sub 2}, without and with deuterium, and an outer 20-cm-thick deuterium reflector. A notable observation from this work is the reduction of about 0.4 mk in the MCNP5 ZED-2 CVR calculation bias that is obtained when the O-in-UO{sub 2} thermal scattering data comes from Endf/B-VII.0. (author)

  8. Zeros of Schrödinger's Radial Function Rnl(r) and Kummer's Function 1F1(-a; c; z) and Their "Angle" Distributions

    Science.gov (United States)

    Tarasov, V. F.

    In the present paper exact formulae for the calculation of the zeros of Rnl(r) and 1F1(-a; c; z), where z = 2λr, a = n - l - 1 >= 0 and c = 2l + 2 >= 2, are presented. For larger a (a > 4) numerical methods are employed to obtain the results (to within 10^-15). For greater geometrical clarity of the irregular distribution (as a > 3) of the zeros xk = zk - (c + a - 1) on the axis y = 0, circular diagrams with radius Ra = (a - 1)√(c + a - 1) are presented for the first time. It is possible to notice some singularities in the distribution of these zeros and of their images - the points Tk - on the circle. For a = 3 and 4 their exact "angle" asymptotics (as c --> ∞) are obtained. It is shown that on the basis of the L. Ferrari, L. Euler and J.-L. Lagrange methods, used for solving the equation 1F1(-4; c; z) = 0, one equation (common to all these methods) is obtained, viz., the cubic resolvent equation of FEL type. Calculation of the zeros xk of the Rnl(r) and 1F1(z) functions enables us to exhibit the "singular" cases (a, c) = (4, 6), (6, 4), (8, 14), ...

  9. The Properties and Distribution of Eyjafjallajökull Volcanic Ash, as Observed with MISR Space-based Multi-angle Imaging, April-May 2010 (Invited)

    Science.gov (United States)

    Kahn, R. A.; Gaitley, B. J.; Nelson, D. L.; Garay, M. J.; Misr Team

    2010-12-01

    Although volcanic eruptions occur about once per week globally, on average, relatively few of them affect the daily lives of millions of people. Significant exceptions were two eruptions of the Eyjafjallajökull volcano in southern Iceland, which produced ash clouds lasting several weeks during each of April and May 2010. During the first eruption, air traffic over most of Europe was halted, severely affecting international transportation, trade, and economics. For the second ash cloud, space-based and suborbital observations, together with aerosol transport modeling, were used to predict ash plume distribution, making it possible to selectively close only the limited airspace in which there was actual risk of significant ash exposure. These events highlight the immense value of aerosol measurement and modeling capabilities when integrated and applied in emergency response situations. Geosynchronous satellite and continuous, ground-based observations played the most immediate roles in constraining model ash-cloud-extent predictions. However, the rich information content of large-scale though less frequent observations from instruments such as the NASA Earth Observing System’s Multi-angle Imaging SpectroRadiometer (MISR) are key to improving the underlying representations of processes upon which the plume transport models rely. MISR contributes to this pool of information by providing maps of plume height derived from stereo imaging that are independent of knowledge of the temperature structure of the atmosphere or assumptions that the ash cloud is in thermal equilibrium with the environment. Such maps are obtained primarily near-source, where features of the ash cloud can be observed and co-registered in the multi-angle views. A distribution of heights is produced, making it possible to report all-important layer extent rather than just a characteristic plume elevation. Results are derived at 1.1 km horizontal and about 0.5 km vertical resolution. In addition

  10. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, central limit theorem, conditional expectation and Martingale theory, and an introduction to stochastic processes.

  11. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    Energy Technology Data Exchange (ETDEWEB)

    Shiinoki, T; Hanazawa, H; Park, S; Takahashi, T; Shibuya, K [Yamaguchi University, Ube, Yamaguchi (Japan); Kawamura, S; Uehara, T; Yuasa, Y; Koike, M [Yamaguchi University Hospital, Ube, Yamaguchi (Japan)

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers closely implanted to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using the log files of all beams per fraction (PDFn). The fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of the tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and it decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
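
    The reproducibility metric described above, a Kullback-Leibler divergence between the first fraction's motion PDF and a later fraction's PDF, can be sketched as follows. The tracking logs here are synthetic sinusoidal traces, and the bin width, direction, and baseline drift are assumptions of the sketch, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

def motion_pdf(samples, edges):
    """Histogram-based motion PDF with a small floor to avoid log(0)."""
    counts, _ = np.histogram(samples, bins=edges)
    p = counts.astype(float) + 1e-9
    return p / p.sum()

def kl_divergence(p, q):
    return float(np.sum(p * np.log(p / q)))

edges = np.linspace(-15.0, 15.0, 61)   # 0.5 mm bins in the SI direction (assumed)

# Synthetic stand-ins for logged marker positions (mm): fraction 1 vs. a later
# fraction whose baseline has drifted by 1 mm and whose amplitude has grown.
t = np.linspace(0, 200 * np.pi, 20_000)
fraction1 = 5.0 * np.sin(t) + rng.normal(0, 0.3, t.size)
fractionN = 6.0 * np.sin(t) + 1.0 + rng.normal(0, 0.3, t.size)

p1 = motion_pdf(fraction1, edges)
pN = motion_pdf(fractionN, edges)
print("KL(PDF1 || PDFn) =", round(kl_divergence(p1, pN), 3))
```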

  12. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. The study of the (α, α'f) reaction at 120 MeV on 232Th and 238U (II) : Fission barrier properties deduced from fission probabilities and angular distributions

    NARCIS (Netherlands)

    Plicht, J. van der; Harakeh, M.N.; van der Woude, Adriaan; David, P.; Debrus, J.; Janszen, H.; Schulze, J.

    1981-01-01

    The fission probabilities and angular distributions of the fission fragments for the (α, α'f) reaction on 232Th and 238U at a bombarding energy of 120 MeV have been measured from about 4 to 14 MeV excitation energy. Evidence for sub-barrier resonances has been found, the negative parity ones occurri

  14. The study of the (α, α’f) reaction at 120 MeV on 232Th and 238U (I) : Fission probabilities and angular distributions in the region of the giant quadrupole resonances

    NARCIS (Netherlands)

    Plicht, J. van der; Harakeh, M.N.; van der Woude, Adriaan; David, P.; Debrus, J.; Janszen, H.; Schulze, J.

    1980-01-01

    The fission decay channel of 232Th and 238U has been investigated, using the (α, α’f) reaction at 120 MeV bombarding energy. The angular distributions of the fission fragments and the fission probabilities up to around 15 MeV excitation have been measured. No evidence for the fission decay of the gi

  15. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
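
    A quick numerical illustration of the contrast between stretched-exponential and power-law tails among the listed families can be obtained by inverse-transform sampling; the shape parameters below are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
u = rng.uniform(0.0, 1.0, 1_000_000)

# Inverse-transform samples on the positive half-line (shape parameters assumed).
weibull = (-np.log(1.0 - u)) ** (1.0 / 0.7)   # Weibull, shape k = 0.7
frechet = (-np.log(u)) ** (-1.0 / 1.5)        # Frechet, tail index 1.5
pareto = (1.0 - u) ** (-1.0 / 1.5)            # Pareto, tail index 1.5, x_min = 1

for name, x in [("Weibull", weibull), ("Frechet", frechet), ("Pareto", pareto)]:
    print(f"{name:8s} P(X > 10) = {np.mean(x > 10):.2e}   "
          f"P(X > 100) = {np.mean(x > 100):.2e}")
# The stretched-exponential Weibull tail has essentially vanished by x ~ 100,
# while the power-law Frechet and Pareto tails still carry ~1e-3 of the mass,
# which is the statistical power-law structure the record describes.
```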

  16. Accurate Size and Size-Distribution Determination of Polystyrene Latex Nanoparticles in Aqueous Medium Using Dynamic Light Scattering and Asymmetrical Flow Field Flow Fractionation with Multi-Angle Light Scattering

    Directory of Open Access Journals (Sweden)

    Shinichi Kinugasa

    2012-01-01

    Full Text Available Accurate determination of the intensity-average diameter of polystyrene latex (PS-latex) by dynamic light scattering (DLS) was carried out through extrapolation of both the concentration of PS-latex and the observed scattering angle. Intensity-average diameter and size distribution were reliably determined by asymmetric flow field flow fractionation (AFFFF) using multi-angle light scattering (MALS) with consideration of band broadening in AFFFF separation. The intensity-average diameter determined by DLS and AFFFF-MALS agreed well within the estimated uncertainties, although the size distribution of PS-latex determined by DLS was less reliable in comparison with that determined by AFFFF-MALS.
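
    The double extrapolation described above (apparent diameter taken to zero concentration and zero scattering angle) amounts to fitting the measured diameters as a linear function of concentration and of sin²(θ/2) and reading off the intercept. The sketch below uses synthetic apparent diameters with assumed slopes and noise, purely to illustrate the fitting step.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic apparent hydrodynamic diameters (nm) at several concentrations
# c (mg/mL) and scattering angles (deg); slopes and noise are assumed.
c = np.array([0.5, 1.0, 2.0, 4.0])
angles = np.array([50.0, 90.0, 130.0])
true_d = 102.0                           # nm, value the extrapolation should recover

rows = []
for ci in c:
    for ang in angles:
        q2 = np.sin(np.radians(ang) / 2) ** 2   # proportional to q^2
        d_app = true_d * (1 + 0.015 * ci + 0.05 * q2) + rng.normal(0, 0.3)
        rows.append((ci, q2, d_app))
rows = np.array(rows)

# Fit d_app = d0 + a*c + b*q2 and read off the double-extrapolated intercept d0.
A = np.column_stack([np.ones(len(rows)), rows[:, 0], rows[:, 1]])
coef, *_ = np.linalg.lstsq(A, rows[:, 2], rcond=None)
print(f"extrapolated diameter at c -> 0, angle -> 0: {coef[0]:.1f} nm")
```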

  17. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  18. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  19. NUMERICAL SIMULATION OF ORIENTATION DISTRIBUTION FUNCTION OF CYLINDRICAL PARTICLE SUSPENSIONS

    Institute of Scientific and Technical Information of China (English)

    林建忠; 张凌新

    2002-01-01

    The orientation distribution function of cylindrical particle suspensions was deduced and numerically simulated, and applied to a wedge-shaped flow field. The relationship between the orientation distribution function and the particle orientation angles was obtained. The results show that, compared with the most probable angle distribution which develops within a short time, the steady-state distribution does not vary much in range; the main difference is an anti-clockwise rotation in the right and upper part of the field, that is, particles rotate more at points where the velocity gradients are larger. The most probable orientations are close to the direction of the local streamlines. Along the streamline direction, as the polar radius decreases, the most probable angles increase, but the angles between these orientations and the local streamlines decrease.

  20. Distribution of functional groups in periodic mesoporous organosilica materials studied by small-angle neutron scattering with in situ adsorption of nitrogen

    Directory of Open Access Journals (Sweden)

    Monir Sharifi

    2012-05-01

    Full Text Available Periodic mesoporous materials of the type (R′O)3Si-R-Si(OR′)3 with benzene as an organic bridge and a crystal-like periodicity within the pore walls were functionalized with SO3H or SO3− groups and investigated by small-angle neutron scattering (SANS) with in situ nitrogen adsorption at 77 K. If N2 is adsorbed in the pores the SANS measurements show a complete matching of all of the diffraction signals that are caused by the long-range ordering of the mesopores in the benzene-PMO, due to the fact that the benzene-PMO walls possess a neutron scattering length density (SLD) similar to that of nitrogen in the condensed state. However, signals at higher q-values (>1 Å−1) are not affected with respect to their SANS intensity, even after complete pore filling, confirming the assumption of a crystal-like periodicity within the PMO material walls due to π–π interactions between the organic bridges. The SLD of pristine benzene-PMO was altered by functionalizing the surface with different amounts of SO3H-groups, using the grafting method. For a low degree of functionalization (0.81 mmol SO3H·g−1) and/or an inhomogeneous distribution of the SO3H-groups, the SLD changes only negligibly, and thus, complete contrast matching is still found. However, for higher amounts of SO3H-groups (1.65 mmol SO3H·g−1) being present in the mesopores, complete matching of the neutron diffraction signals is no longer observed, proving that homogeneously distributed SO3H-groups on the inner pore walls of the benzene-PMO alter the SLD in a way that it no longer fits the SLD of the condensed N2.

  1. Three-dimensional shapes and distribution of FePd nanoparticles observed by electron tomography using high-angle annular dark-field scanning transmission electron microscopy

    Science.gov (United States)

    Sato, Kazuhisa; Aoyagi, Kenta; Konno, Toyohiko J.

    2010-01-01

    We have studied three-dimensional shapes and distribution of FePd nanoparticles, prepared by electron beam deposition and postdeposition annealing, by means of single-axis tilt tomography using atomic number contrasts obtained by high-angle annular dark-field scanning transmission electron microscopy. Particle size, shape, and locations were reconstructed by weighted backprojection (WBP), as well as by simultaneous iterative reconstruction technique (SIRT). We have also estimated the particle size by simple extrapolation of tilt-series original data sets, which proved to be quite powerful. The results of the two algorithms for reconstruction have been compared quantitatively with those obtained by the extrapolation method and those independently reported by electron holography. It was found that the reconstructed intensity map by WBP contains a small amount of dotlike artifacts, which do not exist in the results by SIRT, and that the particle surface obtained by WBP is rougher than that by SIRT. We demonstrate, on the other hand, that WBP yields a better estimation of the particle size in the z direction than SIRT does, most likely due to the presence of a "missing wedge" in the original data set.

  2. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  3. Approach for simultaneous measurement of two-dimensional angular distribution of charged particles. III. Fine focusing of wide-angle beams in multiple lens systems

    Science.gov (United States)

    Matsuda, Hiroyuki; Daimon, Hiroshi; Tóth, László; Matsui, Fumihiko

    2007-04-01

    This paper provides a way of focusing wide-angle charged-particle beams in multiple lens systems. In previous papers [H. Matsuda , Phys. Rev. E 71, 066503 (2005); 74, 036501 (2006)], it was shown that an ellipsoidal mesh, combined with electrostatic lenses, enables correction of spherical aberration over wide acceptance angles up to ±60° . In this paper, practical situations where ordinary electron lenses are arranged behind the wide-angle electrostatic lenses are taken into account using ray tracing calculation. For practical realization of the wide-angle lens systems, the acceptance angle is set to ±50° . We note that the output beams of the wide-angle electrostatic lenses have somewhat large divergence angles which cause unacceptable or non-negligible spherical aberration in additional lenses. A solution to this problem is presented showing that lens combinations to cancel spherical aberration are available, whereby wide-angle charged-particle beams can be finely focused with considerably reduced divergence angles less than ±5° .

  4. Energetic Electron Pitch Angle Diffusion due to Whistler Wave during Terrestrial Storms

    Institute of Scientific and Technical Information of China (English)

    XIAO Fu-Liang; HE Hui-Yong

    2006-01-01

    A concise and elegant expression for cyclotron harmonic resonant quasi-pure pitch-angle diffusion is constructed for parallel-propagating whistler mode waves, and the quasi-linear diffusion coefficient is prescribed in terms of the whistler mode wave spectral intensity. Numerical computations are performed for the specific case of energetic electrons interacting with a band of whistler mode turbulence at L ≈ 3. It is found that the quasi-pure pitch-angle diffusion driven by the whistler mode scatters energetic electrons from larger pitch angles into the loss cone, and causes the pitch-angle distribution to evolve from the pancake shape found before the terrestrial storms to the flat-top shape seen during the main phase. This probably accounts for the quasi-isotropic pitch-angle distribution observed by the Combined Release and Radiation Effects Satellite (CRRES) spacecraft at L ≈ 3.

  5. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  6. Probability distribution of intersymbol distances in random symbolic sequences: Applications to improving detection of keywords in texts and of amino acid clustering in proteins.

    Science.gov (United States)

    Carpena, Pedro; Bernaola-Galván, Pedro A; Carretero-Campos, Concepción; Coronado, Ana V

    2016-11-01

    Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.
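
    As a hedged illustration of the quantity being studied (not the authors' exact finite-size result), the following Python sketch measures the empirical distribution of distances between consecutive occurrences of a symbol in a random sequence and compares it with the geometric distribution commonly taken as the random reference; the sequence length and symbol frequency are assumed values.

        import numpy as np

        rng = np.random.default_rng(0)
        N, p = 2000, 0.05                      # assumed sequence length and symbol frequency
        seq = rng.random(N) < p                # random placement of the symbol
        pos = np.flatnonzero(seq)
        gaps = np.diff(pos)                    # intersymbol distances
        d = np.arange(1, gaps.max() + 1)
        empirical = np.array([(gaps == k).mean() for k in d])
        geometric = (1 - p) ** (d - 1) * p     # large-N, low-frequency reference
        print(np.round(empirical[:5], 3))
        print(np.round(geometric[:5], 3))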

  7. Probability distribution of intersymbol distances in random symbolic sequences: Applications to improving detection of keywords in texts and of amino acid clustering in proteins

    Science.gov (United States)

    Carpena, Pedro; Bernaola-Galván, Pedro A.; Carretero-Campos, Concepción; Coronado, Ana V.

    2016-11-01

    Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.

  8. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
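
    The centre-of-gravity (COG) defuzzification step discussed above can be made concrete with a short sketch; the triangular output fuzzy set below is an arbitrary assumption, not taken from the paper, and serves only to show that COG returns the mean of the induced "inner" distribution, which is why it is optimal in the mean-square sense.

        import numpy as np

        x = np.linspace(0.0, 10.0, 1001)
        mu = np.clip(1.0 - np.abs(x - 6.0) / 3.0, 0.0, None)   # triangular membership centred at 6
        cog = np.trapz(x * mu, x) / np.trapz(mu, x)            # crisp output: mean of x weighted by mu
        print(round(float(cog), 3))                            # close to 6.0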

  9. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  10. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. An explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.

  11. Probability distribution estimation for Web service QoS based on the maximum entropy principle

    Institute of Scientific and Technical Information of China (English)

    代志华; 付晓东; 黄袁; 贾楠

    2012-01-01

    To manage service risk, it is necessary to know the stochastic character of Quality of Service (QoS), which is effectively described by an accurate probability distribution. This paper presents an approach for estimating the probability distribution of Web service QoS from a small number of samples. Using the maximum entropy principle, the analytical form of the probability density function is obtained by transforming the distribution estimation problem into an optimization problem whose constraints are determined by the sampled QoS data. An algorithm is then designed to estimate the parameters of this probability density function. Experimental and simulation results based on real Web service QoS data show that the approach is effective and reasonable for estimating the distributions of different QoS attributes, and validate the efficiency and termination of the distribution estimation algorithm.

  12. The Probability Distribution of the Maximum Amount of Daily Precipitation During 20 Days in Summer of the Huaihe Basins

    Institute of Scientific and Technical Information of China (English)

    梁莉; 赵琳娜; 巩远发; 包红军; 王成鑫; 王志

    2011-01-01

    Using daily precipitation records of 158 stations over the Huaihe River basin for the summers of 1980-2007, five sub-catchments were selected: the upper Huaihe River, the reach from Wangjiaba Dam to Bengbu Station, the reach from Bengbu Station to Hongze Lake, the area downstream of Hongze Lake, and the Yishu River watershed. The gamma distribution function was used to analyse the probability distribution of summer precipitation on first rainy days (preceded by a dry day) and on consecutive rainy days (preceded by a wet day). Comparison of the gamma probability density with the sample frequencies at the representative stations Xixian, Fuyang, Shangqiu, Huaian and Lianyungang, together with Kolmogorov-Smirnov (K-S) tests, shows that the gamma distribution fits the conditional probability distribution of summer rainy days in the Huaihe River basin well, and that the probability distributions of the maximum daily precipitation within 1, 10 and 20 days derived from it are regular and reasonable. Among the five sub-catchments, the upper Huaihe River, the mid-lower Huaihe River and the Yishu River watershed have the highest probabilities that the maximum daily precipitation within 10 and 20 days reaches at least 10 mm, 25 mm and 50 mm.
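
    A minimal sketch of the fitting-and-testing step described above, using synthetic rainfall totals in place of the station records; the gamma parameters, sample size, wet-day threshold and exceedance level are assumed for illustration only.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        rain = rng.gamma(shape=2.0, scale=12.0, size=28 * 92)       # stand-in for daily summer totals
        rain = rain[rain > 0.1]                                     # keep wet days only
        a, loc, scale = stats.gamma.fit(rain, floc=0.0)             # fix location at zero for rainfall
        ks_stat, p_value = stats.kstest(rain, "gamma", args=(a, loc, scale))
        p_ge_25mm = stats.gamma.sf(25.0, a, loc, scale)             # chance a rainy day exceeds 25 mm
        print(round(ks_stat, 3), round(p_value, 3), round(p_ge_25mm, 3))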

  13. Estimation of discrete lifetime probability distributions from grouped censored data

    Institute of Scientific and Technical Information of China (English)

    侯超钧; 吴东庆; 王前; 杨志伟

    2012-01-01

    At present, grouped data from unknown discrete lifetime distributions have received little study. To avoid solving the complex system of nonlinear maximum likelihood equations, a probability distribution formula with a recursive structure is derived from the likelihood equations by introducing a Lagrange multiplier. The maximum likelihood estimate of p1 is then obtained from the degenerate single-interval model, after which the probabilities pi are computed successively by recursion. An experiment shows that the method is effective.

  14. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
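
    One classic problem of this kind, chosen here as an illustration and not necessarily one of the article's three, is the derangement problem: the probability that a random permutation leaves no item in its original place tends to 1/e.

        import math
        import random

        def no_fixed_point(n: int) -> bool:
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        trials = 100_000
        estimate = sum(no_fixed_point(20) for _ in range(trials)) / trials
        print(round(estimate, 4), round(1 / math.e, 4))   # both close to 0.3679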

  15. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This is in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...

  16. Measurement of the weak mixing angle and the spin of the gluon from angular distributions in the reaction pp → Z/γ* + X → μ⁺μ⁻ + X with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schmieden, Kristof

    2013-04-15

    The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp → Z/γ* + X → μ⁺μ⁻ + X. In total 4.7 fb⁻¹ of proton-proton collisions at √s = 7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system and are compared to several simulations as well as recent results obtained in p anti-p collisions. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.

  17. Comparison of probability distribution models and estimation of the probable rainfall for Barbacena County, MG

    Directory of Open Access Journals (Sweden)

    Bruno Teixeira Ribeiro

    2007-10-01

    Probabilistic studies involving climatic variables are extremely important for farming, civil construction, tourism and transport, among other activities. Seeking to contribute to the planning of irrigated agriculture, this work compares probability distributions fitted to ten-day and monthly historical series and estimates the probable rainfall for the municipality of Barbacena, Minas Gerais State, Brazil. The months of December, January and February were studied over the period 1942-2003, giving historical series with 62 years of observations. Daily rainfall depths were totalled over monthly and ten-day periods, and the two-parameter log-Normal, three-parameter log-Normal and Gamma distributions were fitted. The adequacy of the distributions in each period was assessed with the chi-square test at the 5% significance level. The probable rainfall was estimated for each period using the distribution with the smallest chi-square value, at exceedance probability levels of 75, 90 and 98%. The Gamma distribution provided the best fit to the data. The study of probable rainfall is a useful tool to support decisions on irrigation planning and use.

  18. Effects of temperature distribution on failure probability of coated particles in spherical fuel elements

    Institute of Scientific and Technical Information of China (English)

    张永栋; 林俊; 朱天宝; 张海青; 朱智勇

    2016-01-01

    Background: TRISO (tristructural isotropic) coated particles embedded in spherical fuel elements are used in the solid-fuel molten salt reactor. The temperature distribution during operation affects the failure probability of TRISO particles embedded in different parts of a fuel element. Purpose: This study aims to investigate the effect of the temperature distribution on the failure probability of the coated fuel particles. Methods: A micro-volume element analysis of the temperature distribution effect on the failure probability of coated particles was carried out for the first time, and the impact of the spherical fuel element size on the average failure probability of TRISO particles was also evaluated. Results: At a given power density, the calculated failure probability of TRISO particles deviates by an order of magnitude depending on whether the core temperature or the average temperature of the fuel element is used to compute the average failure probability. At the same power density and burnup, the average failure probability of the coated particles can be lowered by two orders of magnitude by reducing the diameter of the fuel element by 1 cm. Conclusion: It is necessary to take the temperature distribution into account when calculating the failure probability of coated fuel particles. In addition, the average failure probability of coated fuel particles can be lowered by reducing the size of the fuel element, which may be a proper way to keep the fuel elements working at high power densities.

  19. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  20. Research on the failure probability distribution of NC presses

    Institute of Scientific and Technical Information of China (English)

    张强; 马立强; 贾亚洲

    2001-01-01

    Special-purpose software for reliability analysis of NC machine tools has been developed, together with an operation and maintenance database for NC presses. The failure data verify that the failure distribution of the NC press follows an exponential distribution, and a method for calculating the reliability characteristics of the NC press is given.
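
    A hedged sketch of the reliability calculation implied above: with an exponential time-between-failures model, the failure rate is the reciprocal of the mean, and mission reliability follows directly. The failure-interval data below are assumed, not taken from the press database.

        import numpy as np

        tbf = np.array([120.0, 340.0, 95.0, 410.0, 210.0, 160.0])  # hours between failures (assumed)
        mtbf = tbf.mean()
        lam = 1.0 / mtbf                        # constant failure rate of the exponential model
        mission = 100.0                         # mission time in hours
        reliability = np.exp(-lam * mission)    # probability of no failure during the mission
        print(round(float(mtbf), 1), round(float(reliability), 3))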

  1. Monte Carlo method of macroscopic modulation of small-angle charged particle reflection from solid surfaces

    CERN Document Server

    Bratchenko, M I

    2001-01-01

    A novel method of Monte Carlo simulation of small-angle reflection of charged particles from solid surfaces has been developed. Instead of atomic-scale simulation of particle-surface collisions, the method treats the reflection macroscopically as a 'condensed history' event. Statistical parameters of the reflection are sampled from theoretical distributions over energy and angles. An efficient sampling algorithm based on a combination of the inverse probability distribution function method and the rejection method has been proposed and tested. As an example of application, the results of statistical modelling of the particle flux enhancement near the bottom of a vertical Wehner cone are presented and compared with a simple geometrical model of specular reflection.
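
    The sampling idea named above, inverse-CDF sampling combined with rejection, can be sketched as follows; the cosine-law angular density and the extra acceptance weight are assumptions made purely for illustration and are not the reflection distributions used in the paper.

        import math
        import random

        def sample_polar_angle() -> float:
            # inverse-CDF step: for pdf ∝ cos(t)·sin(t) on [0, π/2], F(t) = sin²(t), so t = arcsin(√u)
            u = random.random()
            return math.asin(math.sqrt(u))

        def sample_with_rejection() -> float:
            # rejection step: thin the proposal with an extra acceptance factor g(t) ≤ 1
            while True:
                t = sample_polar_angle()
                if random.random() <= math.cos(t):   # illustrative acceptance weight
                    return t

        print([round(sample_with_rejection(), 3) for _ in range(5)])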

  2. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived, so that the probability of exceeding the vibration criteria VC-E and VC-D is kept below 0.04.
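
    A minimal sketch of the probability statement in the abstract: with a zero-mean Gaussian model for the relative displacement, the chance of exceeding a vibration criterion is a normal tail probability. The RMS value and criterion below are assumed numbers, not the article's.

        from scipy.stats import norm

        sigma = 0.4                                  # RMS relative displacement, micrometres (assumed)
        criterion = 0.8                              # allowable displacement amplitude (assumed)
        p_exceed = 2.0 * norm.sf(criterion / sigma)  # two-sided exceedance for a zero-mean response
        print(round(p_exceed, 4))                    # about 0.0455 for these numbers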

  3. VizieR Online Data Catalog: Jet angles and gamma-ray energetics estimations (Goldstein+, 2016)

    Science.gov (United States)

    Goldstein, A.; Connaughton, V.; Briggs, M. S.; Burns, E.

    2016-04-01

    We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies. (1 data file).

  4. ESTIMATING LONG GRB JET OPENING ANGLES AND REST-FRAME ENERGETICS

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, Adam [Space Science Office, VP62, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Connaughton, Valerie [Science and Technology Institute, Universities Space Research Association, Huntsville, AL 35805 (United States); Briggs, Michael S.; Burns, Eric, E-mail: adam.m.goldstein@nasa.gov [Center for Space Plasma and Aeronomic Research, University of Alabama in Huntsville, 320 Sparkman Drive, Huntsville, AL 35899 (United States)

    2016-02-10

    We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.

  5. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
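
    For concreteness, a short sketch of scoring-rule evaluation of probability forecasts; the Brier score is used here as a familiar example, whereas the paper itself develops a different, loss-function and martingale based approach, and the forecast values below are assumed.

        import numpy as np

        forecasts = np.array([0.9, 0.7, 0.2, 0.5, 0.1])   # predicted event probabilities (assumed)
        outcomes = np.array([1, 1, 0, 0, 0])              # observed occurrences
        brier = np.mean((forecasts - outcomes) ** 2)
        print(round(float(brier), 3))                     # lower is better; 0 is a perfect forecaster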

  6. Probability Distribution of Precipitation Extremes over the Yangtze River Basin, 1960-2005

    Institute of Scientific and Technical Information of China (English)

    苏布达; Marco Gemmer; 姜彤

    2008-01-01

    Based on the daily observational precipitation data of 147 stations in the Yangtze River basin for 1960-2005, and the projected daily data of 79 grids from ECHAM5/MPI-OM for the 20th century, time series of precipitation extremes containing the annual maximum (AM) and the Munger index (MI) were constructed. The distribution features of precipitation extremes were analyzed based on the two index series. The results show that (1) the intensity and probability of extreme heavy precipitation are higher in the middle Mintuo River sub-catchment, the Dongting Lake area, the mid-lower main stream section of the Yangtze River, and the southeastern Poyang Lake sub-catchment, whereas the intensity and probability of drought events are higher in the mid-lower Jinsha River sub-catchment and the Jialing River sub-catchment; (2) compared with observational data, the average value of AM is higher but the deviation coefficient is lower in the projected data, and the center of precipitation extremes moves northwards; (3) in spite of certain differences in the spatial distributions of observed and projected precipitation extremes, applying the General Extreme Value (GEV) and Wakeby (WAK) models with the L-moment estimator (LME) to the precipitation extremes proves that WAK can simulate the probability distribution of precipitation extremes calculated from both observed and projected data quite well. The WAK model could therefore be an important tool for estimating precipitation extremes in the Yangtze River basin under future climate scenarios.
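
    As a hedged sketch of the block-maxima step described above, the following fits a GEV distribution to synthetic annual maxima and derives a return level; scipy's maximum-likelihood fit is used here in place of the L-moment estimator named in the abstract, and all numerical values are assumed.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        annual_max = stats.genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=46, random_state=rng)
        c, loc, scale = stats.genextreme.fit(annual_max)                     # fitted GEV parameters
        level_50yr = stats.genextreme.ppf(1.0 - 1.0 / 50.0, c, loc, scale)   # 50-year return level
        print(round(float(level_50yr), 1))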

  7. Ruin Probability and Asymptotic Estimate of a Compound Binomial Distribution Model

    Institute of Scientific and Technical Information of China (English)

    许璐; 赵闻达

    2012-01-01

    Classical probability theory is used, through a suitable mathematical model, to derive an explicit solution for the ultimate ruin probability in a compound binomial distribution model, and its asymptotic estimate is obtained. The conclusion improves results in the related literature.
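
    The quantity studied above can be illustrated by a Monte Carlo sketch of a compound-binomial-type surplus process; the paper derives an explicit solution, whereas this simulation only estimates the ruin probability under assumed premium, claim probability and claim-size distribution.

        import random

        def ruined(u: float, periods: int = 1000, premium: float = 1.0,
                   p_claim: float = 0.3, mean_claim: float = 3.0) -> bool:
            surplus = u
            for _ in range(periods):
                surplus += premium                                    # premium income per period
                if random.random() < p_claim:
                    surplus -= random.expovariate(1.0 / mean_claim)   # claim payment
                if surplus < 0.0:
                    return True
            return False

        trials = 20_000
        print(sum(ruined(10.0) for _ in range(trials)) / trials)     # estimated ruin probability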

  8. Empirical and Computational Tsunami Probability

    Science.gov (United States)

    Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.

    2008-12-01

    A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical

  9. Statistical study of the pulse width distribution for radio pulsars

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Pulse widths of standard pulse profiles for 262 pulsars were measured by using the Urumqi 25 m radio telescope at 1.54 GHz. For the simplest case of circular emission beam, we applied Monte Carlo simulations to the pulse width distribution. Different density functions of magnetic inclination angle α and observer angle ξ were considered. Using Kolmogorov-Smirnov tests, we derived the most probable distribution for α and ξ.

  10. Quantifying bid-ask spreads in the Chinese stock market using limit-order book data: Intraday pattern, probability distribution, long memory, and multifractal nature

    CERN Document Server

    Gu, G F; Zhou, W X; Chen, Wei; Gu, Gao-Feng; Zhou, Wei-Xing

    2006-01-01

    The statistical properties of the bid-ask spread of a frequently traded Chinese stock listed on the Shenzhen Stock Exchange are investigated using the limit-order book data. Three different definitions of spread are considered, based on the time right before transactions, the time whenever the highest buying price or the lowest selling price changes, and a fixed time interval. The results are qualitatively similar whether linear or logarithmic prices are used. The average spread exhibits evident intraday patterns consisting of a big L-shape in the morning and a small L-shape in the afternoon. The distributions of the spread with different definitions decay as power laws. The tail exponents of spreads at the transaction level are well within the interval (2, 3), and those of average spreads are well in line with the inverse cubic law for different time intervals. Based on the detrended fluctuation analysis, we find evidence of long memory in the bid-ask spread time series for all three definitions, even aft...

  11. Quantifying bid-ask spreads in the Chinese stock market using limit-order book data. Intraday pattern, probability distribution, long memory, and multifractal nature

    Science.gov (United States)

    Gu, G.-F.; Chen, W.; Zhou, W.-X.

    2007-05-01

    The statistical properties of the bid-ask spread of a frequently traded Chinese stock listed on the Shenzhen Stock Exchange are investigated using the limit-order book data. Three different definitions of spread are considered based on the time right before transactions, the time whenever the highest buying price or the lowest selling price changes, and a fixed time interval. The results are qualitatively similar no matter linear prices or logarithmic prices are used. The average spread exhibits evident intraday patterns consisting of a big L-shape in morning transactions and a small L-shape in the afternoon. The distributions of the spread with different definitions decay as power laws. The tail exponents of spreads at transaction level are well within the interval (2,3) and that of average spreads are well in line with the inverse cubic law for different time intervals. Based on the detrended fluctuation analysis, we found the evidence of long memory in the bid-ask spread time series for all three definitions, even after the removal of the intraday pattern. Using the classical box-counting approach for multifractal analysis, we show that the time series of bid-ask spread do not possess multifractal nature.

  12. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  13. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
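
    A rough sketch of the idea behind the augmented plot: simulate normal order statistics to obtain intervals for every plotted point and check whether the standardized sample stays inside all of them. The Bonferroni-style calibration used below is an assumption for illustration, not the paper's exact construction of simultaneous intervals.

        import numpy as np

        rng = np.random.default_rng(3)
        n, alpha, n_sim = 30, 0.05, 5000
        sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)           # simulated normal order statistics
        lo, hi = np.quantile(sims, [alpha / (2 * n), 1 - alpha / (2 * n)], axis=0)
        sample = np.sort(rng.standard_normal(n))                          # data to be assessed
        z = (sample - sample.mean()) / sample.std(ddof=1)                 # standardize before comparing
        print(bool(np.all((z >= lo) & (z <= hi))))                        # True: consistent with normality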

  14. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  15. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  16. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  17. Research on dynamic depreciation of medical equipment based on the χ² distribution probability density function

    Institute of Scientific and Technical Information of China (English)

    邓厚斌; 葛毅; 范璐敏; 刘晓雯; 李盈盈

    2012-01-01

    In order to carry out depreciation accounting of medical equipment reasonably, this paper analyses and compares the advantages and disadvantages of several common depreciation methods. Taking the use efficiency of medical equipment into account, it proposes a static depreciation rate distribution rule that follows the χ² distribution probability density function and, by introducing the benchmark rate of return on funds, establishes a dynamic depreciation method for medical equipment.

  18. System Downlink Outage Probability Analysis in Distributed Antenna Systems

    Institute of Scientific and Technical Information of China (English)

    王俊波; 王金元; 陈华敏; 陈明

    2011-01-01

    The main focus of this paper is to investigate the system downlink outage probability in distributed antenna systems (DAS). The paper first establishes a composite channel model that takes into account path loss, lognormal shadowing and Rayleigh fading. Then, by making use of the moment generating function (MGF), it derives the probability density function (PDF) of the output signal-to-noise ratio (SNR) after maximal ratio combining (MRC) at the receiver. An approximate analytical expression of the outage probability for a mobile station (MS) at a given position is then derived under an antenna selective transmission (ST) scheme. Further, considering the distribution of MSs in the system, a closed-form expression of the system outage probability is obtained. Numerical results show that the closed-form expression provides sufficient precision for evaluating the outage probability performance of DAS.

  19. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
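
    Among the topics listed, Bayes' theorem is easily made concrete; the numbers below are assumed purely for illustration (a rare event, a test with given sensitivity and false-alarm rate) and are not taken from the report.

        prior = 0.01          # P(event)
        sensitivity = 0.95    # P(positive | event)
        false_alarm = 0.05    # P(positive | no event)
        evidence = sensitivity * prior + false_alarm * (1.0 - prior)
        posterior = sensitivity * prior / evidence   # Bayes' theorem
        print(round(posterior, 3))                   # ~0.161: weak evidence when the prior is small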

  20. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  1. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.

  2. On the computability of conditional probability

    CERN Document Server

    Ackerman, Nathanael L; Roy, Daniel M

    2010-01-01

    We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...

  3. FAILURE PROBABILITY ANALYSIS ON SCHEDULING RULE WITH EXPONENTIALLY DISTRIBUTED PROCESSING TIME

    Institute of Scientific and Technical Information of China (English)

    汤健超; 张国基; 张毕西; 林彬彬

    2011-01-01

    The impact of random fluctuations of jobs on time-based scheduling rules is analysed for jobs whose processing times follow an exponential distribution. For the cases of two and three jobs, analytical solutions of the failure probability of the scheduling rule are derived using probability and statistics. For multiple jobs, a random-number simulation model is proposed, the unbiased estimator and confidence interval of the failure probability are given, and the choice of the number of simulation iterations and the effectiveness of the model are analysed. The conclusion is that, for jobs with exponentially distributed processing times, the failure probability of the scheduling rule increases as the ratios of the expected processing times of adjacent jobs approach 1. The research affords theoretical guidance for job scheduling.
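
    A Monte Carlo sketch of the failure event analysed above: order two jobs by their expected durations and count how often the realized exponential durations come out in the opposite order. The mean values are assumed, and the closed-form probability for two exponential jobs is printed alongside for comparison.

        import random

        def rule_fails(mean_a: float, mean_b: float) -> bool:
            a = random.expovariate(1.0 / mean_a)   # job expected to be shorter
            b = random.expovariate(1.0 / mean_b)   # job expected to be longer
            return a > b                           # rule "fails": realized order is reversed

        trials = 100_000
        for mean_b in (2.0, 1.5, 1.1):             # expectation ratios approaching 1
            rate = sum(rule_fails(1.0, mean_b) for _ in range(trials)) / trials
            exact = 1.0 / (1.0 + mean_b)           # closed form for two independent exponential jobs
            print(mean_b, round(rate, 3), round(exact, 3))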

  4. Fatal accident distribution by age, gender and head injury, and death probability at accident scene in Mashhad, Iran, 2006-2009.

    Science.gov (United States)

    Zangooei Dovom, Hossein; Shafahi, Yousef; Zangooei Dovom, Mehdi

    2013-01-01

    Several studies have investigated road traffic deaths, but few have compared them by road user type. Iran, with an estimated 44 road traffic deaths per 100,000 population in 2002, had more road traffic deaths than any other country for which reliable estimates can be made. The present study was therefore conducted on road death data and identified the fatal accident distribution by age, gender and head injury, as well as the influence of age and gender on deaths at accident scenes, for all road user groups. Data used in this study are on fatal road accidents recorded by forensic medicine experts of the Khorasan Razavi province in Mashhad, the capital of the province, the second largest city and the largest place of pilgrimage, immigration and tourism in Iran. The chi-square test and odds ratio were used to identify the relation of place of death with age and gender in 2495 fatal road accidents from 2006 to 2009. The t-test and analysis of variance were employed for the continuous variable, age, to compare males' and females' mean age for all road user categories. For both genders, all three groups of fatalities (pedestrian, motorcyclist and motor vehicle occupant) had a peak at the ages of 21-30. The youngest were male motorcyclists (mean age = 28). Elderly pedestrians also made up a large share of road deaths. The overall male/female ratio was 3.41, and the highest male/female ratio was that of motorcyclists (14). The overall ratio of head injury to other organ injuries (torso and underbody) was 2.51, and pedestrians had the largest share of head injuries (38.2%). Regarding death at the accident scene, for all road users, gender did not have any significant relation with death at the scene (P-value > 0.1); on the contrary, age had a significant relation (P-value accident scenes (male/female ratio at accident scene motorcyclists 41-50 and motor vehicle occupants 31-40 died the most at accident scenes. Identifying the most endangered groups of road accident fatalities, which was conducted in this

  5. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  6. Research on Probability Distribution Characteristics of Bulk Power System Reliability Considering Parameter Uncertainty

    Institute of Scientific and Technical Information of China (English)

    赵渊; 郭胤; 谢开贵

    2013-01-01

    The influence of parameter uncertainty on the probability distribution characteristics of bulk power system reliability is investigated, whereas traditional reliability evaluation accounts for parameter uncertainty only in the expected values of the reliability indices. The marginal probability distributions of a component's working time before failure and repair time after failure under parameter uncertainty are derived. To avoid the obstacle that the marginal distribution functions are too complex for analytical calculation, numerical integration is used to discretize and characterize them, and on this basis the traditional sequential Monte Carlo algorithm is improved to compute the marginal probability distributions of the reliability indices of the bulk power system with parameter uncertainty taken into account, opening a new line of research on the probability distribution characteristics of power system reliability under parameter uncertainty. The computation time and efficiency of the proposed method are compared with those of a dual-loop Monte Carlo simulation. The correctness and validity of the proposed method are validated by reliability evaluation results for the RBTS and IEEE RTS79 test systems.

  7. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  8. Dynamic gesture recognition method based on a probability matrix model of interval distribution

    Institute of Scientific and Technical Information of China (English)

    张建忠; 常丹华

    2013-01-01

    Current gesture recognition algorithms based on accelerometers face a trade-off between dynamic real-time performance and recognition rate. Aiming at this problem, a probability matrix model of interval distribution and a dynamic gesture recognition method are proposed. The gesture signal from a three-dimensional accelerometer is preprocessed by automatic action-data detection, normalization and cubic spline interpolation. According to the distribution characteristics of the signal, observation points on each axis are determined and the probability matrix of the interval distribution at each observation point is constructed. An online, fast gesture discrimination algorithm is then realized on the optimized matrices. The method is evaluated on a real data set from a finger-mounted wearable device. The results show that it achieves good real-time performance and a high recognition rate.

  9. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  10. Probable rainfall for Madre de Deus County, Alto Rio Grande region: probability distribution models and characteristic values

    Directory of Open Access Journals (Sweden)

    José Alves Junqueira Júnior

    2007-06-01

    Nowadays irrigation is one of the main techniques at the service of agriculture. However, treating irrigation as the only source for meeting crop water demand can lead to oversized systems, which raises installation costs. One alternative to this problem is to consider rainfall at a given probability level, i.e. the probable rainfall, which makes supplementary irrigation possible. This work therefore aimed to characterize the probable rainfall for the region of the municipality of Madre de Deus, MG, comparing four frequency distribution models (Gamma, Normal, two-parameter log-Normal and three-parameter log-Normal). Daily rainfall depths were totalled over periods of 10, 15 and 30 days and evaluated at 13 probability levels, for historical series of 57 years of observations between 1942 and 1999. The Kolmogorov-Smirnov test was applied to assess the adequacy of the distributions and to determine which model is most suitable for each historical series. The probability models fitted the rainy period better; the three-parameter log-Normal distribution was the most adequate for the monthly series and the Gamma distribution for the 15-day and 10-day periods.

  11. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  12. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  13. Influence of incident angles on echo intensity distribution of cat's eye photoelectric systems

    Institute of Scientific and Technical Information of China (English)

    滕渊; 和婷; 左帅

    2015-01-01

    In order to study the effect of incident angle on the echo intensity distribution of a cat's eye photoelectric system, a model of the intensity distribution of a tilted Gaussian beam passing through the cat's eye system was built on the basis of the generalized diffraction integral formula, and analytical expressions for the transmission were deduced. The intensity distribution as a function of incident angle was simulated numerically at two detection distances. The results show that, within half the field of view of the cat's eye photoelectric system and for a Gaussian incident beam, the power of the echo beam decreases as detection distance and incident angle increase, while the distribution approaches a Gaussian mode. Compared with short-distance detection, long-distance detection places more stringent demands on the incidence conditions for the echo intensity distribution to reach the Gaussian mode. This study provides a theoretical basis for practical detection and is of particular significance for the selection of incidence angles.
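
    The "generalized diffraction integral formula" invoked here is usually the Collins integral for an ABCD optical system; a standard statement of that integral (not the authors' specific expression for the tilted Gaussian beam) is

    $$U_2(x_2,y_2)=\frac{\exp(\mathrm{i}kL)}{\mathrm{i}\lambda B}\iint U_1(x_1,y_1)\,\exp\!\left\{\frac{\mathrm{i}k}{2B}\Big[A\,(x_1^2+y_1^2)-2\,(x_1x_2+y_1y_2)+D\,(x_2^2+y_2^2)\Big]\right\}\mathrm{d}x_1\,\mathrm{d}y_1,$$

    where A, B, C, D are the elements of the ray transfer matrix of the cat's eye system and L its axial optical path.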

  14. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  15. Probability distribution model of regional agricultural drought degree based on the maximum entropy principle

    Institute of Scientific and Technical Information of China (English)

    陈海涛; 黄鑫; 邱林; 王文川

    2013-01-01

    An evaluation index of drought degree that comprehensively considers the quantitative relationship between natural factors and the crop growth period is proposed, and the distribution density function of drought degree for the project area is established on the basis of the maximum entropy principle. This avoids the arbitrariness of previously constructed probability distributions and achieves a quantitative evaluation of regional agricultural drought. First, the quantitative evaluation index of drought degree was established from the yield reduction rate under deficit irrigation. Then, a long series of rainfall data was generated by the Monte Carlo method and the drought-degree index was calculated for each year. Finally, the probability density function of the agricultural drought-degree distribution was constructed using the maximum entropy principle. The Qucun irrigation area of Puyang City, Henan Province, was used as a worked example. The results show that the model is conceptually clear, simple and practical to compute, and gives results in accordance with reality, making it a good evaluation method.
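
    For reference, the maximum-entropy density subject to the first m moment constraints takes the standard exponential form below; the record does not state which moments of the drought-degree index were actually constrained, so this is only the generic construction.

    $$f(x)=\exp\!\Big(\lambda_0+\sum_{i=1}^{m}\lambda_i x^{i}\Big),\qquad \int f(x)\,\mathrm{d}x=1,\qquad \int x^{i}f(x)\,\mathrm{d}x=\mu_i,\quad i=1,\dots,m,$$

    with the multipliers λ_i determined numerically from the sample moments μ_i of the Monte Carlo-generated drought-degree series.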

  16. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  17. PROBABILITY MODEL OF GUNTHER GENERATOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. This gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and the input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.

  18. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice prob...

  19. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
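
    For context, the Lüders rule the abstract refers to assigns, for a state ρ and projections P and Q, the conditional probability

    $$p_\rho(Q\mid P)=\frac{\operatorname{Tr}(P\rho P\,Q)}{\operatorname{Tr}(\rho P)},\qquad \rho\ \mapsto\ \rho_P=\frac{P\rho P}{\operatorname{Tr}(\rho P)},$$

    i.e. the ordinary probability of Q in the post-measurement state ρ_P; the paper's argument concerns whether this expression can legitimately be read as a conditional probability in the classical sense.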

  20. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  1. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  2. Data analysis recipes: Probability calculus for inference

    CERN Document Server

    Hogg, David W

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.
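
    The two operations the text builds on can be summarized, in generic notation not quoted from the article, as marginalization and Bayes' rule:

    $$p(x\mid I)=\int p(x,\theta\mid I)\,\mathrm{d}\theta,\qquad p(\theta\mid x,I)=\frac{p(x\mid\theta,I)\,p(\theta\mid I)}{p(x\mid I)},$$

    where I stands for the background information on which all probabilities are conditioned.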

  3. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x[subscript 1], x[subscript 2],..., x[subscript I] for X and y[subscript 1], y[subscript 2],..., y[subscript J] for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…

  4. A new angle on the Euler angles

    Science.gov (United States)

    Markley, F. Landis; Shuster, Malcolm D.

    1995-01-01

    We present a generalization of the Euler angles to axes beyond the twelve conventional sets. The generalized Euler axes must satisfy the constraint that the first and the third are orthogonal to the second; but the angle between the first and third is arbitrary, rather than being restricted to the values 0 and pi/2, as in the conventional sets. This is the broadest generalization of the Euler angles that provides a representation of an arbitrary rotation matrix. The kinematics of the generalized Euler angles and their relation to the attitude matrix are presented. As a side benefit, the equations for the generalized Euler angles are universal in that they incorporate the equations for the twelve conventional sets of Euler angles in a natural way.

  5. Parameter Estimation of Generalized Partial Probability Weighted Moments for the Generalized Pareto Distribution

    Institute of Scientific and Technical Information of China (English)

    赵旭; 程维虎; 李婧兰

    2012-01-01

    The generalized Pareto distribution (GPD) is one of the most important distributions in statistical analysis and has been widely applied in finance, insurance, hydrology, meteorology, and other fields. Traditional estimation methods such as maximum likelihood (ML), the method of moments (MOM), and probability weighted moments (PWM) have been used extensively, but their use is often restricted. Alternative approaches (e.g., generalized probability weighted moments, L-moments, and LH-moments) exist, but they require complete (non-censored) samples, whereas censored samples are often encountered in hydrology and meteorology. In this article we propose a computationally simple method, based on probability weighted moments, for fitting the GPD from censored data; it is resistant to extremely small or large outliers, i.e., robust with respect to the lower and upper breakdown points. We first solve for the shape parameter estimator, which has high estimation precision, and then obtain the location and scale parameters. Simulation studies show that the proposed method performs well compared with traditional techniques.
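
    For orientation, the classical complete-sample PWM estimator of a two-parameter GPD (Hosking-Wallis parametrization F(x) = 1 - (1 - kx/σ)^(1/k)) is sketched below; the censored-sample, generalized partial PWM estimator proposed in the article is not reproduced here, and the simulated data are only a stand-in.

```python
import numpy as np
from scipy.stats import genpareto

def gpd_pwm(sample):
    """Classical PWM estimates (k, sigma) of a 2-parameter GPD from a complete sample."""
    x = np.sort(sample)
    n = len(x)
    b0 = x.mean()                                               # estimates E[X]
    b1 = np.sum((n - np.arange(1, n + 1)) * x) / (n * (n - 1))  # estimates E[X(1-F(X))]
    k = b0 / (b0 - 2.0 * b1) - 2.0                              # shape
    sigma = 2.0 * b0 * b1 / (b0 - 2.0 * b1)                     # scale
    return k, sigma

# scipy's genpareto uses shape c = -k relative to this parametrization.
data = genpareto.rvs(c=-0.2, scale=1.0, size=2000, random_state=1)
print(gpd_pwm(data))   # expect values close to (0.2, 1.0)
```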

  6. A Logistics Angle for Empirical Study on Book Distribution Efficiency

    Institute of Scientific and Technical Information of China (English)

    罗芳

    2013-01-01

    Taking books, a commodity whose logistics combine traditional and modern e-commerce characteristics, as the object of study, this paper starts from book distribution performance, focuses on the factors influencing book distribution efficiency and book logistics cost, and establishes an index system for measuring book distribution efficiency. The index system is examined empirically with an entropy-based AHP analysis and a factor analysis for validity testing. The study finds that controlling the logistics cost of book distribution, making reasonable use of e-commerce information flow, and optimizing the routing of the book distribution supply chain are the key factors in improving the efficiency of modern book distribution.
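
    The entropy-based weighting mentioned above is commonly computed as follows; this is a generic sketch with made-up indicator values, not the paper's data or exact procedure.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for an indicator matrix X (rows = alternatives, columns = indicators).

    Indicators are assumed positive and benefit-type (larger is better).
    """
    P = X / X.sum(axis=0)                        # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -np.nansum(P * np.log(P), axis=0) / np.log(n)   # entropy of each indicator
    d = 1.0 - e                                  # degree of divergence
    return d / d.sum()                           # normalized entropy weights

# Illustrative indicator matrix: three distribution channels, three efficiency indicators.
X = np.array([[120.0, 0.82, 3.1],
              [ 95.0, 0.74, 2.6],
              [140.0, 0.91, 3.8]])
print(entropy_weights(X))
```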

  7. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  8. Glaucoma, Open-Angle

    Science.gov (United States)

    Open-angle Glaucoma Defined: In open-angle glaucoma, the fluid passes ... 2010 U.S. Age-Specific Prevalence Rates for Glaucoma by Age and Race/Ethnicity: The prevalence of ...

  9. Pulsed Laser Ablation-Induced Green Synthesis of TiO2 Nanoparticles and Application of Novel Small Angle X-Ray Scattering Technique for Nanoparticle Size and Size Distribution Analysis.

    Science.gov (United States)

    Singh, Amandeep; Vihinen, Jorma; Frankberg, Erkka; Hyvärinen, Leo; Honkanen, Mari; Levänen, Erkki

    2016-12-01

    This paper aims to introduce small angle X-ray scattering (SAXS) as a promising technique for measuring size and size distribution of TiO2 nanoparticles. In this manuscript, pulsed laser ablation in liquids (PLAL) has been demonstrated as a quick and simple technique for synthesizing TiO2 nanoparticles directly into deionized water as a suspension from titanium targets. Spherical TiO2 nanoparticles with diameters in the range 4-35 nm were observed with transmission electron microscopy (TEM). X-ray diffraction (XRD) showed highly crystalline nanoparticles that comprised of two main photoactive phases of TiO2: anatase and rutile. However, presence of minor amounts of brookite was also reported. The traditional methods for nanoparticle size and size distribution analysis such as electron microscopy-based methods are time-consuming. In this study, we have proposed and validated SAXS as a promising method for characterization of laser-ablated TiO2 nanoparticles for their size and size distribution by comparing SAXS- and TEM-measured nanoparticle size and size distribution. SAXS- and TEM-measured size distributions closely followed each other for each sample, and size distributions in both showed maxima at the same nanoparticle size. The SAXS-measured nanoparticle diameters were slightly larger than the respective diameters measured by TEM. This was because SAXS measures an agglomerate consisting of several particles as one big particle which slightly increased the mean diameter. TEM- and SAXS-measured mean diameters when plotted together showed similar trend in the variation in the size as the laser power was changed which along with extremely similar size distributions for TEM and SAXS validated the application of SAXS for size distribution measurement of the synthesized TiO2 nanoparticles.
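
    Size retrieval from SAXS data of the kind described here typically rests on the form factor of a homogeneous sphere of radius R (a standard result, not quoted from the paper):

    $$P(q)=\left[\frac{3\,\big(\sin(qR)-qR\cos(qR)\big)}{(qR)^{3}}\right]^{2},$$

    so that fitting the measured intensity I(q) ∝ ∫ n(R) V(R)² P(q) dR over a size distribution n(R) yields the nanoparticle size distribution that is compared with TEM.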

  10. Pulsed Laser Ablation-Induced Green Synthesis of TiO2 Nanoparticles and Application of Novel Small Angle X-Ray Scattering Technique for Nanoparticle Size and Size Distribution Analysis

    Science.gov (United States)

    Singh, Amandeep; Vihinen, Jorma; Frankberg, Erkka; Hyvärinen, Leo; Honkanen, Mari; Levänen, Erkki

    2016-10-01

    This paper aims to introduce small angle X-ray scattering (SAXS) as a promising technique for measuring size and size distribution of TiO2 nanoparticles. In this manuscript, pulsed laser ablation in liquids (PLAL) has been demonstrated as a quick and simple technique for synthesizing TiO2 nanoparticles directly into deionized water as a suspension from titanium targets. Spherical TiO2 nanoparticles with diameters in the range 4-35 nm were observed with transmission electron microscopy (TEM). X-ray diffraction (XRD) showed highly crystalline nanoparticles that comprised of two main photoactive phases of TiO2: anatase and rutile. However, presence of minor amounts of brookite was also reported. The traditional methods for nanoparticle size and size distribution analysis such as electron microscopy-based methods are time-consuming. In this study, we have proposed and validated SAXS as a promising method for characterization of laser-ablated TiO2 nanoparticles for their size and size distribution by comparing SAXS- and TEM-measured nanoparticle size and size distribution. SAXS- and TEM-measured size distributions closely followed each other for each sample, and size distributions in both showed maxima at the same nanoparticle size. The SAXS-measured nanoparticle diameters were slightly larger than the respective diameters measured by TEM. This was because SAXS measures an agglomerate consisting of several particles as one big particle which slightly increased the mean diameter. TEM- and SAXS-measured mean diameters when plotted together showed similar trend in the variation in the size as the laser power was changed which along with extremely similar size distributions for TEM and SAXS validated the application of SAXS for size distribution measurement of the synthesized TiO2 nanoparticles.

  11. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
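
    In a C*-probability space (A, φ), i.e. a unital C*-algebra A with a state φ, the characteristic function of a self-adjoint element a is usually taken to be the map below; this standard definition is supplied for context and the paper's conventions may differ in detail.

    $$\phi_a(t)=\varphi\big(e^{\mathrm{i}ta}\big),\qquad t\in\mathbb{R},$$

    and the statement that distributions are determined by their characteristic functions then mirrors the classical uniqueness theorem.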

  12. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain...

  13. Probability density functions characterizing PSC particle size distribution parameters for NAT and STS derived from in situ measurements between 1989 and 2010 above McMurdo Station, Antarctica, and between 1991-2004 above Kiruna, Sweden

    Science.gov (United States)

    Deshler, Terry

    2016-04-01

    Balloon-borne optical particle counters were used to make in situ size resolved particle concentration measurements within polar stratospheric clouds (PSCs) over 20 years in the Antarctic and over 10 years in the Arctic. The measurements were made primarily during the late winter in the Antarctic and in the early and mid-winter in the Arctic. Measurements in early and mid-winter were also made during 5 years in the Antarctic. For the analysis, bimodal lognormal size distributions are fit to 250 meter averages of the particle concentration data. The characteristics of these fits, along with temperature, water and nitric acid vapor mixing ratios, are used to classify the PSC observations as either NAT, STS, ice, or some mixture of these. The vapor mixing ratios are obtained from satellite when possible; otherwise assumptions are made. This classification of the data is used to construct probability density functions for NAT, STS, and ice number concentration, median radius and distribution width for mid and late winter clouds in the Antarctic and for early and mid-winter clouds in the Arctic. Additional analysis is focused on characterizing the temperature histories associated with the particle classes and the different time periods. The results from these analyses will be presented, and should be useful to set bounds for retrievals of PSC properties from remote measurements, and to constrain model representations of PSCs.
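
    The bimodal lognormal fits referred to are presumably of the standard aerosol form (generic notation, not necessarily the author's symbols):

    $$\frac{\mathrm{d}N}{\mathrm{d}r}=\sum_{i=1}^{2}\frac{N_i}{\sqrt{2\pi}\,r\,\ln\sigma_i}\exp\!\left[-\frac{\ln^{2}(r/r_i)}{2\ln^{2}\sigma_i}\right],$$

    with number concentration N_i, median radius r_i and width σ_i for each mode, matching the three quantities whose probability density functions are constructed in the study.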

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, and contributions to theoretical statistics an...

  15. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  16. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  17. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  18. Atmospheric gamma ray angle and energy distributions from sea level to 3.5 g/sq cm and 2 to 25 MeV

    Science.gov (United States)

    Ryan, J. M.; Jennings, M. C.; Radwin, M. D.; Zych, A. D.; White, R. S.

    1979-01-01

    Differential fluxes of gamma rays were calculated for energies of 2-25 MeV, zenith angles of 0-50 deg and 180-130 deg, and atmospheric depths from nominal sea level, 1000 g/sq cm, to float altitude, 3.5 g/sq cm residual atmosphere. Above 100 g/sq cm growth curves were constructed to estimate the contribution of the extraterrestrial gamma ray flux to the total downward-moving flux, while the upward-moving gamma rays were taken to be strictly of atmospheric origin. Below 100 g/sq cm, all gamma rays originate in the atmosphere. The downward atmospheric flux increases by almost two orders of magnitude between float altitude and the Pfotzer maximum, while the extraterrestrial flux is attenuated exponentially. Gamma rays produced by neutron interactions with the carbon in the scintillator liquid are eliminated by constructing growth curves for downward-moving gamma rays at high altitudes and are negligible compared with downward-moving gamma rays at lower altitudes and upward-moving gamma rays at all altitudes.

  19. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  20. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  1. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  2. Tomographic probability representation for quantum fermion fields

    CERN Document Server

    Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D

    2009-01-01

    Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.

  3. A New Method for Judging the Horizontal Distribution Pattern of Trees by the Uniform Angle Index

    Institute of Scientific and Technical Information of China (English)

    赵中华; 惠刚盈; 胡艳波; 张弓乔

    2016-01-01

    [Objective] This paper proposes a new method of judging the horizontal distribution pattern of trees by the uniform angle index, in order to further improve the theory of using this index to judge tree horizontal distribution patterns. [Method] 6 000 simulated stands with an area of 70 m × 70 m and with different densities and distribution patterns were produced with the stand spatial structure analysis software Winkelmass; 2 field plots of broad-leaved Korean pine forest in northeast China were then used to verify the accuracy of the new method for judging stand and population horizontal distribution patterns, and the results were compared with the aggregation index R and Ripley's L. [Result] Starting from the finding that the mean uniform angle index (W) of a randomly distributed stand follows a normal distribution, and from its relationship with the standard deviation, a new method of judging the stand/population horizontal distribution pattern by the uniform angle index is proposed. For the simulated stands, the coincidence rate of the normal-distribution test of the mean uniform angle index was 100% across the different densities. For the 2 real stands/populations, the judgments of this test agreed completely with the point pattern analysis by Ripley's L function, whereas the judgments of the aggregation index R differed noticeably from those of the Ripley's L test, showing that the confidence level has a clear influence on the judgment of the horizontal distribution pattern. [Conclusion] The proposed normal-distribution test of the stand (species) mean uniform angle index overcomes the problem that a single fixed confidence interval is unsuitable for judging the horizontal distribution pattern of sampled plots or of populations with few individuals in a community; it further improves the theory of judging tree horizontal distribution patterns by the uniform angle index and raises the accuracy and widens the applicable range of this judgment.

  4. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  5. Effect and optimization of Aeromonas spp. probability distribution in negative samples of chilled pork

    Institute of Scientific and Technical Information of China (English)

    董庆利; 宋筱瑜; 丁甜; 刘箐

    2016-01-01

    This study was designed to verify the effect of the assumed pathogen level in negative samples on quantitative microbiological risk assessment (QMRA). Previous research on QMRA of Aeromonas spp. in chilled pork was taken as an example, and two scenarios for Aeromonas spp. in negative samples, zero and the maximum value (the detection limit), were simulated in the quantitative exposure assessment. The predicted food-poisoning probabilities for the two scenarios were 33.6% and 69.3%, respectively, significantly higher than the previous result of 22.1% based on the Jarvis function for estimating the possible pathogen distribution in negative samples (P<0.01). Moreover, the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), the chi-square test and other parameters were applied to evaluate different continuous probability distributions for the pathogen level in negative samples. The Exponential distribution proved better than the Logistic, Normal, Triangle and Uniform distributions, with AIC values of -41.24 and -135.62 under the two simulated scenarios, lower than those of the other distributions. In conclusion, the pathogen distribution in negative samples should be noted and further optimized in future QMRA.
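
    A minimal sketch of the AIC-based screening described above (the candidate set, data and parameter counts are illustrative; this is not the authors' code):

```python
import numpy as np
from scipy import stats

# Stand-in for estimated contamination levels attributed to negative samples.
rng = np.random.default_rng(0)
values = rng.exponential(scale=0.8, size=60)

candidates = {
    "exponential": stats.expon,
    "logistic":    stats.logistic,
    "normal":      stats.norm,
    "uniform":     stats.uniform,
}

for name, dist in candidates.items():
    params = dist.fit(values)
    loglik = np.sum(dist.logpdf(values, *params))
    aic = 2 * len(params) - 2 * loglik           # AIC = 2k - 2 ln L
    print(f"{name:12s} AIC = {aic:8.2f}")
```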

  6. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  7. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  8. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the "principle of indifference" due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...

  9. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  10. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  11. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  13. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  14. Study on Regression Estimation and Probability Distribution of Leaf Area for Soybean

    Institute of Scientific and Technical Information of China (English)

    蒋兆英; 凌斌; 陈林; 孙怀卫

    2012-01-01

    Estimation of leaf area is needed in agricultural disaster research, irrigation estimation, and crop cultivation guidance. In this paper, photographs of soybean leaves taken with a digital camera are processed in MATLAB to obtain leaf length, leaf width, and leaf area, and the relationships of leaf area with length × width and with the square of the length are fitted; the regression on length × width is found to have the higher precision. The probability distribution of the leaf indices is then studied and checked with the Kolmogorov-Smirnov (K-S) test; the lognormal and Gamma models are shown to fit the actual distribution of leaf area. The work provides a practical measurement method for soybean leaf area and a theoretical basis for its study and application.
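
    An illustrative sketch of the two steps (synthetic measurements and an assumed shape factor of about 0.72; this is not the authors' MATLAB code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
length = rng.normal(9.0, 1.5, 200)                           # leaf length (cm), synthetic
width  = rng.normal(4.0, 0.8, 200)                           # leaf width (cm), synthetic
area   = 0.72 * length * width + rng.normal(0.0, 1.0, 200)   # assumed area model

# Regression of area on length*width.
slope, intercept, r, p, se = stats.linregress(length * width, area)
print(f"area = {slope:.3f}*(L*W) + {intercept:.3f},  R^2 = {r**2:.3f}")

# Kolmogorov-Smirnov check of lognormal and Gamma fits to the area distribution.
for name, dist in {"lognormal": stats.lognorm, "gamma": stats.gamma}.items():
    params = dist.fit(area, floc=0)
    d, pval = stats.kstest(area, dist.cdf, args=params)
    print(f"{name}: KS D = {d:.3f}, p = {pval:.3f}")
```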

  15. Contact Angle Goniometer

    Data.gov (United States)

    Federal Laboratory Consortium — Description:The FTA32 goniometer provides video-based contact angle and surface tension measurement. Contact angles are measured by fitting a mathematical expression...

  16. Three-dimensional finite element analysis of stress distribution of mandible angle under forces

    Institute of Scientific and Technical Information of China (English)

    吴凌莉; 陈骏; 李志杰; 何祥一

    2011-01-01

    BACKGROUND: Studies on the fracture risk of different parts of the mandible after impact are scarce. OBJECTIVE: To analyze, with the finite element method, the stress distribution in the mandible when an instantaneous external force acts on the mandibular angle. METHODS: A three-dimensional mandible model was built from thin-slice CT scans using the medical image reconstruction software Amira combined with the Unigraphics NX modeling software. In the Ansys software, 1 000 N forces were applied to the left mandibular angle in two directions, perpendicular and parallel to the sagittal plane, and the stress distribution of the loaded mandible and the Von Mises stress in its weak regions were obtained. RESULTS AND CONCLUSION: A finite element model of the mandible was established. When the left mandibular angle was loaded horizontally toward the right, perpendicular to the sagittal plane, the mandibular angles and condylar necks on both sides were highly prone to fracture, cracks could appear on the medial surface of the symphysis region and in the left mental foramen region, while the right mental foramen region suffered only slight damage. When the lower border of the left mandibular angle was loaded vertically upward, parallel to the sagittal plane, the result was similar: both mandibular angles and condylar necks were highly prone to fracture, the medial surface of the symphysis region and the left mental foramen region might crack, and the right mental foramen region was only slightly injured. This suggests that when the mandibular angle receives an instantaneous external force, stress concentrates mainly in the weak regions of the mandible, the most highly stressed sites correspond closely to the sites where fractures commonly occur, and among the weak regions the two mandibular angles and the condylar necks are damaged most severely.

  17. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  18. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  19. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  20. End-to-end Distance and its Probability Distribution of Polymer Chains near a Flat Barrier

    Institute of Scientific and Technical Information of China (English)

    黄建花; 蒋文华; 韩世钧

    2001-01-01

    The problem of polymer chains near an impenetrable plane is investigated by means of the probability method. It is shown that the 2k-th moment $A_{2k}$ of the reduced normal component of the end-to-end distance depends only on the reduced distance $AZ_0$ of the first segment from the plane, where $A$ is a reduction factor involving the chain length $n$ and the bond length $l$ (fixed to unity), so that $A_{2k}=f(AZ_0)$. When $AZ_0\approx 0$, $A_{2k}$ is at its maximum ($A_{2k}=k!$); it then decreases rapidly and soon reaches a minimum as $AZ_0$ increases, after which $A_{2k}$ rises gradually and approaches the limiting value $[(2k-1)(2k-3)\cdots 1]/2^{k}$ when $AZ_0$ is large enough. This suggests that the barrier significantly elongates the polymer chain for small $Z_0$ and contracts it for an intermediate range of $Z_0$. The distribution of the end-to-end distance likewise depends on the distance $Z_0$ of the first segment from the plane.
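
    In compact notation the two limits quoted above are (using the double-factorial shorthand; the normalization of the reduced coordinate is assumed rather than given in the record)

    $$A_{2k}\big|_{AZ_0\to 0}=k!,\qquad A_{2k}\big|_{AZ_0\to\infty}=\frac{(2k-1)!!}{2^{k}},$$

    so the even moments start above the free-chain value at the wall, dip below it at intermediate distances, and recover the free-chain value far from the barrier.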