WorldWideScience

Sample records for angle probability distributions

  1. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    DEFF Research Database (Denmark)

    Helles, Glennie; Fonseca, Rasmus

    2009-01-01

    residue in the input-window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30°×30° area of the dihedral angle space) for all amino acids in the data set compared to first order statistics. An accuracy comparable to that of secondary...... local context dependent dihedral angle propensities in coil-regions. This predicted distribution can potentially improve tertiary structure prediction methods that are based on sampling the backbone dihedral angles of individual amino acids. The predicted distribution may also help predict local...

  2. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
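
    The guessing strategy above can be made concrete for toy alphabet sizes, where exact enumeration is feasible. The sketch below assumes the first-order model (independent, identically distributed letters); the alphabet, probabilities, and function name are illustrative, not from the paper:

```python
import itertools
from math import prod

def average_guesses(letter_probs, word_length):
    # Probability of each word under the independent-letters (first-order) model.
    word_probs = [prod(w) for w in itertools.product(letter_probs, repeat=word_length)]
    word_probs.sort(reverse=True)  # guess the most probable words first
    # Expected number of guesses: sum over rank r of r * P(word ranked r).
    return sum(r * p for r, p in enumerate(word_probs, start=1))

# Uniform 2-letter alphabet, words of length 2: each word has p = 1/4,
# so the average is (1 + 2 + 3 + 4) / 4 = 2.5.
print(average_guesses([0.5, 0.5], 2))  # 2.5
```

    For realistic alphabet and word sizes this enumeration explodes, which is exactly why the paper turns to approximations.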

  3. Probability distributions of landslide volumes

    Directory of Open Access Journals (Sweden)

    M. T. Brunetti

    2009-03-01

    We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10−4 m3 ≤ VL ≤ 1013 m3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide dataset exhibits heavy-tailed (self-similar) behaviour in its frequency-size distribution, p(VL) as a function of VL, for failures exceeding different threshold volumes, VL*, for each dataset. These non-cumulative heavy-tailed distributions for each dataset are negative power laws, with exponents 1.0 ≤ β ≤ 1.9, averaging β ≈ 1.3. The scaling behaviour of VL for the ensemble of the 19 datasets holds over 17 orders of magnitude, and is independent of lithological characteristics, morphological settings, triggering mechanisms, length of period and extent of the area covered by the datasets, presence or lack of water in the failed materials, and magnitude of gravitational fields. We argue that the statistics of landslide volume are conditioned primarily on the geometrical properties of the slope or rock mass where failures occur. Differences in the values of the scaling exponents reflect the primary landslide types, with rock falls exhibiting a smaller scaling exponent (1.1 ≤ β ≤ 1.4) than slides and soil slides (1.5 ≤ β ≤ 1.9). We argue that the difference is a consequence of the disparity in the mechanics of rock falls and slides.
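
    The non-cumulative power-law exponent β can be estimated from a volume sample with the standard continuous-Pareto maximum-likelihood (Hill-type) estimator. This is a generic sketch, not the authors' kernel-density procedure, and all parameter values are illustrative:

```python
import math
import random

def fit_powerlaw_exponent(volumes, v_min):
    """ML estimate of beta in p(V) ~ V^(-beta) for V >= v_min
    (continuous-Pareto / Hill-type estimator)."""
    tail = [v for v in volumes if v >= v_min]
    return 1.0 + len(tail) / sum(math.log(v / v_min) for v in tail)

# Synthetic check: inverse-transform samples from p(V) ~ V^(-1.3),
# matching the average exponent reported for the landslide datasets.
random.seed(42)
beta_true, v_min = 1.3, 1.0
sample = [v_min * (1.0 - random.random()) ** (-1.0 / (beta_true - 1.0))
          for _ in range(50_000)]
print(round(fit_powerlaw_exponent(sample, v_min), 2))
```
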

  4. Probability distributions of landslide volumes

    OpenAIRE

    M. T. Brunetti; Guzzetti, F.; M. Rossi

    2009-01-01

    We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10−4 m3≤VL≤1013 m3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...

  5. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the...... motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  6. ASYMPTOTIC QUANTIZATION OF PROBABILITY DISTRIBUTIONS

    Institute of Scientific and Technical Information of China (English)

    Klaus Pötzelberger

    2003-01-01

    We give a brief introduction to results on the asymptotics of quantization errors. The topics discussed include the quantization dimension, asymptotic distributions of sets of prototypes, asymptotically optimal quantizations, approximations, and random quantizations.
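
    A minimal concrete instance of optimal quantization in one dimension is Lloyd's algorithm (scalar k-means), which alternates nearest-prototype assignment with centroid updates. The implementation and parameters below are illustrative assumptions, not taken from the survey:

```python
import random

def lloyd_quantizer(samples, k, iters=50):
    """1-D Lloyd iteration: assign each sample to its nearest prototype,
    then move each prototype to the mean of its cell.  Converges to a
    (locally) optimal squared-error quantizer."""
    protos = sorted(random.sample(samples, k))
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for x in samples:
            j = min(range(k), key=lambda i: (x - protos[i]) ** 2)
            cells[j].append(x)
        protos = [sum(c) / len(c) if c else protos[i]
                  for i, c in enumerate(cells)]
    return sorted(protos)

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(5000)]
protos = lloyd_quantizer(data, 2)
# For k = 2 on N(0,1) the optimal prototypes are near +/- sqrt(2/pi) ~ +/- 0.80.
print([round(p, 2) for p in protos])
```
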

  7. The Multivariate Gaussian Probability Distribution

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2005-01-01

    This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practical ...

  8. Non-Gaussian Photon Probability Distribution

    International Nuclear Information System (INIS)

    This paper investigates the axiom that the photon's probability distribution is a Gaussian distribution. The Airy disc empirical evidence shows that the best-fit, if not exact, distribution is a modified Gamma mΓ distribution (whose parameters are α = r, β = r/√(u)) in the plane orthogonal to the motion of the photon. This modified Gamma distribution is then used to reconstruct the probability distributions along the hypotenuse from the pinhole, the arc from the pinhole, and a line parallel to photon motion. This reconstruction shows that the photon's probability distribution is not a Gaussian function. However, under certain conditions, the distribution can appear to be Normal, thereby accounting for the success of quantum mechanics. This modified Gamma distribution changes with the shape of objects around it and thus explains how the observer alters the observation. This property therefore places additional constraints on quantum entanglement experiments. This paper shows that photon interaction is a multi-phenomena effect consisting of the probability to interact, Pi (the probabilistic function), and the ability to interact, Ai (the electromagnetic function). Splitting the probability function Pi from the electromagnetic function Ai enables the investigation of photon behavior from a purely probabilistic Pi perspective. The Probabilistic Interaction Hypothesis is proposed as a consistent method for handling the two different phenomena, the probability function Pi and the ability to interact Ai, thus redefining radiation shielding, stealth or cloaking, and invisibility as different effects of a single phenomenon, Pi, of the photon probability distribution. Sub-wavelength photon behavior is successfully modeled as multi-phenomena behavior. The Probabilistic Interaction Hypothesis provides a good fit to Otoshi's (1972) microwave shielding, Schurig et al.'s (2006) microwave cloaking, and Oulton et al.'s (2008) sub-wavelength confinement; thereby providing a strong case that

  9. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    Hongyan SUN; Long WANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ function, asymmetric and symmetric type, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel normal number as the initial value.

  10. Lagrangian Probability Distributions of Turbulent Flows

    OpenAIRE

    Friedrich, R.

    2002-01-01

    We outline a statistical theory of turbulence based on the Lagrangian formulation of fluid motion. We derive a hierarchy of evolution equations for Lagrangian N-point probability distributions as well as a functional equation for a suitably defined probability functional which is the analog of Hopf's functional equation. Furthermore, we address the derivation of a generalized Fokker-Planck equation for the joint velocity-position probability density of N fluid particles.

  11. Asymmetry of the work probability distribution

    OpenAIRE

    Saha, Arnab; Bhattacharjee, J. K.

    2006-01-01

    We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.

  12. The Pauli equation for probability distributions

    International Nuclear Information System (INIS)

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  13. The Pauli equation for probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man' ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man' ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it

    2001-04-27

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  14. Learning a Probability Distribution Efficiently and Reliably

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.
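
    The core idea, learning an empirical distribution from samples and then drawing from it by inverting the cumulative distribution function, can be sketched as follows. This is a generic illustration, not the paper's CDF-Inversion Algorithm with its accuracy and confidence guarantees; all names and the toy distribution are hypothetical:

```python
import bisect
import itertools
import random

def learn_distribution(observations):
    """Empirical probability estimate over a finite set."""
    counts = {}
    for x in observations:
        counts[x] = counts.get(x, 0) + 1
    n = len(observations)
    return {x: c / n for x, c in counts.items()}

def cdf_inversion_sampler(dist):
    """Sample a discrete distribution by inverting its CDF."""
    items = sorted(dist)
    cdf = list(itertools.accumulate(dist[x] for x in items))
    def sample():
        u = random.random()
        # min() guards against u landing above cdf[-1] by rounding error.
        return items[min(bisect.bisect_left(cdf, u), len(items) - 1)]
    return sample

random.seed(1)
true = {"a": 0.5, "b": 0.3, "c": 0.2}
draw = cdf_inversion_sampler(true)
learned = learn_distribution([draw() for _ in range(20_000)])
print({k: round(v, 2) for k, v in sorted(learned.items())})
```
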

  15. The Pauli Equation for Probability Distributions

    OpenAIRE

    Mancini, S.; Man'ko, O. V.; Man'ko, V. I.; Tombesi, P.

    2000-01-01

    The "marginal" distributions for measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for the spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  16. The Pauli Equation for Probability Distributions

    CERN Document Server

    Mancini, S; Man'ko, V I; Tombesi, P

    2001-01-01

    The "marginal" distributions for measurable coordinate and spin projection are introduced. Then, the analog of the Pauli equation for the spin-1/2 particle is obtained for such probability distributions instead of the usual wave functions. This allows a classical-like approach to quantum mechanics. Some illuminating examples are presented.

  17. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. The class provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.

  18. Qualitative criterion for atom sputtering angle distributions

    International Nuclear Information System (INIS)

    A model is introduced to explain the shape of atom polar emission angle distributions for monocomponent targets sputtered by normally incident keV-energy ions. Analytical expressions are obtained from the model which make it possible to identify three known kinds of angle distributions, subcosinus, isotropic and supracosinus, for given ion energies and target-ion pairs. Furthermore, a fourth, hybrid false-isotropic distribution is found, which is a superposition of supracosinus and subcosinus distributions. The theoretical predictions of the angle distribution shapes agree with numerical modeling of the sputtering of carbon and platinum by 0.1-10 keV Ar+ ions

  19. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to "Sub-committee on...

  20. Probability distributions for offshore wind speeds

    International Nuclear Information System (INIS)

    In planning offshore wind farms, short-term wind speeds play a central role in estimating various engineering parameters, such as power output, extreme wind load, and fatigue load. Lacking wind speed time series of sufficient length, the probability distribution of wind speed serves as the primary substitute for data when estimating design parameters. It is common practice to model short-term wind speeds with the Weibull distribution. Using 10-min wind speed time series at 178 ocean buoy stations ranging from 1 month to 20 years in duration, we show that the widely-accepted Weibull distribution provides a poor fit to the distribution of wind speeds when compared with more complicated models. We compare distributions in terms of three different metrics: probability plot R2, estimates of average turbine power output, and estimates of extreme wind speed. While the Weibull model generally gives larger R2 than any other 2-parameter distribution, the bimodal Weibull, Kappa, and Wakeby models all show R2 values significantly closer to 1 than the other distributions considered (including the Weibull), with the bimodal Weibull giving the best fits. The Kappa and Wakeby distributions fit the upper tail (higher wind speeds) of a sample better than the bimodal Weibull, but may drastically over-estimate the frequency of lower wind speeds. Because the average turbine power is controlled by high wind speeds, the Kappa and Wakeby estimate average turbine power output very well, with the Kappa giving the least bias and mean square error out of all the distributions. The 2-parameter Lognormal distribution performs best for estimating extreme wind speeds, but still gives estimates with significant error. The fact that different distributions excel under different applications motivates further research on model selection based upon the engineering parameter of interest.
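
    A common moment-based way to fit the 2-parameter Weibull to wind speeds is the Justus power-law approximation for the shape k and scale c. This is a generic sketch with illustrative parameters, not the fitting or R2-comparison method used in the study:

```python
import math
import random

def weibull_fit_moments(speeds):
    """Moment-based Weibull shape/scale (Justus power-law approximation);
    a generic estimator, not the study's fitting procedure."""
    n = len(speeds)
    mean = sum(speeds) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in speeds) / (n - 1))
    k = (sd / mean) ** -1.086             # shape
    c = mean / math.gamma(1.0 + 1.0 / k)  # scale
    return k, c

# Synthetic check: inverse-transform Weibull samples, v = c * (-ln u)^(1/k).
random.seed(7)
k_true, c_true = 2.0, 8.0
sample = [c_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
          for _ in range(30_000)]
k_hat, c_hat = weibull_fit_moments(sample)
print(round(k_hat, 1), round(c_hat, 1))
```
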

  1. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions and the...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....... computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain...

  2. Evolution of the jet opening angle distribution in holographic plasma

    CERN Document Server

    Rajagopal, Krishna; van der Schee, Wilke

    2016-01-01

    We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates; (ii) the opening angle distribution for jets emerging from the plasma within any specified range of energies has been pushed toward smaller angles, compared to pp jets with the same energies. The second effect arises because small-angle jets suffer less energy loss and because jets with a higher initial energy are less probable in the ensemble. We illustrate both effects in a simple two-parameter model, and find that their consequence in sum is that the opening angle distribution for jets in any range of energies contains fewer narrow and wide jets. Either effect can dominate in the mean opening angle, for not unreasonable values o...

  3. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a conjured numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication
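
    For ln X ~ N(mu, sigma^2), the mode, median, mean, and a coverage interval symmetric in log space all have closed forms, which makes the anomaly discussed above easy to see numerically. The helper below is an illustrative sketch, not the communication's formalism:

```python
import math

def lognormal_summary(mu, sigma, z=1.96):
    """Closed-form summaries for X with ln X ~ N(mu, sigma^2), plus a
    coverage interval symmetric in log space."""
    return {
        "mode": math.exp(mu - sigma ** 2),
        "median": math.exp(mu),
        "mean": math.exp(mu + sigma ** 2 / 2.0),
        "interval_95": (math.exp(mu - z * sigma), math.exp(mu + z * sigma)),
    }

# With large sigma the three candidate 'best values' disagree badly,
# which is exactly why the interval is the more informative summary.
s = lognormal_summary(mu=0.0, sigma=2.0)
print(round(s["mode"], 3), round(s["median"], 3), round(s["mean"], 3))  # 0.018 1.0 7.389
```
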

  4. Joint probability distributions for projection probabilities of random orthonormal states

    International Nuclear Information System (INIS)

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)

  5. Joint probability distributions for projection probabilities of random orthonormal states

    Science.gov (United States)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
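
    For the unitary ensemble, the vector of projection probabilities of a random column is uniform on the probability simplex (a flat Dirichlet distribution), which is easy to check by simulation. The sketch below normalizes a complex Gaussian vector; it is an illustrative check, not the paper's analytic derivation:

```python
import random

def projection_probabilities(n, rng=random):
    """|c_i|^2 for one column of a Haar-random unitary, obtained by
    normalizing a complex standard-Gaussian vector; the result is
    uniformly distributed on the probability simplex."""
    re = [rng.gauss(0.0, 1.0) for _ in range(n)]
    im = [rng.gauss(0.0, 1.0) for _ in range(n)]
    w = [a * a + b * b for a, b in zip(re, im)]
    s = sum(w)
    return [x / s for x in w]

random.seed(3)
n = 4
trials = [projection_probabilities(n) for _ in range(20_000)]
mean_first = sum(t[0] for t in trials) / len(trials)
print(round(mean_first, 2))  # each component has mean 1/n = 0.25
```
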

  6. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p1, …, pd). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
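
    One standard way to draw an unbiased (uniform on the simplex) probability vector is via exponential spacings. The sketch below is a generic illustration of that goal, not a reproduction of the paper's iid, normalization, or trigonometric implementations:

```python
import math
import random

def random_prob_vector(d, rng=random):
    """Uniform probability vector on the simplex via exponential
    spacings: p_i = e_i / sum(e), with e_i ~ Exp(1).  (Normalizing
    uniform draws instead would bias vectors toward the centre of
    the simplex.)"""
    e = [-math.log(1.0 - rng.random()) for _ in range(d)]
    s = sum(e)
    return [x / s for x in e]

random.seed(11)
p = random_prob_vector(4)
print(len(p), all(x >= 0 for x in p), round(sum(p), 12))  # 4 True 1.0
```
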

  7. Constraints on probability distributions of grammatical forms

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandar

    2007-01-01

    In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm can be extended to other criteria as well; hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of the homogeneity of the probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian and for each paradigm the relative entropy was calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75-0.9. The nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
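
    The relative (normalized) entropy used here as a homogeneity measure can be computed from raw form frequencies as Shannon entropy divided by its maximum. A minimal sketch with made-up frequencies, not the study's corpus counts:

```python
import math

def relative_entropy(freqs):
    """Shannon entropy of the frequency distribution divided by its
    maximum (log2 of the number of forms): 1.0 for a perfectly
    homogeneous paradigm, approaching 0 as one form dominates."""
    total = sum(freqs)
    probs = [f / total for f in freqs if f > 0]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(len(freqs))

print(round(relative_entropy([25, 25, 25, 25]), 2))  # 1.0 (uniform paradigm)
print(round(relative_entropy([97, 1, 1, 1]), 2))     # much lower: one form dominates
```
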

  8. Supra-Bayesian Combination of Probability Distributions

    Czech Academy of Sciences Publication Activity Database

    Sečkárová, Vladimíra

    Veszprém : University of Pannonia, 2010, s. 112-117. ISBN 978-615-5044-00-7. [11th International PhD Workshop on Systems and Control. Veszprém (HU), 01.09.2010-03.09.2010] R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Supra-Bayesian approach * sharing of probabilistic information * Bayesian decision making Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2010/AS/seckarova-supra-bayesian combination of probability distributions.pdf

  9. Parametric probability distributions for anomalous change detection

    Energy Technology Data Exchange (ETDEWEB)

    Theiler, James P [Los Alamos National Laboratory; Foy, Bernard R [Los Alamos National Laboratory; Wohlberg, Brendt E [Los Alamos National Laboratory; Scovel, James C [Los Alamos National Laboratory

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.

  10. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M&O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M&O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M&O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be

  11. Audio feature extraction using probability distribution function

    Science.gov (United States)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field. It has also recently been used in biometric and multimedia information retrieval systems. This technology stems from successive research on audio feature extraction analysis. The Probability Distribution Function (PDF) is a statistical method which is usually used as one of the processes within complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed which uses the PDF alone as the feature extraction method for speech analysis purposes. Certain pre-processing techniques are performed prior to the proposed feature extraction. Subsequently, the PDF values for each frame of sampled voice signals, obtained from a number of individuals, are plotted. The experimental results show visually that each individual's voice has comparable PDF values and shapes.
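
    Using a per-frame normalized histogram directly as the feature vector, as the abstract describes, can be sketched as below. The frame size, bin count, and toy signal are assumptions for illustration, not the paper's settings:

```python
import math

def pdf_feature(frame, bins=16, lo=-1.0, hi=1.0):
    """Normalized amplitude histogram of one frame, used directly as the
    feature vector (a discrete estimate of the frame's PDF)."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in frame:
        i = int((x - lo) / width)
        counts[max(0, min(i, bins - 1))] += 1
    n = len(frame)
    return [c / n for c in counts]

# Toy 'voice frame': a 220 Hz sine sampled at 8 kHz.
frame = [math.sin(2 * math.pi * 220 * t / 8000) for t in range(256)]
feat = pdf_feature(frame)
print(len(feat), round(sum(feat), 6))  # 16 1.0
```
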

  12. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  13. Evolution of the Jet Opening Angle Distribution in Holographic Plasma.

    Science.gov (United States)

    Rajagopal, Krishna; Sadofyev, Andrey V; van der Schee, Wilke

    2016-05-27

    We use holography to analyze the evolution of an ensemble of jets, with an initial probability distribution for their energy and opening angle as in proton-proton (pp) collisions, as they propagate through an expanding cooling droplet of strongly coupled plasma as in heavy ion collisions. We identify two competing effects: (i) each individual jet widens as it propagates and (ii) because wide-angle jets lose more energy, energy loss combined with the steeply falling perturbative spectrum serves to filter wide jets out of the ensemble at any given energy. Even though every jet widens, jets with a given energy can have a smaller mean opening angle after passage through the plasma than jets with that energy would have had in vacuum, as experimental data may indicate. PMID:27284647
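
    The spectrum-filtering effect can be reproduced in a toy Monte Carlo: with a steeply falling initial spectrum and an energy loss that grows with opening angle, jets selected in a fixed final-energy window have a smaller mean angle than the unconditional (vacuum-like) ensemble. All functional forms and parameter values below are illustrative assumptions, not the holographic calculation:

```python
import random

random.seed(5)
SPECTRUM_POWER = 6.0  # toy dN/dE ~ E^-6, i.e. steeply falling
E_MIN = 50.0          # illustrative minimum initial energy
LOSS = 0.5            # toy fractional energy loss per unit opening angle

def initial_energy():
    # Inverse-transform sample from the E^-n spectrum above E_MIN.
    return E_MIN * (1.0 - random.random()) ** (-1.0 / (SPECTRUM_POWER - 1.0))

jets = []
for _ in range(200_000):
    theta = random.uniform(0.0, 0.5)  # initial opening angle (toy units)
    e_final = initial_energy() * (1.0 - LOSS * theta)  # wider jets lose more
    jets.append((theta, e_final))

# Quenched jets selected in a fixed *final*-energy window have a smaller
# mean opening angle than the unconditional (vacuum-like) mean of 0.25.
window = [th for th, ef in jets if 60.0 <= ef <= 70.0]
mean_theta = sum(window) / len(window)
print(round(mean_theta, 3), "< 0.25")
```

    In this toy model the angle distribution inside the window is proportional to (1 - LOSS*theta)^(SPECTRUM_POWER - 1), so the filtering strengthens as the spectrum steepens.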

  14. Dichotomous choice contingent valuation probability distributions

    OpenAIRE

    Kerr, Geoffrey N.

    2000-01-01

    Parametric distributions applied to dichotomous choice contingent valuation data invoke assumptions about the distribution of willingness to pay that may contravene economic theory. This article develops and applies distributions that allow the shape of bid distributions to vary. Alternative distributions provide little, if any, improvement in statistical fit from commonly used distributions. While median willingness to pay is largely invariant to distribution, estimates of mean consumer surp...

  15. How to Read Probability Distributions as Statements about Process

    OpenAIRE

    Frank, Steven A.

    2014-01-01

    Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken o...

  16. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment.

  17. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  18. Semi-stable distributions in free probability theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through free cumulant transform and develop the free semi-stability and domain of semi-stable attraction in free probability theory.

  19. Probability distributions in risk management operations

    CERN Document Server

    Artikis, Constantinos

    2015-01-01

    This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the developments, investigations, selections, and implementations of proactive and reactive risk management operations. The...

  20. Distribution of angles in hyperbolic lattices

    DEFF Research Database (Denmark)

    Risager, Morten Skarsholm; Truelsen, Jimi Lee

    2010-01-01

    We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result from the study by Boca.

  1. Distribution of Angles in Hyperbolic Lattices

    DEFF Research Database (Denmark)

    S. Risager, Morten; L. Truelsen, Jimi

    2008-01-01

    We prove an effective equidistribution result about angles in a hyperbolic lattice. We use this to generalize a result due to F. P. Boca.

  2. Negative Binomial and Multinomial States: probability distributions and coherent states

    OpenAIRE

    Fu, Hong-Chen; Sasaki, Ryu

    1996-01-01

    Following the relationship between probability distributions and coherent states (for example, the well-known pairing of the Poisson distribution with the ordinary coherent states, and the relatively less known pairing of the binomial distribution with the $su(2)$ coherent states), we propose an interpretation of $su(1,1)$ and $su(r,1)$ coherent states in terms of probability theory. They will be called the "negative binomial" ("multinomial") states, which correspond to the negative binomial (multinomial)...

  3. Some explicit expressions for the probability distribution of force magnitude

    Indian Academy of Sciences (India)

    Saralees Nadarajah

    2008-08-01

    Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the first time, I provide explicit expressions for the probability distribution of the force magnitude. Both two-dimensional and three-dimensional cases are considered.

  4. Characteristic function and Spitzer's law for the winding angle distribution of planar brownian curves

    International Nuclear Information System (INIS)

    Using the analogy between Brownian motion and quantum mechanics, we study the winding angle θ of planar Brownian curves around a given point, say the origin O. In particular, we compute the characteristic function of the probability distribution of θ and recover Spitzer's law in the limit of infinitely large times. Finally, we study the (large) change in the winding angle distribution when we add a repulsive potential at the origin.
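    As an illustration of the quantity being studied (not of the authors' analytical method, which works through the characteristic function), the winding angle can be estimated by direct Monte Carlo simulation of discretized planar Brownian paths. All numerical parameters below are illustrative choices:

```python
import numpy as np

def winding_angles(n_paths=200, n_steps=2000, dt=1e-3, seed=1):
    """Monte Carlo winding angles of planar Brownian paths around the origin.

    Paths start at (1, 0). The polar angle along each path is unwrapped so
    that its final value is the total winding angle theta. Caveat: a finite
    step size underestimates fast windings very close to the origin, so this
    is only a rough numerical picture of the distribution.
    """
    rng = np.random.default_rng(seed)
    angles = np.empty(n_paths)
    for k in range(n_paths):
        steps = rng.normal(0.0, np.sqrt(dt), size=(n_steps, 2))
        path = np.cumsum(steps, axis=0) + np.array([1.0, 0.0])
        phi = np.arctan2(path[:, 1], path[:, 0])
        # unwrap removes the 2*pi jumps, giving a continuous angle history
        angles[k] = np.unwrap(np.concatenate(([0.0], phi)))[-1]
    return angles

theta = winding_angles()  # symmetric, heavy-tailed sample of winding angles
```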

  5. Net baryon number probability distribution near the chiral phase transition

    OpenAIRE

    Morita, Kenji; Skokov, Vladimir; Friman, Bengt; Redlich, Krzysztof

    2014-01-01

    We discuss the properties of the net baryon number probability distribution near the chiral phase transition to explore the effect of critical fluctuations. Our studies are performed within Landau theory, where the coefficients of the polynomial potential are parametrized, so as to reproduce the mean-field (MF), the Z(2) , and the O(4) scaling behaviors of the cumulants of the net baryon number. We show that in the critical region the structure of the probability distribution changes, dependi...

  6. Information-theoretic methods for estimating complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  7. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Science.gov (United States)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    2014-06-01

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are given: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  8. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are given: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  9. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    International Nuclear Information System (INIS)

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined as fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are given: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  10. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis. PMID:24885680
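    The paper's general result uses landscape theory and Krawtchouk polynomials; for the Onemax case alone the distribution can be written down elementarily, since a string with k ones loses X ~ Bin(k, p) ones and gains Y ~ Bin(n-k, p) under uniform bit-flip mutation. A sketch of that special case (not the paper's general machinery):

```python
from math import comb

def binom_pmf(m, p):
    """pmf of Bin(m, p) as a list indexed by the number of successes."""
    return [comb(m, j) * p**j * (1 - p)**(m - j) for j in range(m + 1)]

def onemax_fitness_dist(n, k, p):
    """Exact fitness distribution of an n-bit Onemax string with k ones
    after flipping each bit independently with probability p.

    New fitness = k - X + Y with X ~ Bin(k, p), Y ~ Bin(n-k, p);
    each entry of the result is a polynomial in p, as the paper proves
    in general.
    """
    px, py = binom_pmf(k, p), binom_pmf(n - k, p)
    dist = [0.0] * (n + 1)
    for x, pxv in enumerate(px):
        for y, pyv in enumerate(py):
            dist[k - x + y] += pxv * pyv
    return dist

d = onemax_fitness_dist(n=10, k=6, p=0.1)
# E[new fitness] = k(1-p) + (n-k)p = 6*0.9 + 4*0.1 = 5.8
```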

  11. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Ginestra Bianconi

    2008-06-01

    The structural entropy is the entropy of the ensemble of uncorrelated networks with a given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (or half-edges) randomly through the nodes of the network while keeping the structural entropy fixed. This degree distribution is found to decay as a Poisson distribution when the entropy is maximized and to have a power-law tail with an exponent γ → 2 when the entropy is minimized.

  12. Differentiable, multi-dimensional, knowledge-based energy terms for torsion angle probabilities and propensities.

    Science.gov (United States)

    Amir, El-Ad David; Kalisman, Nir; Keasar, Chen

    2008-07-01

    Rotatable torsion angles are the major degrees of freedom in proteins. Adjacent angles are highly correlated and energy terms that rely on these correlations are intensively used in molecular modeling. However, the utility of torsion based terms is not yet fully exploited. Many of these terms do not capture the full scale of the correlations. Other terms, which rely on lookup tables, cannot be used in the context of force-driven algorithms because they are not fully differentiable. This study aims to extend the usability of torsion terms by presenting a set of high-dimensional and fully-differentiable energy terms that are derived from high-resolution structures. The set includes terms that describe backbone conformational probabilities and propensities, side-chain rotamer probabilities, and an elaborate term that couples all the torsion angles within the same residue. The terms are constructed by cubic spline interpolation with periodic boundary conditions that enable full differentiability and high computational efficiency. We show that the spline implementation does not compromise the accuracy of the original database statistics. We further show that the side-chain relevant terms are compatible with established rotamer probabilities. Despite their very local characteristics, the new terms are often able to identify native and native-like structures within decoy sets. Finally, force-based minimization of NMR structures with the new terms improves their torsion angle statistics with minor structural distortion (0.5 Å RMSD on average). The new terms are freely available in the MESHI molecular modeling package. The spline coefficients are also available as a documented MATLAB file. PMID:18186478

  13. ProbOnto: ontology and knowledge base of probability distributions

    Science.gov (United States)

    Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala

    2016-01-01

    Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists nor any database allowing programmatic access. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608

  14. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  15. NORMALLY DISTRIBUTED PROBABILITY MEASURE ON THE METRIC SPACE OF NORMS

    Institute of Scientific and Technical Information of China (English)

    Á.G. HORVÁTH

    2013-01-01

    In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, first, we introduce the notion of thinness of a body. Then we show the existence of a measure with the property that its pushforward by the thinness function is a probability measure of truncated normal distribution. Finally, we improve this method to find a measure satisfying some important properties in geometric measure theory.

  16. Uniform distribution of initial states: The physical basis of probability

    Science.gov (United States)

    Zhang Kechen

    1990-02-01

    For repetitive experiments performed on a deterministic system with initial states restricted to a certain region in phase space, the relative frequency of an event has a definite value, insensitive to the preparation of the experiments, only if the initial states leading to that event are distributed uniformly in the prescribed region. Mechanical models of coin tossing and roulette spinning, and the equal a priori probability hypothesis in statistical mechanics, are considered in the light of this principle. Probabilities that arise from uniform distributions of initial states do not necessarily submit to Kolmogorov's axioms of probability. In the finite-dimensional case, a uniform distribution in phase space, either in the coarse-grained sense or in the limit sense, can be formulated in a unified way.

  17. A probable probability distribution of a series nonequilibrium states in a simple system out of equilibrium

    Science.gov (United States)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state, passing through a series of nonequilibrium states on the way. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution over these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The most probable state is the equilibrium state, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability; the same conclusion also holds in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can persist, then the velocity with which a nonequilibrium state returns to equilibrium can be determined through the reciprocal of the derivative of this probability. This tells us that the farther the state is from equilibrium, the faster the returning velocity; as the system nears its equilibrium state, the velocity becomes smaller and smaller, finally tending to 0 when the equilibrium state is reached.

  18. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available
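    As a small illustration of the "fitting continuous distributions to data" step, the method of moments for a gamma distribution reduces to two closed-form expressions (the gamma choice, parameter values and sample size here are illustrative, not taken from the report):

```python
import numpy as np

def gamma_method_of_moments(data):
    """Fit a gamma distribution by the method of moments.

    Matching the sample mean and variance to the gamma mean (a*b) and
    variance (a*b**2) gives shape a = mean**2 / var and scale b = var / mean.
    """
    mean, var = np.mean(data), np.var(data)
    return mean**2 / var, var / mean

rng = np.random.default_rng(42)
sample = rng.gamma(shape=2.0, scale=3.0, size=50_000)  # synthetic "data"
a, b = gamma_method_of_moments(sample)                 # a ~ 2, b ~ 3
```

Maximum likelihood or least-squares fits, as surveyed in the report, would replace the moment-matching step with numerical optimization.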

  19. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  20. Augmenting momentum resolution with well tuned probability distributions

    CERN Document Server

    Landi, Gregorio

    2016-01-01

    The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimations, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard (least-squares) fits using two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase in magnetic field and signal-to-noise ratio required for the standard-fit reconstructions to overlap with our best distributions. The data and the simulations are tuned on the tracker of a running experiment and its double-sided microstrip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor 1.5 for the least squares based on the eta-a...

  1. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data which will provide a clue on the likely candidates for the best fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO) and Generalized Pareto (GPA), the Lognormal (GNO) and the Pearson (PE3) distributions are evaluated. The method of L-moments is used in parameter estimation. Based on several goodness of fit tests and L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best fitted distribution to represent the weekly and monthly maximum share returns in Malaysia stock market during the studied period, respectively.

  2. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel-time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
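    The convolution step mentioned above can be sketched in a few lines. The pmfs below are invented toy numbers, and the two links are treated as independent, whereas the study's contribution is precisely to replace that independence assumption with link-to-link conditional distributions:

```python
import numpy as np

# Discretized travel-time pmfs for two successive links (1-minute bins).
# Toy numbers for illustration only.
link1 = np.array([0.1, 0.5, 0.3, 0.1])  # support: 10..13 minutes
link2 = np.array([0.2, 0.6, 0.2])       # support: 5..7 minutes

# Under independence, the route travel-time pmf is the convolution of the
# link pmfs; its support starts at 10 + 5 = 15 minutes.
route = np.convolve(link1, link2)       # support: 15..20 minutes
```

With correlated links, each column of `route` would instead be built from the conditional pmf of the downstream link given the upstream travel time.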

  3. Probability Measure of Navigation Pattern Prediction using Poisson Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Dr. V. Valli Mayil

    2012-06-01

    The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web Usage Mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data collected by web servers contain information about every click a user makes on the web documents of the site. This log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test web application performance. The analysis of large volumes of click-stream data demands the employment of data mining methods, and conducting data mining on web server logs involves determining frequently occurring access sequences. The Poisson distribution gives the frequency probability of specific events when the average probability of a single occurrence is known; it is a discrete distribution, used in this paper to find the probability that a particular page is visited by the user.
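    The Poisson probability referred to above is the standard pmf, P(K = k) = λ^k e^(−λ) / k!. For example (illustrative numbers, not from the paper), if a page averages λ = 2 visits per session, the chance of exactly three visits in a session is:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson(lam) count, e.g. visits to a page per session."""
    return lam**k * exp(-lam) / factorial(k)

p3 = poisson_pmf(3, 2.0)  # ≈ 0.180
```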

  4. Probability distribution function for a solid with vacancies

    OpenAIRE

    Metlov, Leonid S.

    2011-01-01

    An expression for the probability distribution is obtained that takes into account the presence and removal of degeneracy of the microstates. Its application makes it possible to describe the melting of solids as a saltatory phase transition of the first kind without invoking the concept of an order parameter.

  5. Contact pressure distribution and support angle optimization of kiln tyre

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    According to the shear force characteristics and the deformation compatibility condition of the shell at the supports, mathematical models were set up to calculate the contact angle and the contact pressure distribution between tyre and shell, and formulae for the bending moment and bending stress of the tyre were obtained. Taking the maximum tyre fatigue life as the optimization objective, an optimization model for the tyre support angle was built. The computational results show that when the tyre support angle is 30°, tyre life is far less than when the support angle takes its optimal value of 35.6°, so it is unsuitable to stipulate a 30° tyre support angle as in traditional design. The larger the load, the smaller the nominal stress amplitude increment of the tyre, and the more favorable the tyre fatigue life when the support angle is optimal.

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S{sub st}, S{sub st}, p{sub st}) for stochastic uncertainty, a probability space (S{sub su}, S{sub su}, p{sub su}) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S{sub st}, S{sub st}, p{sub st}) and (S{sub su}, S{sub su}, p{sub su}). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
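The complementary cumulative distribution functions (CCDFs) central to such performance assessments can be estimated empirically from Monte Carlo output. A minimal sketch, with a hypothetical log-normal consequence measure standing in for the WIPP results:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical consequence measure (e.g., a normalized release) from
# 1000 Monte Carlo realizations of stochastic uncertainty.
samples = rng.lognormal(mean=0.0, sigma=1.0, size=1000)

def ccdf(samples, x):
    """Empirical complementary CDF: the probability P(X > x)."""
    return np.mean(np.asarray(samples) > x)

p_exceed = ccdf(samples, 1.0)  # fraction of realizations above 1.0
```

In the two-probability-space framing above, one such CCDF is computed per realization of the subjective uncertainty, yielding a family of CCDF curves rather than a single one.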

  7. Lower and upper probabilities in the distributive lattice of subsystems

    International Nuclear Information System (INIS)

    The set of subsystems Σ(m) of a finite quantum system Σ(n) (with variables in Z(n)) together with logical connectives, is a distributive lattice. With regard to this lattice, the ℓ(m| ρn)=Tr(P(m)ρn) (where P(m) is the projector to Σ(m)) obeys a supermodularity inequality, and it is interpreted as a lower probability in the sense of the Dempster–Shafer theory, and not as a Kolmogorov probability. It is shown that the basic concepts of the Dempster–Shafer theory (lower and upper probabilities and the Dempster multivaluedness) are pertinent to the quantum formalism of finite systems. (paper)

  8. Unitary equilibrations: probability distribution of the Loschmidt echo

    CERN Document Server

    Venuti, Lorenzo Campos

    2009-01-01

    Closed quantum systems evolve unitarily and therefore cannot converge in a strong sense to an equilibrium state starting out from a generic pure state. Nevertheless for large system size one observes temporal typicality. Namely, for the overwhelming majority of the time instants, the statistics of observables is practically indistinguishable from an effective equilibrium one. In this paper we consider the Loschmidt echo (LE) to study this sort of unitary equilibration after a quench. We draw several conclusions on general grounds and on the basis of an exactly-solvable example of a quasi-free system. In particular we focus on the whole probability distribution of observing a given value of the LE after waiting a long time. Depending on the interplay between the initial state and the quench Hamiltonian, we find different regimes reflecting different equilibration dynamics. When the perturbation is small and the system is away from criticality the probability distribution is Gaussian. However close to criticali...

  9. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  10. Probability Distribution Function Evolution for Binary Alloy Solidification

    Energy Technology Data Exchange (ETDEWEB)

    Steinzig, M.L.; Harlow, F.H.

    1999-02-26

    The thermally controlled solidification of a binary alloy, nucleated at isolated sites, is described by the evolution of a probability distribution function, whose variables include grain size and distance to nearest neighbor, together with descriptors of shape, orientation, and such material properties as orientation of nonisotropic elastic modulus and coefficient of thermal expansion. The relevant Liouville equation is described and coupled with global equations for energy and solute transport. Applications are discussed for problems concerning nucleation and impingement and the consequences for final size and size distribution. The goal of this analysis is to characterize the grain structure of the solidified casting and to enable the description of its probable response to thermal treatment, machining, and the imposition of mechanical insults.

  11. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  12. Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions

    OpenAIRE

    Lancic, Alen; Antulov-Fantulin, Nino; Sikic, Mile; Stefancic, Hrvoje

    2009-01-01

    The disease spreading on complex networks is studied in the SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on an m-ary tree is investigated both analytically and in simulations.

  13. Testing for the maximum cell probabilities in multinomial distributions

    Institute of Scientific and Technical Information of China (English)

    XIONG; Shifeng

    2005-01-01

    This paper investigates one-sided hypothesis testing for p[1], the largest cell probability of the multinomial distribution. A small-sample test of Ethier (1982) is extended to the general case. Based on an estimator of p[1], a class of large-sample tests is proposed. The asymptotic power of the above tests under local alternatives is derived. An example is presented at the end of this paper.
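A large-sample test of this kind can be sketched as follows. The cell counts, the null value 0.2, and the plain normal approximation for the estimator of p[1] are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def max_cell_test(counts, p0):
    """One-sided large-sample z test of H0: p[1] <= p0, where p[1] is
    the largest cell probability, using a plain normal approximation
    for the maximum-cell estimator (an illustrative choice)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p1_hat = counts.max() / n          # estimator of the largest cell probability
    se = np.sqrt(p1_hat * (1.0 - p1_hat) / n)
    return p1_hat, (p1_hat - p0) / se  # reject H0 for large z

# Hypothetical multinomial data: 6 cells, 600 draws.
p1_hat, z = max_cell_test([150, 90, 100, 95, 80, 85], p0=0.2)
```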

  14. Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing

    OpenAIRE

    Rhim, Joong Bum; Varshney, Lav R.; GOYAL, VIVEK K.

    2011-01-01

    This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bay...

  15. Analytical theory of the probability distribution function of structure formation

    OpenAIRE

    Anderson, Johan; Kim, Eun-Jin

    2009-01-01

    The probability distribution function (PDF) tails of the zonal flow structure formation and the PDF tails of momentum flux by incorporating effect of a shear flow in ion-temperature-gradient (ITG) turbulence are computed in the present paper. The bipolar vortex soliton (modon) is assumed to be the coherent structure responsible for bursty and intermittent events driving the PDF tails. It is found that stronger zonal flows are generated in ITG turbulence than Hasegawa-Mima (HM) turbulence as w...

  16. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  17. Wide angle near-field diffraction and Wigner distribution

    CERN Document Server

    Almeida, J B

    2003-01-01

    Free-space propagation can be described as a shearing of the Wigner distribution function in the spatial coordinate; this shearing is linear in paraxial approximation but assumes a more complex shape for wide-angle propagation. Integration in the frequency domain allows the determination of near-field diffraction, leading to the well known Fresnel diffraction when small angles are considered and allowing exact prediction of wide-angle diffraction. The authors use this technique to demonstrate evanescent wave formation and diffraction elimination for very small apertures.

  18. The latitude dependence and probability distribution of polar mesospheric turbulence

    Directory of Open Access Journals (Sweden)

    M. Rapp

    2006-11-01

    Full Text Available We consider in-situ observations and results from a global circulation model to study the latitude dependence and probability distribution of polar mesospheric turbulence. A comparison of summer observations at 69° N and 79° N shows that mesospheric turbulence weakens towards the summer pole. Furthermore, these data suggest that at both latitudes, in about 70% of all samples, there are non-turbulent altitude bins in the considered altitude range between 70 and 95 km. The remaining 30% with detectable turbulence show an approximately log-normal distribution of dissipation rates. A low-resolution model version with a gravity wave (GW) parameterization explains the observed latitude dependence as a consequence of a downshift of the breaking levels towards the summer pole and an accompanying decay of turbulent heating per unit mass. When we do not use a GW parameterization but instead employ a high spatial resolution to simulate GW effects explicitly, the model predicts a similar latitudinal dependence, with turbulence weakening towards the summer pole. In addition, the model also produces a log-normal distribution of dissipation rates. The simulated probability distribution is narrower than the observed one, since the model resolves at most mid-frequency GWs, whereas real turbulence is also excited by smaller-scale disturbances. The GW-resolving simulation suggests a weaker tropospheric GW source at polar latitudes as the dominant mechanism for the latitudinal dependence.
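The log-normal fit of dissipation rates mentioned above amounts to estimating the mean and standard deviation of the log-transformed data. A minimal sketch with synthetic samples (the parameter values are invented, not the observed ones):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic turbulent-dissipation samples (arbitrary units); the
# log-mean and log-std below are invented for illustration.
eps = rng.lognormal(mean=-2.0, sigma=0.8, size=5000)

# Maximum-likelihood log-normal fit: mean and std of log(eps).
mu_hat = np.log(eps).mean()
sigma_hat = np.log(eps).std()
```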

  19. The Probability Distribution Model of Wind Speed over East Malaysia

    Directory of Open Access Journals (Sweden)

    Nurulkamal Masseran

    2013-07-01

    Full Text Available Many studies have found that wind speed is the most significant parameter of wind power. Thus, an accurate determination of the probability distribution of wind speed is an important step before estimating the wind energy potential of a particular region. Using an accurate distribution minimizes the uncertainty in wind resource estimates and improves the site-assessment phase of planning. In general, different regions have different wind regimes, so different wind distributions can be expected for different regions. Because it is reasonable to consider that wind regimes vary across the regions of a particular country, nine different statistical distributions were fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia for the period from 2000 to 2009. The values of the Kolmogorov-Smirnov statistic, Akaike's Information Criterion, the Bayesian Information Criterion and the R2 correlation coefficient were compared across the distributions to determine the best fit for describing the observed data. The Gamma and Burr distributions gave a good fit for most of the stations in East Malaysia, though no clear pattern was observed across all regions. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.
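The model-comparison procedure described (fitting several candidate families and ranking them by goodness-of-fit statistics) can be sketched with scipy. The synthetic "wind speed" data and the three candidate families below are illustrative; the study itself fitted nine distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic stand-in for mean hourly wind speeds (m/s).
speeds = rng.gamma(shape=2.0, scale=1.5, size=2000)

candidates = {"gamma": stats.gamma,
              "weibull_min": stats.weibull_min,
              "lognorm": stats.lognorm}
results = {}
for name, dist in candidates.items():
    params = dist.fit(speeds, floc=0)              # fix location at zero
    loglik = np.sum(dist.logpdf(speeds, *params))
    # Each family here has the same parameter count, so the AIC
    # ranking reduces to comparing log-likelihoods.
    aic = 2 * len(params) - 2 * loglik
    ks = stats.kstest(speeds, name, args=params).statistic
    results[name] = {"aic": aic, "ks": ks}

best = min(results, key=lambda k: results[k]["aic"])  # lowest AIC wins
```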

  20. Winding angle distributions for two-dimensional collapsing polymers

    Science.gov (United States)

    Narros, Arturo; Owczarek, Aleksander L.; Prellberg, Thomas

    2016-01-01

    We provide numerical support for a long-standing prediction of universal scaling of winding angle distributions. Simulations of interacting self-avoiding walks show that the winding angle distribution for N-step walks is compatible with the theoretical prediction of a Gaussian with a variance growing asymptotically as C log N, with C = 2 in the swollen phase (previously verified), and C = 24/7 at the θ-point. At low temperatures weaker evidence demonstrates compatibility with the same scaling and a value of C = 4 in the collapsed phase, also as theoretically predicted.

  1. Mathematical simulation of gamma-radiation angle distribution measurements

    International Nuclear Information System (INIS)

    We developed a mathematical model of the facility for measuring the angular distribution of gamma radiation and calculated response functions for gamma-radiation intensities. We also developed special software for experimental data processing, for unfolding the radiation spectra of the 'Shelter' object, and for estimating the angular resolution of the Sphere detector (ShD). A neural-network method for detecting the radiation directions is presented. Software based on the neural-network algorithm yields a reliable distribution of the gamma sources contributing to the facility's detectors at the measurement point. 10 refs.; 15 figs.; 4 tab

  2. Effect of Air Outlet Angle on Air Distribution Performance Index

    Directory of Open Access Journals (Sweden)

    Isbeyeh W. Maid

    2013-05-01

    Full Text Available In this paper a numerical study of the velocity and temperature distribution in an air-conditioned space has been made. The computational model consists of the non-isothermal 3-D turbulent flow with the (k-ε) model. The numerical study simulates air distribution in an air-conditioned room with real interior dimensions (6×4×3 m) and analyzes the effect of changing the angle of the grille vanes on the flow pattern, velocity, and temperature distribution in the room under a set of different conditions, with a supply air temperature of 16˚C, to examine the final result on the air distribution performance index (ADPI). The results show a significant effect of changing the supply air angle: the maximum air distribution performance index (ADPI) is 52% when the air change per hour (ACH) is equal to 10 at 16˚C inlet temperature with the vanes angled 15˚ down, and the minimum value of (ADPI) is 20% when ACH is equal to 15 at 16˚C inlet temperature and a vane angle of ( ) degrees.

  3. Monsoonal differences and probability distribution of PM(10) concentration.

    Science.gov (United States)

    Md Yusof, Noor Faizah Fitri; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Sansuddin, Nurulilyana; Ghazali, Nurul Adyani; Al Madhoun, Wesam

    2010-04-01

    There are many factors that influence PM(10) concentration in the atmosphere. This paper examines PM(10) concentration in relation to the wet season (northeast monsoon) and the dry season (southwest monsoon) in Seberang Perai, Malaysia from 2000 to 2004. PM(10) is expected to peak during the southwest monsoon, when the weather becomes dry, and this study confirmed that the highest PM(10) concentrations in 2000 to 2004 were recorded in that monsoon. Two probability distributions, Weibull and lognormal, were used to model the PM(10) concentration, and the best model for prediction was selected based on performance indicators. The lognormal distribution represents the data better than the Weibull distribution for 2000, 2001, and 2002; however, for 2003 and 2004, the Weibull distribution represents the data better. The proposed distributions were successfully used to estimate exceedances and predict return periods for subsequent years. PMID:19365611
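Once a distribution is fitted, exceedance probabilities and return periods follow directly. A sketch for a log-normal model with hypothetical PM(10) parameters (not the paper's fitted values):

```python
import math

def lognormal_exceedance(x, mu, sigma):
    """P(X > x) for a log-normal with log-mean mu and log-std sigma."""
    z = (math.log(x) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # standard normal tail

# Invented parameters: median 40 ug/m3, log-std 0.5.
mu, sigma = math.log(40.0), 0.5
p = lognormal_exceedance(150.0, mu, sigma)  # P(PM10 > 150 ug/m3) on a given day
return_period_days = 1.0 / p                # mean recurrence interval
```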

  4. The probability distribution associated with the dichotomic Markov process

    International Nuclear Information System (INIS)

    We have calculated the probability distribution function for the dichotomic Markov process using the work of Pomraning [Linear kinetic theory and particle transport in stochastic mixtures. Singapore: World Scientific; 1991] and have studied the rate of convergence to this exact form as the number of terms in a series approximation is increased. Each term in the series involves an additional stochastic moment in the hierarchy of moments. It is observed that convergence is fast near the source but, as the distance from the source increases, more and more moments are required to obtain a specified accuracy

  5. Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions

    CERN Document Server

    Lancic, Alen; Sikic, Mile; Stefancic, Hrvoje

    2009-01-01

    The disease spreading on complex networks is studied in SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on m-ary tree is investigated both analytically and in simulations. It is shown that the model reproduces qualitative features of phase diagrams of disease spreading observed in empirical complex networks. The role of tree-like structure of complex networks in disease spreading is discussed.

  6. Brownian Motion on a Sphere: Distribution of Solid Angles

    OpenAIRE

    Krishna, M. M. G.; Samuel, Joseph; Sinha, Supurna

    2000-01-01

    We study the diffusion of Brownian particles on the surface of a sphere and compute the distribution of solid angles enclosed by the diffusing particles. This function describes the distribution of geometric phases in two state quantum systems (or polarised light) undergoing random evolution. Our results are also relevant to recent experiments which observe the Brownian motion of molecules on curved surfaces like micelles and biological membranes. Our theoretical analysis agrees well with the...

  7. BOND-ANGLE DISTRIBUTION FUNCTIONS IN METALLIC GLASSES

    OpenAIRE

    Hafner, J.

    1985-01-01

    Bond-angle distribution functions have been calculated for realistic models of metallic glasses. They suggest a defected icosahedral short-range bond-orientational order and a close analogy of the short-range topological order in the amorphous and in the crystalline states.

  8. Probability Distribution Function of the Upper Equatorial Pacific Current Speeds

    Science.gov (United States)

    Chu, P. C.

    2008-12-01

    The probability distribution function (PDF) of the upper (0-50 m) tropical Pacific current speeds (w), constructed from hourly ADCP data (1990-2007) at six stations of the TOGA-TAO project, satisfies the two-parameter Weibull distribution reasonably well, with different characteristics between El Nino and La Nina events: in the western Pacific, the PDF of w has a larger peakedness during La Nina events than during El Nino events, and vice versa in the eastern Pacific. However, the PDF of w for the lower layer (100-200 m) does not fit the Weibull distribution as well as that of the upper layer. This is due to the different stochastic differential equations governing the upper and lower layers in the tropical Pacific. For the upper layer, the stochastic differential equations, established on the basis of Ekman dynamics, have an analytical solution for constant eddy viscosity K, namely the Rayleigh distribution (the simplest form of the Weibull distribution). Knowledge of the PDF of w during El Nino and La Nina events will improve ensemble horizontal flux calculations, which contribute to climate studies.
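The Rayleigh limit cited above, the shape-2 special case of the Weibull family, can be illustrated numerically: if the two horizontal velocity components are independent zero-mean Gaussians, the speed is Rayleigh distributed. The component standard deviation below is an invented value:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 0.3  # std of each velocity component (m/s), invented
# Independent Gaussian horizontal velocity components (the
# constant-eddy-viscosity Ekman argument cited in the record).
u = rng.normal(0.0, sigma, 100_000)
v = rng.normal(0.0, sigma, 100_000)
w = np.hypot(u, v)  # current speed, Rayleigh distributed

# Rayleigh facts: mean = sigma*sqrt(pi/2), median = sigma*sqrt(2*ln 2).
mean_w = w.mean()
median_w = np.median(w)
```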

  9. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given Lp norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the Lp norm. The most interesting results are obtained and plotted for unconstrained (real-valued) continuous random variables and for integer-valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight-line relationship between the maximum differential entropy and the logarithm of the Lp norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed-form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer-valued discrete random variables are obtained by applying the differential-entropy results to continuous random variables which approximate the integer-valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.

  10. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics.

  11. Probability Distribution Function of Passive Scalars in Shell Models

    Science.gov (United States)

    Zhang, Xiao-Qiang; Wang, Guang-Rui; Chen, Shi-Gang

    2008-04-01

    A shell-model version of the passive scalar problem is introduced, inspired by the model of K. Ohkitani and M. Yakhot [K. Ohkitani and M. Yakhot, Phys. Rev. Lett. 60 (1988) 983; K. Ohkitani and M. Yakhot, Prog. Theor. Phys. 81 (1988) 329]. As in the original problem, the prescribed random velocity field is Gaussian and δ-correlated in time. The deterministic differential equations are regarded as a nonlinear Langevin equation. The Fokker-Planck equations for the PDF of the passive scalars are then obtained and solved numerically. In the energy input range (small shell index n), the probability distribution function (PDF) of the passive scalars is near the Gaussian distribution. In the inertial range (5 ≤ n ≤ 17), the PDF of the passive scalars shows obvious intermittency, and the scaling exponents of the passive scalar are anomalous. The results of the numerical simulations are compared with experimental measurements.

  12. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
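A histogram on a coarse angular grid is the simplest of the density-estimation choices discussed above. A sketch using uniform synthetic (phi, psi) pairs in place of real dihedral data:

```python
import numpy as np

rng = np.random.default_rng(5)
# Uniform synthetic dihedral pairs in degrees; real input would be
# (phi, psi) angles from a curated loop data set.
phi = rng.uniform(-180.0, 180.0, 10_000)
psi = rng.uniform(-180.0, 180.0, 10_000)

edges = np.arange(-180, 181, 30)             # 12 bins of 30 degrees each
hist, _, _ = np.histogram2d(phi, psi, bins=[edges, edges])
prob = hist / hist.sum()                     # joint probability per 30x30 bin
```

Real Ramachandran densities are far from uniform, which is why the record's hierarchical Dirichlet process estimator, with smoothing and sharing across neighbor types, outperforms raw histograms.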

  13. Diachronic changes in word probability distributions in daily press

    Directory of Open Access Journals (Sweden)

    Stanković Jelena

    2006-01-01

    Full Text Available Changes in the probability distributions of individual words and word types were investigated in two samples of the daily press spanning fifty years. One sample was derived from the Corpus of Serbian Language (CSL /Kostić, Đ., 2001/), which covers the period between 1945 and 1957, and the other from the Ebart Media Documentation (EBR), which was compiled from seven daily newspapers and five weekly magazines from 2002 and 2003. Each sample consisted of about 1 million words. The obtained results indicate that nouns and adjectives were more frequent in the CSL, while verbs and prepositions are more frequent in the EBR sample, suggesting a decrease in sentence length over the last five decades. Conspicuous changes in the probability distribution of individual words were observed for nouns and adjectives, while minimal or no changes were observed for verbs and prepositions. Such an outcome suggests that nouns and adjectives are most susceptible to diachronic changes, while verbs and prepositions appear to be resistant to such changes.

  14. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    Science.gov (United States)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question: what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly

  15. Non-Gaussian probability distributions of solar wind fluctuations

    Directory of Open Access Journals (Sweden)

    E. Marsch

    Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scales τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.
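The excess kurtosis of the increments ∆x(τ) as a function of lag is a standard diagnostic for the intermittency described above. The sketch below applies it to a Brownian control signal, whose increments are Gaussian at every lag, so the statistic stays near zero; intermittent solar wind data would instead show it growing at small τ:

```python
import numpy as np

def increment_kurtosis(x, tau):
    """Excess kurtosis of the differences dx(tau) = x(t+tau) - x(t);
    zero for Gaussian increments, positive for spiky, intermittent ones."""
    dx = x[tau:] - x[:-tau]
    dx = dx - dx.mean()
    m2 = np.mean(dx**2)
    m4 = np.mean(dx**4)
    return m4 / m2**2 - 3.0

rng = np.random.default_rng(4)
# Brownian control signal: increments are Gaussian at every lag.
x = np.cumsum(rng.normal(size=200_000))
k_small = increment_kurtosis(x, 2)    # short lag
k_large = increment_kurtosis(x, 500)  # long lag (fewer independent samples)
```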

  16. Probability distribution function for reorientations in Maier-Saupe potential

    Science.gov (United States)

    Sitnitsky, A. E.

    2016-06-01

    An exact analytic solution for the probability distribution function of the non-inertial rotational diffusion equation, i.e., of the Smoluchowski one, in a symmetric Maier-Saupe uniaxial potential of mean torque is obtained via the confluent Heun function. Both the ordinary Maier-Saupe potential and the double-well one with variable barrier width are considered. Thus, the present article substantially extends the scope of the potentials amenable to treatment by reducing the Smoluchowski equation to the confluent Heun one. The solution is uniformly valid for any barrier height. We use it for the calculation of the mean first passage time. The higher eigenvalues for the relaxation decay modes in the case of the ordinary Maier-Saupe potential are also calculated. The results obtained are in full agreement with those of the approach developed by Coffey, Kalmykov, Déjardin and their coauthors in the whole range of barrier heights.

  17. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    Science.gov (United States)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures serve as a non-verbal medium through which humans communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified using a fuzzy technique. Experimental results show that the use of features extracted from arm trajectories works effectively for the recognition of dynamic human gestures, and gives good performance in classifying various gesture patterns.

  18. The Probability Distribution Function of Column Density in Molecular Clouds

    CERN Document Server

    Vázquez-Semadeni, E; Vazquez-Semadeni, Enrique; Garcia, Nieves

    2001-01-01

    We discuss the probability distribution function (PDF) of column density resulting from density fields with lognormal PDFs, applicable to molecular clouds. For magnetic and non-magnetic numerical simulations of compressible, isothermal turbulence, we show that the density autocorrelation function (ACF) decays over short distances compared to the simulation size. The density "events" along a line of sight can be assumed to be independent over distances larger than this, and the Central Limit Theorem should be applicable. However, using random realizations of lognormal fields, we show that the convergence to a Gaussian shape is extremely slow in the high-density tail, and thus the column density PDF is not expected to exhibit a unique functional shape, but to transit instead from a lognormal to a Gaussian form as the column length increases, with decreasing variance. For intermediate path lengths, the column density PDF assumes a nearly exponential decay. For cases with density contrasts of $10^4$, comparable t...
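A minimal Monte Carlo sketch of the effect described (a toy setup of our own, not the authors' turbulence simulations): averaging independent lognormal density "events" along a line of sight drives the column density PDF toward a Gaussian by the Central Limit Theorem, but the skewness of the high-density tail decays slowly with the number of events:

```python
import numpy as np

rng = np.random.default_rng(1)

def column_density(n_cells, n_los=50_000, sigma=1.0):
    """Column density as the mean of n_cells independent lognormal
    'density events' along each of n_los lines of sight (<rho> = 1)."""
    rho = rng.lognormal(mean=-0.5 * sigma ** 2, sigma=sigma,
                        size=(n_los, n_cells))
    return rho.mean(axis=1)

def skewness(a):
    """Sample skewness; 0 for a Gaussian, large and positive for a lognormal."""
    z = (a - a.mean()) / a.std()
    return np.mean(z ** 3)

# Variance shrinks and skewness decays as the column gets longer, but the
# PDF stays visibly non-Gaussian for moderate column lengths.
for n_cells in (1, 10, 100):
    N = column_density(n_cells)
    print(n_cells, round(float(N.std()), 3), round(float(skewness(N)), 2))
```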

  19. Probability distributions for one component equations with multiplicative noise

    CERN Document Server

    Deutsch, J M

    1993-01-01

    Abstract: Systems described by equations involving both multiplicative and additive noise are common in nature. Examples include convection of a passive scalar field, polymers in turbulent flow, and noise in dye lasers. In this paper the one-component version of this problem is studied. The steady-state probability distribution is classified into two different types of behavior: one class has power-law tails, and the other has the form of an exponential of a power law. The value of the power-law exponent is determined analytically for models having colored Gaussian noise; it is found to depend only on the power spectrum of the noise at zero frequency. When non-Gaussian noise is considered, it is shown that stretched-exponential tails are possible. An intuitive understanding of the results is given, making use of the Lyapunov exponents of these systems.

  20. Seismic pulse propagation with constant Q and stable probability distributions

    Directory of Open Access Journals (Sweden)

    M. Tomirotti

    1997-06-01

    Full Text Available The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with an index of stability determined by the order of the fractional time derivative in the evolution equation.

  1. Characterizing the Lyman-$\alpha$ forest flux probability distribution function using Legendre polynomials

    CERN Document Server

    Cieplak, Agnieszka M

    2016-01-01

    The Lyman-$\\alpha$ forest is a highly non-linear field with a lot of information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polyonomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, $n$-th coefficient can be expressed as a linear combination of the first $n$ moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities.

  2. Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing

    Science.gov (United States)

    Rhim, Joong Bum; Varshney, Lav R.; Goyal, Vivek K.

    2012-09-01

    This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.

  3. Quantization of Prior Probabilities for Collaborative Distributed Hypothesis Testing

    CERN Document Server

    Rhim, Joong Bum; Goyal, Vivek K

    2011-01-01

    This paper studies the quantization of prior probabilities, drawn from an ensemble, for distributed detection and data fusion. Design and performance equivalences between a team of N agents tied by a fixed fusion rule and a more powerful single agent are obtained. Effects of identical quantization and diverse quantization are compared. Consideration of perceived common risk enables agents using diverse quantizers to collaborate in hypothesis testing, and it is proven that the minimum mean Bayes risk error is achieved by diverse quantization. The comparison shows that optimal diverse quantization with K cells per quantizer performs as well as optimal identical quantization with N(K-1)+1 cells per quantizer. Similar results are obtained for maximum Bayes risk error as the distortion criterion.

  4. Spin-Orbit angle distribution and the origin of (mis)aligned hot Jupiters

    CERN Document Server

    Crida, Aurélien

    2014-01-01

    For 61 transiting hot Jupiters, the projection of the angle between the orbital plane and the stellar equator (called the spin-orbit angle) has been measured. For about half of them, a significant misalignment is detected, and retrograde planets have been observed. This challenges scenarios of the formation of hot Jupiters. In order to better constrain formation models, we relate the distribution of the real spin-orbit angle $\\Psi$ to the projected one $\\beta$. Then, a comparison with the observations is relevant. We analyse the geometry of the problem to link analytically the projected angle $\\beta$ to the real spin-orbit angle $\\Psi$. The distribution of $\\Psi$ expected in various models is taken from the literature, or derived with a simplified model and Monte-Carlo simulations in the case of the disk-torquing mechanism. An easy formula to compute the probability density function (PDF) of $\\beta$ knowing the PDF of $\\Psi$ is provided. All models tested here look compatible with the observed distribution be...
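A simplified Monte Carlo of the projection geometry (our sketch, under the common simplification of an exactly edge-on transiting orbit): with the spin azimuth $\phi$ uniform around the orbital axis, the projection reduces to $\tan\beta = \tan\Psi\,\cos\phi$, so samples of $\beta$ can be drawn directly from any assumed distribution of $\Psi$:

```python
import numpy as np

rng = np.random.default_rng(3)

def projected_angle_samples(psi, n=100_000):
    """Sky-projected spin-orbit angle beta for true obliquity psi (radians),
    assuming an edge-on orbit and spin azimuth phi uniform in [0, 2*pi).
    The geometry reduces to tan(beta) = tan(psi) * cos(phi)."""
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    return np.abs(np.arctan2(np.sin(psi) * np.cos(phi), np.cos(psi)))

# Projection biases beta low: a true obliquity of 60 degrees typically
# projects to a smaller sky-projected angle.
beta = np.degrees(projected_angle_samples(np.radians(60.0)))
print(round(float(np.median(beta)), 1))
```

Convolving this mapping with a model PDF of $\Psi$ yields the PDF of $\beta$ that can be compared with the observed distribution.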

  5. Insights from probability distribution functions of intensity maps

    CERN Document Server

    Breysse, Patrick C; Behroozi, Peter S; Dai, Liang; Kamionkowski, Marc

    2016-01-01

    In the next few years, intensity-mapping surveys that target lines such as CO, Ly$\\alpha$, and CII stand to provide powerful probes of high-redshift astrophysics. However, these line emissions are highly non-Gaussian, and so the typical power-spectrum methods used to study these maps will leave out a significant amount of information. We propose a new statistic, the probability distribution of voxel intensities, which can access this extra information. Using a model of a CO intensity map at $z\\sim3$ as an example, we demonstrate that this voxel intensity distribution (VID) provides substantial constraining power beyond what is obtainable from the power spectrum alone. We find that a future survey similar to the planned COMAP Full experiment could constrain the CO luminosity function to order $\\sim10\\%$. We also explore the effects of contamination from continuum emission, interloper lines, and gravitational lensing on our constraints and find that the VID statistic retains significant constraining power even ...

  6. Characterization of cast metals with probability distribution functions

    International Nuclear Information System (INIS)

    Characterization of microstructure using a probability distribution function (PDF) provides a means for extracting useful information about material properties. In the extension of classical PDF methods developed in the research, material characteristics are evolved by propagating an initial PDF through time, using growth laws derived from consideration of heat flow and species diffusion, constrained by the Gibbs-Thomson law. A model is described here that allows for nucleation, followed by growth of nominally spherical grains according to a stable or unstable growth law. Results are presented for the final average grain size as a function of cooling rate for various nucleation parameters. In particular the authors show that the model describes linear variation of final grain size with the inverse cube root of cooling rate. Within a subset of casting parameters, the stable-to-unstable transition manifests itself as a bimodal distribution of final grain size. Calculations with the model are described for the liquid to epsilon phase transition in a plutonium 1 weight percent gallium alloy

  7. The Probability Distribution for Non-Gaussianity Estimators

    CERN Document Server

    Smith, Tristan L; Wandelt, Benjamin D

    2011-01-01

    One of the principal efforts in cosmic microwave background (CMB) research is measurement of the parameter fnl that quantifies the departure from Gaussianity in a large class of non-minimal inflationary (and other) models. Estimators for fnl are composed of a sum of products of the temperatures in three different pixels in the CMB map. Since the number ~Npix^2 of terms in this sum exceeds the number Npix of measurements, these ~Npix^2 terms cannot be statistically independent. Therefore, the central-limit theorem does not necessarily apply, and the probability distribution function (PDF) for the fnl estimator does not necessarily approach a Gaussian distribution for N_pix >> 1. Although the variance of the estimators is known, the significance of a measurement of fnl depends on knowledge of the full shape of its PDF. Here we use Monte Carlo realizations of CMB maps to determine the PDF for two minimum-variance estimators: the standard estimator, constructed under the null hypothesis (fnl=0), and an improved e...

  8. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.; Næss, S. K.; Seljebotn, D. S. [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, N-0315 Oslo (Norway); Górski, K. M.; Huey, G.; Jewell, J. B.; Rocha, G.; Wehus, I. K., E-mail: eirik.gjerlow@astro.uio.no [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States)

    2013-11-10

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl{sub C}, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region is negligible in terms of cosmological parameters for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
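The factorization at the heart of this construction can be written out explicitly: if each variable depends conditionally only on its immediate predecessor (a Markov chain), the joint distribution reduces to uni- and bivariate marginals (our notation, not the paper's):

```latex
p(x_1,\dots,x_n)
  \;=\; p(x_1)\prod_{i=2}^{n} p(x_i \mid x_{i-1})
  \;=\; \frac{\prod_{i=2}^{n} p(x_{i-1}, x_i)}{\prod_{i=2}^{n-1} p(x_i)}
```

The second equality follows by writing each conditional as $p(x_i \mid x_{i-1}) = p(x_{i-1}, x_i)/p(x_{i-1})$ and cancelling $p(x_1)$ against the first denominator factor.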

  9. Prediction of optimum section pitch angle distribution along wind turbine blades

    International Nuclear Information System (INIS)

    Highlights: ► Prediction of optimum pitch angle along wind turbine blades. ► Maximum electrical power extraction at the installation site. ► Solving BEM equations with the probability distribution function of wind speed at an installation site. - Abstract: In this paper, the boost in electrical energy production of horizontal-axis wind turbines with fixed rotor speed is studied. To achieve this, a new innovative algorithm is proposed and justified to predict a distribution of section pitch angle along wind turbine blades that corresponds to the maximum power extraction at the installation site. A code is developed based on the blade element momentum theory which incorporates different corrections such as the tip loss correction. This aerodynamic code is capable of accurately predicting the aerodynamics of horizontal-axis wind turbines

  10. Performance Probability Distributions for Sediment Control Best Management Practices

    Science.gov (United States)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly-used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from >85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with

  11. Tools for Bramwell-Holdsworth-Pinton Probability Distribution

    OpenAIRE

    Mirela Danubianu; Tiberiu Socaciu; Ioan Maxim

    2009-01-01

    This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.

  12. Compound Outage Probability and Capacity of a Class of Fading MIMO Channels with Channel Distribution Uncertainty

    CERN Document Server

    Ioannou, Ioanna; Loyka, Sergey

    2011-01-01

    Outage probability and capacity of a class of block-fading MIMO channels are considered with partial channel distribution information. Specifically, the channel or its distribution is not known, but the latter is known to belong to a class of distributions where each member is within a certain distance (uncertainty) from a nominal distribution. Relative entropy is used as a measure of distance between distributions. Compound outage probability, defined as min (over the transmit signal distribution)-max (over the channel distribution class) outage probability, is introduced and investigated. This generalizes the standard outage probability to the case of partial channel distribution information. Compound outage probability characterization (via one-dimensional convex optimization), its properties and approximations are given. It is shown to have two-regime behavior: when the nominal outage probability decreases (e.g. by increasing the SNR), the compound outage first decreases linearly down to a certain threshol...

  13. Tools for Bramwell-Holdsworth-Pinton Probability Distribution

    Directory of Open Access Journals (Sweden)

    Mirela Danubianu

    2009-01-01

    Full Text Available This paper is a synthesis of a range of papers presented at various conferences related to the Bramwell-Holdsworth-Pinton distribution. S. T. Bramwell, P. C. W. Holdsworth, and J. F. Pinton introduced a new non-parametric distribution (called BHP) after studying some magnetization problems in 2D. The probability density function of the distribution can be approximated as a modified GFT (Gumbel-Fisher-Tippett) distribution.

  14. Investigation of Probability Distributions Using Dice Rolling Simulation

    Science.gov (United States)

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…

  15. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    Science.gov (United States)

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  16. APL: An angle probability list to improve knowledge-based metaheuristics for the three-dimensional protein structure prediction.

    Science.gov (United States)

    Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio

    2015-12-01

    Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein only from its amino acid sequence remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. Results achieved suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing its conformational search space. PMID:26495908

  17. Using projections and correlations to approximate probability distributions

    CERN Document Server

    Karlen, D A

    1998-01-01

    A method to approximate continuous multi-dimensional probability density functions (PDFs) using their projections and correlations is described. The method is particularly useful for event classification when estimates of systematic uncertainties are required and for the application of an unbinned maximum likelihood analysis when an analytic model is not available. A simple goodness of fit test of the approximation can be used, and simulated event samples that follow the approximate PDFs can be efficiently generated. The source code for a FORTRAN-77 implementation of this method is available.

  18. Probability distributions for measures of placental shape and morphology

    International Nuclear Information System (INIS)

    Birthweight at delivery is a standard cumulative measure of placental growth, but is a crude summary of other placental characteristics, such as, e.g., the chorionic plate size, and the shape and position of the umbilical cord insertion. Distributions of such measures across a cohort reveal information about the developmental history of the chorionic plate which is unavailable from an analysis based solely on the mean and standard deviation. Various measures were determined from digitized images of chorionic plates obtained from the pregnancy, infection, and nutrition study, a prospective cohort study of preterm birth in central North Carolina between 2002 and 2004. Centroids (geometric centers) and umbilical cord insertions were taken directly from the images. Chorionic plate outlines were obtained from an interpolation based on a Fourier series, while eccentricity (of the best-fit ellipse), skewness, and kurtosis were determined from the method of moments. Histograms of each variable were compared against the normal, lognormal, and Lévy distributions. Only a single measure (eccentricity) followed a normal distribution. All others followed lognormal or ‘heavy-tailed’ distributions for moderate to extreme deviations from the mean, where the relative likelihood far exceeded those of a normal distribution. (paper)

  19. Beta-hypergeometric probability distribution on symmetric matrices

    OpenAIRE

    Hassairi, Abdelhamid; Masmoudi, Mouna

    2012-01-01

    Some remarkable properties of the beta distribution are based on relations involving independence between beta random variables such that a parameter of one of them is the sum of the parameters of another (see (1.1) and (1.2) below). Asci, Letac and Piccioni \cite{6} have used the real beta-hypergeometric distribution on $\mathbb{R}$ to give a general version of these properties without the condition on the parameters. In the present paper, we extend the properties of the real beta to the beta...

  20. Scaling Properties of the Probability Distribution of Lattice Gribov Copies

    CERN Document Server

    Lokhov, A Y; Roiesnel, C

    2005-01-01

    We study the problem of the Landau gauge fixing in the case of the SU(2) lattice gauge theory. We show that the probability to find a lattice Gribov copy increases considerably when the physical size of the lattice exceeds some critical value $\\approx2.75/\\sqrt{\\sigma}$, almost independent of the lattice spacing. The impact of the choice of the copy on Green functions is presented. We confirm that the ghost propagator depends on the choice of the copy whereas the gluon propagator is insensitive to it (within present statistical errors). The gluonic three-point functions are also insensitive to it. Finally we show that gauge copies which have the same value of the minimisation functional ($\\int d^4x (A^a_\\mu)^2$) are equivalent, up to a global gauge transformation, and yield the same Green functions.

  1. Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals

    Indian Academy of Sciences (India)

    K R Parthasarathy

    2007-11-01

    By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.

  2. Variation in the probability distribution function of passive contaminant concentration

    Czech Academy of Sciences Publication Activity Database

    Jurčáková, Klára; Riveron, F.

    Praha : Ústav termomechaniky AV ČR v. v. i., 2009 - (Jonáš, P.; Uruba, V.), s. 19-20 ISBN 978-80-87012-21-5. [Colloquium Fluid Dynamics 2009. Praha (CZ), 21.10.2009-23.10.2009] Institutional research plan: CEZ:AV0Z20760514 Keywords : temporal variation of concentrations * atmospheric boundary layer * Weibull distribution Subject RIV: DG - Athmosphere Sciences, Meteorology

  3. Comparision of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Science.gov (United States)

    Yilmaz, Şeyda; Bayrak, Erdem; Bayrak, Yusuf

    2016-04-01

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the time period 1900-2015 with magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Fréchet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was the most suitable of the three distribution methods in this region.
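A hedged sketch of the fit-then-test workflow described (SciPy in place of Easyfit/Matlab, and synthetic data standing in for the actual inter-event catalogue; the parameter values below are made up):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical inter-event times (years); synthetic Weibull data stands in
# for a real homogeneous earthquake catalogue.
times = stats.weibull_min.rvs(c=1.5, scale=10.0, size=500, random_state=rng)

# Two-parameter Weibull fit (location pinned at 0); a three-parameter fit
# would simply drop the floc constraint.
shape, loc, scale = stats.weibull_min.fit(times, floc=0.0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution;
# a large p-value means the Weibull model is not rejected.
ks_stat, p_value = stats.kstest(times, "weibull_min", args=(shape, loc, scale))
print(round(shape, 2), round(scale, 2), round(ks_stat, 3))
```

Repeating the same fit for the Fréchet (`stats.invweibull`) and three-parameter Weibull models and comparing K-S statistics mirrors the model comparison in the study.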

  4. Disoriented Chiral Condensates, Pion Probability Distributions and Parallels with Disordered System

    OpenAIRE

    Mekjian, A. Z.

    1999-01-01

    A general expression is discussed for pion probability distributions coming from relativistic heavy ion collisions. The general expression contains as limits: 1) The disoriented chiral condensate (DCC), 2) the negative binomial distribution and Pearson type III distribution, 3) a binomial or Gaussian result, 4) and a Poisson distribution. This general expression approximates other distributions such as a signal to noise laser distribution. Similarities and differences of the DCC distribution ...

  5. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions, and to this end we provide a calibration procedure which works for the approximation of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite...

  6. The probability distribution of the predicted CFM-induced ozone depletion. [Chlorofluoromethane

    Science.gov (United States)

    Ehhalt, D. H.; Chang, J. S.; Butler, D. M.

    1979-01-01

    It is argued from the central limit theorem that the uncertainty in model-predicted changes of the ozone column density is best represented by a normal probability density distribution. This conclusion is validated by comparison with a probability distribution generated by a Monte Carlo technique. In the case of the CFM-induced ozone depletion, and based on the estimated uncertainties in the reaction rate coefficients alone, the relative mean standard deviation of this normal distribution is estimated to be 0.29.

  7. Optimal probabilistic cloning of two linearly independent states with arbitrary probability distribution

    Science.gov (United States)

    Zhang, Wen; Rui, Pinshu; Zhang, Ziyun; Liao, Yanlin

    2016-02-01

    We investigate the probabilistic quantum cloning (PQC) of two states with arbitrary probability distribution. The optimal success probabilities are worked out for 1→ 2 PQC of the two states. The results show that the upper bound on the success probabilities of PQC in Qiu (J Phys A 35:6931-6937, 2002) cannot be reached in general. With the optimal success probabilities, we design simple forms of 1→ 2 PQC and work out the unitary transformation needed in the PQC processes. The optimal success probabilities for 1→ 2 PQC are also generalized to the M→ N PQC case.

  8. Average Consensus Analysis of Distributed Inference with Uncertain Markovian Transition Probability

    OpenAIRE

    Won Il Kim; Rong Xiong; Qiuguo Zhu; Jun Wu

    2013-01-01

    The average consensus problem of distributed inference in a wireless sensor network under Markovian communication topology of uncertain transition probability is studied. A sufficient condition for average consensus of linear distributed inference algorithm is presented. Based on linear matrix inequalities and numerical optimization, a design method of fast distributed inference is provided.

  9. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are constructed to quantify the mutual divergence among two or more probability distributions.

  10. Noise figure and photon probability distribution in Coherent Anti-Stokes Raman Scattering (CARS)

    OpenAIRE

    Dimitropoulos, D.; Solli, D. R.; Claps, R.; Jalali, B.

    2006-01-01

    The noise figure and photon probability distribution are calculated for coherent anti-Stokes Raman scattering (CARS), where an anti-Stokes signal is converted to Stokes. We find that the minimum noise figure is ~3 dB.

  11. Simultaneous distribution between the deflection angle and the lateral displacement under the Moliere theory of multiple scattering

    International Nuclear Information System (INIS)

    Simultaneous distribution between the deflection angle and the lateral displacement of fast charged particles traversing matter is derived by applying numerical inverse Fourier transforms to the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show the simultaneous Gaussian distribution in the region of both small deflection angle and small lateral displacement, though they show the characteristic contour patterns of probability density specific to single and double scattering in the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed-energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and the efficiency of experimental analyses and simulation studies relating to charged-particle transport. (orig.)

  12. Simultaneous distribution between the deflection angle and the lateral displacement under the Moliere theory of multiple scattering

    Energy Technology Data Exchange (ETDEWEB)

    Nakatsuka, Takao [Okayama Shoka University, Laboratory of Information Science, Okayama (Japan); Okei, Kazuhide [Kawasaki Medical School, Dept. of Information Sciences, Kurashiki (Japan); Iyono, Atsushi [Okayama university of Science, Dept. of Fundamental Science, Faculty of Science, Okayama (Japan); Bielajew, Alex F. [Univ. of Michigan, Dept. Nuclear Engineering and Radiological Sciences, Ann Arbor, MI (United States)

    2015-12-15

    Simultaneous distribution between the deflection angle and the lateral displacement of fast charged particles traversing matter is derived by applying numerical inverse Fourier transforms to the Fourier spectral density solved analytically under the Moliere theory of multiple scattering, taking account of ionization loss. Our results show the simultaneous Gaussian distribution in the region of both small deflection angle and small lateral displacement, though they show the characteristic contour patterns of probability density specific to single and double scattering in the regions of large deflection angle and/or lateral displacement. The influences of ionization loss on the distribution are also investigated. An exact simultaneous distribution is derived under the fixed-energy condition based on a well-known model of screened single scattering, which indicates the limit of validity of the Moliere theory applied to the simultaneous distribution. The simultaneous distribution will be valuable for improving the accuracy and the efficiency of experimental analyses and simulation studies relating to charged-particle transport. (orig.)

  13. Exploration of probability distribution of velocities of saltating sand particles based on the stochastic particle-bed collisions

    International Nuclear Information System (INIS)

    Wind-blown sand saltation mainly comprises two mechanical processes: the interaction between the moving sand particles and the wind in the saltation layer, and the collisions of incident particles with the sand bed; the latter produces the lift-off velocity of a sand particle moving into saltation. In this Letter a phenomenological methodology is presented to obtain the probability density (distribution) function (pdf) of the lift-off velocity of sand particles from the sand bed, based on stochastic particle-bed collisions. Treating the sand particles as uniform circular disks and employing a 2D collision between an incident particle and the granular bed, we obtain analytical formulas for the lift-off velocity of ejected and rebound particles in saltation, which are functions of random parameters such as the angle and magnitude of the incident velocity of the impacting particles, the impact and contact angles between the colliding particles, and the creeping velocity of sand particles. By introducing the probability density functions (pdf's) of these parameters for all possible patterns of the sand bed and all possible particle-bed collisions, and using the standard calculus of pdf's of multi-dimensional random variables, the pdf's of the lift-off velocities are deduced and expressed in terms of the pdf's of the random parameters in the collisions. The numerical results for the distributions of lift-off velocities agree well with experimental ones.

  14. Inclusion of organ movements in IMRT treatment planning via inverse planning based on probability distributions

    International Nuclear Information System (INIS)

    In this paper, we investigate an off-line strategy to incorporate inter-fraction organ motion in IMRT treatment planning. It has been suggested that inverse planning could be based on a probability distribution of patient geometries instead of a single snapshot. However, this concept involves two intrinsic problems: first, the probability distribution has to be estimated from only a few images; and second, the distribution is only sparsely sampled over the treatment course due to the finite number of fractions. In the current work, we develop new concepts of inverse planning which account for these two problems

  15. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    Science.gov (United States)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20 km in diameter and larger significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies of the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo-era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (using simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high-resolution, high-precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as small as 6 meters, allowing unprecedented multi-scale (multiple-baseline) modeling of slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was

  16. Solar proton pitch angle distribution for the January 24, 1969 event

    International Nuclear Information System (INIS)

    Pitch angle distributions during the highly anisotropic phase of the event are fitted by a polynomial in cosmic pitch angle, μ, and the results are compared with the predictions of a Fokker-Planck equation in μ space for quasi-steady injection. Implications for the theory of the diffusion coefficient D(μ) are discussed. (orig.)

  17. Void Probability Enhanced Multiplicity Distribution of Produced Hadrons in p-p Collision at Lhc Energies

    Science.gov (United States)

    Dutta, S.; Chan, A. H.; Oh, C. H.

    2012-08-01

    This paper studies the multiplicity distribution of hadrons produced in p-p collisions at 0.9 and 2.36 TeV using ALICE as a detector. The multiplicity distributions exhibit enhanced void probability and are also found to satisfy the void probability scaling. The scaling of χ with \bar n\bar k2 is studied using the generalized hypergeometric model. The variation of the parameter "a" of the hypergeometric model with energy and type of events is also studied. The parameter "a" distinguishes between various theoretical models, e.g. Lorentz/Catalan, negative binomial, and geometric distributions. Finally a comparison is made with p-\bar p collisions at 200, 546 and 900 GeV. It is observed, both for the p-p and p-\bar p data, that the value of "a" decreases with increasing collision energy and approaches the upper bound of the void probability scaling, i.e. the NB model.
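
    The negative binomial (NB) upper bound of the void probability scaling mentioned above has a simple closed form, P(0) = (1 + nbar/k)^(-k). A minimal sketch (parameter values illustrative) checking it against SciPy's negative binomial pmf:

```python
from scipy.stats import nbinom

# Negative binomial multiplicity distribution parameterized by the mean
# multiplicity nbar and the clustering parameter k (values illustrative).
nbar, k = 4.0, 2.0
p = k / (k + nbar)                      # SciPy's success-probability form

# Void probability: the chance of observing zero produced particles.
void_closed_form = (1.0 + nbar / k) ** (-k)
void_from_pmf = float(nbinom.pmf(0, k, p))

print(void_closed_form, void_from_pmf)
```

    For nbar = 4 and k = 2 both expressions give 1/9, so the closed form and the library pmf agree.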

  18. Ruin Probability and Joint Distributions of Some Actuarial Random Vectors in the Compound Pascal Model

    Institute of Scientific and Technical Information of China (English)

    Xian-min Geng; Shu-chen Wan

    2011-01-01

    The compound negative binomial model, introduced in this paper, is a discrete-time version. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of T, U(T - 1), |U(T)| and inf_{0≤n<T_1} U(n) (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin, and the maximal deficit from ruin to recovery), and the distributions of some actuarial random vectors.
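
    The finite-horizon ruin probability of a discrete-time surplus process can also be explored by simulation. The sketch below is a hedged illustration, not the paper's compound Pascal model: a constant premium per period and geometric claim sizes are assumed for simplicity.

```python
import numpy as np

rng = np.random.default_rng(1)

def ruin_probability(u0, premium, horizon, trials, claim_p=0.5):
    """Monte Carlo estimate of the finite-horizon ruin probability of
    U(n) = u0 + premium * n - S(n), where each period's claim is a
    geometric variate minus one (an illustrative choice, not the
    paper's model)."""
    ruined = 0
    for _ in range(trials):
        surplus = u0
        for _ in range(horizon):
            surplus += premium - (rng.geometric(claim_p) - 1)
            if surplus < 0:          # ruin: surplus drops below zero
                ruined += 1
                break
    return ruined / trials

p_ruin = ruin_probability(u0=5, premium=1.0, horizon=50, trials=2000)
print(f"estimated finite-horizon ruin probability: {p_ruin:.3f}")
```

    With claim_p = 0.5 the expected claim equals the premium, a zero-drift case in which ruin within the horizon is neither rare nor certain.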

  19. Probability distributions for directed polymers in random media with correlated noise.

    Science.gov (United States)

    Chu, Sherry; Kardar, Mehran

    2016-07-01

    The probability distribution for the free energy of directed polymers in random media (DPRM) with uncorrelated noise in d=1+1 dimensions satisfies the Tracy-Widom distribution. We inquire if and how this universal distribution is modified in the presence of spatially correlated noise. The width of the distribution scales as the DPRM length to an exponent β, in good (but not full) agreement with previous renormalization group and numerical results. The scaled probability is well described by the Tracy-Widom form for uncorrelated noise, but becomes symmetric with increasing correlation exponent. We thus find a class of distributions that continuously interpolates between Tracy-Widom and Gaussian forms. PMID:27575059

  20. Fitting the distribution of dry and wet spells with alternative probability models

    Science.gov (United States)

    Deni, Sayang Mohd; Jemain, Abdul Aziz

    2009-06-01

    The development of rainfall occurrence models is of great importance not only for data-generation purposes, but also as an informative resource for future advances in water-related sectors, such as water resource management and the hydrological and agricultural sectors. Various probability models have been introduced for sequences of dry (wet) days by previous researchers in the field. Building on these, the present study proposes three types of mixture distributions, namely, the mixture of two log series distributions (LSD), the mixture of the log series and Poisson distributions (MLPD), and the mixture of the log series and geometric distributions (MLGD), as alternative probability models to describe the distribution of dry (wet) spells in daily rainfall events. To test the performance of the proposed new models against nine existing probability models, 54 data sets published by several authors were reanalyzed in this study, together with new data sets of daily observations from six selected rainfall stations in Peninsular Malaysia for the period 1975-2004. A chi-square goodness-of-fit test was used to determine the best-fitting distribution for the observed distribution of dry (wet) spells. The results revealed that the newly proposed MLGD and MLPD showed a better fit, as more than half of the data sets were successfully fitted by these distributions. However, existing models, such as the truncated negative binomial and the modified LSD, were also among the successful probability models for representing sequences of dry (wet) days in daily rainfall occurrence.
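
    The chi-square goodness-of-fit step used in such studies can be sketched for the simplest competitor, a geometric distribution of spell lengths. The synthetic data, the maximum-likelihood fit p = 1/mean, and the binning below are illustrative assumptions, not the study's station records or mixture models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic dry-spell lengths in days (stand-in for station data).
spells = rng.geometric(p=0.3, size=500)

# Maximum-likelihood fit of the geometric model: p_hat = 1 / mean length.
p_hat = 1.0 / spells.mean()

# Observed and expected counts for lengths 1..9 plus a pooled tail bin,
# so the expected counts stay large enough for the chi-square test.
max_len = 9
observed = np.array([(spells == j).sum() for j in range(1, max_len + 1)]
                    + [(spells > max_len).sum()])
expected = np.array([stats.geom.pmf(j, p_hat) for j in range(1, max_len + 1)]
                    + [stats.geom.sf(max_len, p_hat)]) * spells.size

# One parameter was estimated from the data, hence ddof=1.
chi2, _ = stats.chisquare(observed, expected, ddof=1)
print(f"p_hat = {p_hat:.3f}, chi-square = {chi2:.2f}")
```

    The same observed-versus-expected comparison applies to the mixture models in the record, with ddof raised to match the number of fitted parameters.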

  1. LAGRANGE MULTIPLIERS IN THE PROBABILITY DISTRIBUTIONS ELICITATION PROBLEM: AN APPLICATION TO THE 2013 FIFA CONFEDERATIONS CUP

    Directory of Open Access Journals (Sweden)

    Diogo de Carvalho Bezerra

    2015-12-01

    Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.

  2. Probability distribution of the order parameter in the directed percolation universality class.

    Science.gov (United States)

    Martins, P H L

    2012-04-01

    The probability distributions of the order parameter for two models in the directed percolation universality class were evaluated. Monte Carlo simulations have been performed for the one-dimensional generalized contact process and the Domany-Kinzel cellular automaton. In both cases, the density of active sites was chosen as the order parameter. The criticality of those models was obtained by solely using the corresponding probability distribution function. It has been shown that the present method, which has been successfully employed in treating equilibrium systems, is indeed also useful in the study of nonequilibrium phase transitions. PMID:22680423

  3. Rank-Ordered Multifractal Analysis of Probability Distributions in Fluid Turbulence

    Science.gov (United States)

    Wu, Cheng-Chin; Chang, Tien

    2015-11-01

    Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a refined method of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  4. Electron fluxes and pitch-angle distributions at dipolarization fronts: THEMIS multipoint observations

    Science.gov (United States)

    Runov, A.; Angelopoulos, V.; Gabrielse, C.; Zhou, X.-Z.; Turner, D.; Plaschke, F.

    2013-02-01

    Taking advantage of multipoint observations from a Cluster-like Time History of Events and Macroscale Interactions during Substorms (THEMIS) probe configuration repeated in three events, we study pitch-angle distributions (PAD) of lower energy (0.2-keV) electrons and omnidirectional energy-time spectrograms of higher energy (30-500 keV) electrons observed at and near dipolarization fronts in the plasma sheet. Recent observations have shown that dipolarization fronts in the plasma sheet provide an impulsive electric field suggested to cause electron energization and dispersionless injections. Increases and decreases in energetic electron flux are equally probable at the fronts, however. Our case studies demonstrate increased energetic electron flux in the front's central region but decreased flux on its dusk side, where the diverted plasma flow forms a vortex. An electric field associated with this vortex causes the electron flux decrease. We also find that shorter-term energetic flux decreases, often observed before injections, coincide with a dip in the northward magnetic field ahead of the front. We attribute these decreases to particle energy loss via the inverse betatron effect. Our case studies reveal that pancake-type (maximum at 90° pitch angle) and cigar-type (maxima at 0° and 180°) PADs coexist at the same front. Our data analysis suggests that energetic electron PADs are mainly pancake type near the neutral sheet (small |Bx|) but cigar type at |Bx| > 10 nT. These results, to be confirmed in statistical studies, provide important constraints for further modeling of electron energization and transport toward the inner magnetosphere.

  5. Influence of blade angle distribution along leading edge on cavitation performance of a centrifugal pump

    Science.gov (United States)

    Xu, Y.; Tan, L.; Cao, S. L.; Wang, Y. C.; Meng, G.; Qu, W. S.

    2015-01-01

    The influence of the blade angle distribution along the leading edge on the cavitation performance of centrifugal pumps is analysed in the present paper. Three sets of blade angle distributions along the leading edge for three blade inlet angles are chosen to design nine centrifugal pump impellers. The RNG k-epsilon turbulence model and the Zwart-Gerber-Belamri cavitation model are employed to simulate the cavitation flows in centrifugal pumps with the different impellers and the same volute. The numerical results are compared with the experimental data, and the comparison proves that the numerical simulation can accurately predict the cavitation performance of centrifugal pumps. On the basis of the numerical simulations, the pump head variations with pump inlet pressure and the flow details in the centrifugal pumps are revealed to demonstrate the influence of the blade angle distribution along the leading edge on the cavitation performance of centrifugal pumps.

  6. Importance measures for imprecise probability distributions and their sparse grid solutions

    Institute of Scientific and Technical Information of China (English)

    WANG; Pan; LU; ZhenZhou; CHENG; Lei

    2013-01-01

    For the imprecise probability distribution of a structural system, the variance-based importance measures (IMs) of the inputs are investigated, and three IMs are defined for the cases of random distribution parameters, interval distribution parameters, and a mixture of those two types of distribution parameters. The defined IMs reflect the influence of the inputs on the output of a structural system with imprecise distribution parameters. Due to the large computational cost of the variance-based IMs, the sparse grid method is employed in this work to compute them at each reference point of the distribution parameters. For the three imprecise distribution parameter cases, the sparse grid method and its combination with a genetic algorithm are used to compute the defined IMs. Numerical and engineering examples are employed to demonstrate the rationality of the defined IMs and the efficiency of the applied methods.

  7. Analysis of low probability of intercept (LPI) radar signals using the Wigner Distribution

    OpenAIRE

    Gau, Jen-Yu

    2002-01-01

    Approved for public release; distribution is unlimited. The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code,...

  8. The probability distribution function for the sum of squares of independent random variables

    Science.gov (United States)

    Fateev, Yury; Dmitriev, Dmitry; Tyapkin, Valery; Kremez, Nikolai; Shaidurov, Vladimir

    2016-08-01

    In the present paper, the probability distribution function is derived for the sum of squares of random variables with nonzero expectations. This distribution function enables one to develop an efficient one-step algorithm for phase ambiguity resolution when determining spatial orientation from the signals of satellite radio-navigation systems. Threshold values for rejecting false solutions and statistical properties of the algorithm are obtained.
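
    For unit-variance normal variables this distribution is the noncentral chi-square, with noncentrality equal to the sum of the squared expectations. A minimal sketch (illustrative means assumed) checking the exact mean d + λ empirically:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sum of squares of d independent N(mu_i, 1) variables follows a
# noncentral chi-square distribution with d degrees of freedom and
# noncentrality lam = sum(mu_i**2); its mean is d + lam.
mu = np.array([0.5, -1.0, 2.0])        # illustrative nonzero expectations
d, lam = mu.size, float(np.sum(mu**2))

samples = rng.normal(mu, 1.0, size=(200_000, d))
sum_sq = (samples**2).sum(axis=1)

print(f"empirical mean = {sum_sq.mean():.3f}, exact mean = {d + lam:.3f}")
```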

  9. Various Models for Pion Probability Distributions from Heavy-Ion Collisions

    OpenAIRE

    Mekjian, A. Z.; Schlei, B. R.; Strottman, D.

    1998-01-01

    Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be applied to other cases, which are also mentioned. The pion probability distributions for these various cases are compared. Comparison of the pion laser model and Bos...

  10. The exact probability distribution of the rank product statistics for replicated experiments

    OpenAIRE

    Eisinga, R.N.; Breitling, R.; Heskes, T.M.

    2013-01-01

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product ...

  11. A note on the best invariant estimation of continuous probability distributions under mean square loss

    OpenAIRE

    Schürmann, Thomas

    2015-01-01

    We consider the nonparametric estimation problem of continuous probability distribution functions. For the integrated mean square error we provide the statistic corresponding to the best invariant estimator proposed by Aggarwal (1955) and Ferguson (1967). The table of critical values is computed and a numerical power comparison of the statistic with the traditional Cramér-von Mises statistic is done for several representative distributions.
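
    The Cramér-von Mises statistic used as the comparison baseline is available directly in SciPy; a minimal sketch (the sample and the fully specified reference distribution are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Goodness of fit of a sample to a fully specified continuous
# distribution function (here the standard normal).
sample = rng.normal(0.0, 1.0, size=200)
res = stats.cramervonmises(sample, "norm")

print(f"W2 = {res.statistic:.4f}, p = {res.pvalue:.3f}")
```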

  12. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which t

  13. Measurements of gas hydrate formation probability distributions on a quasi-free water droplet

    Science.gov (United States)

    Maeda, Nobuo

    2014-06-01

    A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA, gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. Ideally one would measure the gas hydrate formation probability distributions of a single water droplet or mist freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, wall-wetting hydrophobic liquid, to avoid contact between the water droplet and the solid walls. Here we report the development of a second-generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet sitting on a perfluorocarbon oil in a container coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. The gas hydrate formation probability distributions of such a quasi-free water droplet were found to be significantly lower than those of water in a glass sample cell.

  14. Calculation of the Multivariate Probability Distribution Function Values and their Gradient Vectors

    OpenAIRE

    Szantai, T.

    1987-01-01

    The described collection of subroutines, developed for the calculation of values of multivariate normal, Dirichlet and gamma distribution functions and their gradient vectors, is a unique tool that can be used, e.g., to compute the Loss-of-Load Probability of electric networks and to solve optimization problems with a reliability constraint.

  15. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    LU Wei-ji; CUI Wei

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value and the other kind is proved to have no extreme values.

  16. Criticality of the net-baryon number probability distribution at finite density

    OpenAIRE

    Kenji Morita; Bengt Friman; Krzysztof Redlich

    2014-01-01

    We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\\mu/T

  17. Criticality of the net-baryon number probability distribution at finite density

    OpenAIRE

    Morita, Kenji; Friman, Bengt; Redlich, Krzysztof

    2015-01-01

    We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ , at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T

  18. Simulations of Seasonal and Latitudinal Variations in Leaf Inclination Angle Distribution: Implications for Remote Sensing

    Science.gov (United States)

    Huemmrich, Karl F.

    2013-01-01

    The leaf inclination angle distribution (LAD) is an important characteristic of vegetation canopy structure affecting light interception within the canopy. However, LADs are difficult and time-consuming to measure. To examine possible global patterns of LAD and their implications for remote sensing, a model was developed to predict leaf angles within canopies. Canopies were simulated using the SAIL radiative transfer model combined with a simple photosynthesis model. This model calculated leaf inclination angles for horizontal layers of leaves within the canopy by choosing the leaf inclination angle that maximized production over a day in each layer. LADs were calculated for five latitude bands for spring and summer solar declinations. Three distinct LAD types emerged: tropical, boreal, and an intermediate temperate distribution. In the tropical LAD, the upper layers have a leaf angle around 35°, with the lower layers having horizontal inclination angles, while the boreal LAD has vertical leaf inclination angles throughout the canopy. The latitude bands where each LAD type occurred changed with the seasons. The different LADs affected the fraction of absorbed photosynthetically active radiation (fAPAR) and the Normalized Difference Vegetation Index (NDVI), with similar relationships between fAPAR and leaf area index (LAI) but different relationships between NDVI and LAI for the different LAD types. These differences resulted in significantly different relationships between NDVI and fAPAR for each LAD type. Since leaf inclination angles affect light interception, variations in LAD also affect the estimation of leaf area based on the transmittance of light or lidar returns.

  19. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices, which are error-prone objects; error detection codes allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter makes it possible to analyze the behavior of the error-correcting code in the case of error injection into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes: encoding functions with lower computational complexity and a low probability of masking provide the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions of greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking

  20. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
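The two-step transform described above (color a white Gaussian sequence, then map it through the Gaussian CDF and the inverse CDF of the target law) can be sketched as follows. Everything concrete here is an illustrative assumption, not the paper's method: the paper works in two dimensions with an FFT filter matched to a target spectrum, whereas this sketch uses a one-dimensional moving-average coloring filter and a unit-mean exponential target marginal.

```python
import math, random

random.seed(1)

def gaussian_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Step 1: white Gaussian noise
n = 10000
white = [random.gauss(0.0, 1.0) for _ in range(n)]

# Step 2: impose correlation (a simple moving-average "coloring" filter here;
# the paper uses an FFT filter matched to the target power spectrum)
k = 5
colored = [sum(white[i:i + k]) / math.sqrt(k) for i in range(n - k)]

# Step 3: memoryless transform Gaussian -> target marginal (exponential, mean 1):
# u = Phi(g), then x = -ln(1 - u) is the inverse CDF of the exponential law
field = [-math.log(1.0 - gaussian_cdf(g) + 1e-15) for g in colored]

mean = sum(field) / len(field)
print(round(mean, 2))  # should be close to 1 for a unit-mean exponential
```

The marginal transform preserves the rank structure of the colored field, so the correlation imposed in step 2 survives (approximately) into the non-Gaussian output.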

  1. Pitch angle distribution measurements of an electron gun for electron transport experiments in stellarators

    International Nuclear Information System (INIS)

    The pitch angle distributions of a mesh-type electron gun for electron transport experiments in stellarators are estimated with the electron re-entry effect, and a measurement method for the pitch angle distribution by means of a local mirror field is proposed. It is found that the electron re-entry effect is significant for the design of electron guns for electron transport experiments in stellarators, because high pitch angle electrons that re-enter the gun launch again with a low pitch angle. A compensation method for an error field on the quasi-helically symmetric stellarator HSX is also proposed. It is found that the additional toroidal mirror modes [n,m] = [3,0], [4,0] can eliminate a dangerous error field mode [-1,-1] like the earth's field. Here, n and m are the toroidal and poloidal mode numbers. (author)

  2. Earthquake probabilities and magnitude distribution (M≥6.7) along the Haiyuan fault, northwestern China

    Institute of Scientific and Technical Information of China (English)

    冉洪流

    2004-01-01

    In recent years, researchers have studied paleoearthquakes along the Haiyuan fault and revealed many paleoearthquake events. All available information allows a more reliable analysis of earthquake recurrence intervals and earthquake rupture patterns along the Haiyuan fault. Based on this paleoseismological information, the recurrence probability and magnitude distribution for M≥6.7 earthquakes in the next 100 years along the Haiyuan fault can be obtained through weighted computation, using Poisson and Brownian passage time models and considering different rupture patterns. The result shows that the recurrence probability of MS≥6.7 earthquakes along the Haiyuan fault is about 0.035 in the next 100 years.
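A minimal sketch of the two models being weighted above: the conditional probability of an event within a fixed horizon under a time-independent Poisson model and under the Brownian passage time (BPT) model. All parameter values (mean recurrence interval, aperiodicity, elapsed time since the last event) are hypothetical, not taken from the Haiyuan fault study.

```python
import math

def poisson_prob(mean_interval, horizon):
    # Time-independent Poisson model: P = 1 - exp(-T/mu)
    return 1.0 - math.exp(-horizon / mean_interval)

def bpt_pdf(t, mu, alpha):
    # Brownian passage time (inverse Gaussian) density with mean mu
    # and aperiodicity alpha
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
           math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def bpt_prob(mu, alpha, elapsed, horizon, steps=20000):
    # Conditional probability of an event in (elapsed, elapsed + horizon]
    # given quiescence up to `elapsed`, by trapezoidal integration
    def cdf(upper):
        dt = upper / steps
        s = 0.0
        for i in range(1, steps + 1):
            t0, t1 = (i - 1) * dt, i * dt
            f0 = bpt_pdf(t0, mu, alpha) if t0 > 0 else 0.0
            s += 0.5 * (f0 + bpt_pdf(t1, mu, alpha)) * dt
        return s
    f_elapsed = cdf(elapsed)
    return (cdf(elapsed + horizon) - f_elapsed) / (1.0 - f_elapsed)

# Hypothetical numbers: mean recurrence 1000 yr, aperiodicity 0.5,
# 800 yr elapsed since the last event, 100 yr forecast horizon
print(round(poisson_prob(1000.0, 100.0), 3))
print(round(bpt_prob(1000.0, 0.5, 800.0, 100.0), 3))
```

Unlike the Poisson model, the BPT probability depends on the elapsed time, which is why paleoseismological dating of past events matters for the weighted forecast.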

  3. Improving quality of sample entropy estimation for continuous distribution probability functions

    Science.gov (United States)

    Miśkiewicz, Janusz

    2016-05-01

    Entropy is one of the key parameters characterizing the state of a system in statistical physics. Although entropy is defined for systems described by discrete and continuous probability distribution functions (PDFs), in numerous applications the sample entropy is estimated from a histogram, which in fact means that the continuous PDF is represented by a set of probabilities. Such a procedure may lead to ambiguities and even misinterpretation of the results. Within this paper, two possible general algorithms based on continuous PDF estimation are discussed in application to the Shannon and Tsallis entropies. It is shown that the proposed algorithms may improve entropy estimation, particularly in the case of small data sets.
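A small illustration of the histogram issue discussed above: a differential-entropy estimate for Gaussian data depends on the bin count, and can be compared against the exact value 0.5·ln(2πe) ≈ 1.419 for a standard normal. This is a generic sketch of histogram-based estimation, not the continuous-PDF algorithms proposed in the paper.

```python
import math, random

random.seed(2)
N = 50000
data = [random.gauss(0.0, 1.0) for _ in range(N)]

def hist_entropy(sample, bins):
    # Differential (Shannon) entropy estimated from a histogram:
    # H ~ -sum (n_i/N) * ln(n_i / (N * width))
    lo, hi = min(sample), max(sample)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in sample:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    n = len(sample)
    return -sum((c / n) * math.log(c / (n * width)) for c in counts if c)

true_H = 0.5 * math.log(2.0 * math.pi * math.e)  # exact for N(0,1)
for bins in (5, 50, 500):
    print(bins, round(hist_entropy(data, bins), 3))
```

With a reasonable bin count the estimate is close to the exact value; too-coarse or too-fine binning biases it, which is precisely the ambiguity the paper's continuous-PDF approach aims to avoid.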

  4. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure

  5. Probability distribution and the boundary value problem in noncommutative quantum mechanics

    International Nuclear Information System (INIS)

    Full text: Non-commutative quantum mechanics (NCQM) still has some important open questions, such as, for example, the correct definition of the probability density and the consistent formulation of the boundary value problem. The main difficulty lies in the fact that in a non-commutative space the classical notion of point has no operational meaning. Besides that, it is well known that in NCQM the ordinary definition of probability density does not satisfy the continuity equation, thus being physically inadequate in this context. As a consequence, the formulation of the boundary value problem in NCQM is ill-defined, since the confining conditions for a particle trapped in a closed region are often formulated in terms of the properties of the probability density at the boundaries of such a region. In this work we solve both problems in a unified way. We consider a two-dimensional configuration space generated by two non-commutative coordinates satisfying a canonical commutation relation. This non-commutative space is formally equal to the phase space of a quantum particle moving on a line, which suggests an approach based on the Wigner formulation of quantum mechanics. We introduce a quasi-probability distribution function, constructed by means of the Moyal product of functions. By making use of the operation of partial trace we construct a normalizable, positive-definite function. We demonstrate that this function satisfies the continuity equation, so that it can be interpreted as a probability density function, thus providing a physically consistent probabilistic interpretation for NCQM. Even though the probability density contains all the available information about the physical system, it is useful to formulate the boundary value problem in terms of wave functions fulfilling an appropriate differential equation.
By making use of harmonic analysis we introduce an auxiliary wave function, which is related to the physical probability density in the same way as

  6. Explicit Expressions for the Ruin Probabilities of Erlang Risk Processes with Pareto Individual Claim Distributions

    Institute of Scientific and Technical Information of China (English)

    Li Wei; Hai-liang Yang

    2004-01-01

    In this paper we first consider a risk process in which claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve of u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line, and the integrand is not of an oscillating kind. Then we show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.
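Since the analytic solution involves special functions, a Monte Carlo estimate is a useful sanity check on such a model. The sketch below simulates a surplus process with Erlang(2) inter-arrival times and classic Pareto claims; the parameter values and horizon are hypothetical, and a finite-horizon estimate only approximates the ultimate (infinite-horizon) ruin probability from below.

```python
import random

random.seed(3)

def erlang2(rate):
    # Erlang(2) inter-arrival time: sum of two independent exponentials
    return random.expovariate(rate) + random.expovariate(rate)

def pareto(alpha, xm):
    # Classic Pareto on [xm, inf): inverse-CDF sampling
    return xm * (1.0 - random.random()) ** (-1.0 / alpha)

def ruin_probability(u, c, rate, alpha, xm, horizon=100.0, trials=1000):
    # Finite-horizon Monte Carlo approximation of the ruin probability:
    # surplus(t) = u + c*t - sum of claims; ruin if surplus drops below 0
    ruined = 0
    for _ in range(trials):
        t, surplus = 0.0, u
        while t < horizon:
            w = erlang2(rate)
            t += w
            surplus += c * w - pareto(alpha, xm)
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / trials

# Hypothetical parameters: mean inter-arrival 2/rate = 1, mean claim
# alpha*xm/(alpha-1) = 1.5, premium rate 2 (positive safety loading)
p = ruin_probability(u=10.0, c=2.0, rate=2.0, alpha=3.0, xm=1.0)
print(round(p, 3))
```

The positive safety loading (premium income 2 per unit time against expected claims 1.5) keeps the ruin probability below one, and it decreases rapidly with the initial reserve u.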

  7. Comparison of Probability Distribution Function in Determining Minimum Annual and Monthly Streamflow

    Directory of Open Access Journals (Sweden)

    Nícolas Reinaldo Finkler

    2015-11-01

    Full Text Available This study aims to provide foundational studies of water availability in the Arroio Belo basin, in Caxias do Sul/RS. To this end, it analyzes the application of the Weibull, Normal, LogNormal, Gumbel (minimum), LogPearson and Pearson theoretical probability functions to the seven-day consecutive minimum streamflow data of the basin. The analysis had two approaches: application to annual data, and then to monthly data, considering seasonality. To verify the adherence of the estimated probabilities to the observed frequencies, three tests were applied: Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared. The results show that the Log-Pearson III distribution represents the annual data of the series with the greatest accuracy and achieves the best fit of the minimum streamflows. The monthly data analysis indicated the use of the Pearson III distribution, which showed higher suitability to the minimum streamflow data.
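As a sketch of the adherence testing described above, the following fits a LogNormal to a synthetic minimum-flow series and computes the Kolmogorov-Smirnov statistic against the usual ~5% critical value. The data are simulated, not the Arroio Belo series, and the critical value shown ignores the (Lilliefors-type) correction needed when parameters are estimated from the same sample.

```python
import math, random

random.seed(4)

# Synthetic series of annual 7-day minimum flows (m3/s), lognormally
# distributed for illustration (not the Arroio Belo data)
n = 40
flows = sorted(math.exp(random.gauss(1.0, 0.4)) for _ in range(n))

# Fit a LogNormal by moments of the log-flows
logs = [math.log(q) for q in flows]
mu = sum(logs) / n
sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (n - 1))

def lognorm_cdf(q):
    return 0.5 * (1.0 + math.erf((math.log(q) - mu) / (sigma * math.sqrt(2.0))))

# Kolmogorov-Smirnov statistic: max distance between empirical and fitted CDFs
D = max(max(abs((i + 1) / n - lognorm_cdf(q)), abs(i / n - lognorm_cdf(q)))
        for i, q in enumerate(flows))
D_crit = 1.36 / math.sqrt(n)   # ~5% critical value, large-sample approximation
print(round(D, 3), round(D_crit, 3))
```

The same loop, run over each candidate family (Weibull, Gumbel-minimum, Pearson III, ...), is what ranks the distributions by goodness of fit.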

  8. Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence

    Directory of Open Access Journals (Sweden)

    C. C. Wu

    2011-04-01

    Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  9. Learning algorithms and probability distributions in feed-forward and feed-back networks

    OpenAIRE

    Hopfield, J J

    1987-01-01

    Learning algorithms have been used both on feed-forward deterministic networks and on feed-back statistical networks to capture input-output relations and do pattern classification. These learning algorithms are examined for a class of problems characterized by noisy or statistical data, in which the networks learn the relation between input data and probability distributions of answers. In simple but nontrivial networks the two learning rules are closely related. Under some circumstances the...

  10. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
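The Longuet-Higgins (1963) distribution referred to above is, to leading order, a Gram-Charlier correction of the Gaussian by the surface-elevation skewness. A minimal numerical check of that truncated form (unit normalization and prescribed third moment), with an assumed skewness of 0.2; the value is illustrative, not from the experiments.

```python
import math

def gram_charlier_pdf(x, skew):
    # Gaussian with a third-order Gram-Charlier correction: the leading
    # non-Gaussian term of the Longuet-Higgins surface-elevation pdf
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    he3 = x**3 - 3.0 * x            # Hermite polynomial He3
    return phi * (1.0 + (skew / 6.0) * he3)

# Numerical checks: unit normalization and the prescribed third moment
dx, lo, hi = 0.001, -8.0, 8.0
xs = [lo + i * dx for i in range(int((hi - lo) / dx))]
p = [gram_charlier_pdf(x, 0.2) for x in xs]
norm = sum(pi * dx for pi in p)
third = sum(x**3 * pi * dx for x, pi in zip(xs, p))
print(round(norm, 3), round(third, 3))
```

The correction leaves the mean and variance of the Gaussian untouched while injecting exactly the requested skewness, which is why the truncated series fits weakly non-Gaussian wave records well.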

  11. Sampling the Probability Distribution of Type Ia Supernova Lightcurve Parameters in Cosmological Analysis

    OpenAIRE

    Dai, Mi; Wang, Yun

    2015-01-01

    In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdf) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the Joint Lightcurve Analysis (JLA) data set of SNe Ia, we find that sampl...
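The core of such an MCMC likelihood analysis is a sampler that draws parameters from their posterior pdf. Below is a toy Metropolis sketch for a single parameter with a hypothetical Gaussian posterior (center 1.2, width 0.3, both made up); it illustrates generic MCMC sampling, not the authors' pipeline or the actual JLA lightcurve likelihood.

```python
import math, random

random.seed(5)

def log_post(theta):
    # Toy log-posterior standing in for a lightcurve-parameter likelihood:
    # Gaussian centered at 1.2 with width 0.3 (hypothetical numbers)
    return -0.5 * ((theta - 1.2) / 0.3) ** 2

def metropolis(n_steps, step=0.5):
    chain, theta, lp = [], 0.0, log_post(0.0)
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)   # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain

chain = metropolis(20000)[2000:]          # drop burn-in
mean = sum(chain) / len(chain)
sd = math.sqrt(sum((t - mean) ** 2 for t in chain) / len(chain))
print(round(mean, 2), round(sd, 2))
```

The retained chain is itself a sample from the posterior, so downstream cosmological fits can marginalize over the lightcurve parameters simply by reusing the chain, which is the essence of the sampling method described in the abstract.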

  12. Wigner Function and Phase Probability Distribution of q-Analogueof Squeezed One-Photon State

    Institute of Scientific and Technical Information of China (English)

    FANG Jian-Shu; MENG Xiang-Guo; ZHANG Xiang-Ping; WANG Ji-Suo; LIANG Bao-Long

    2008-01-01

    In this paper, in terms of the technique of integration within an ordered product (IWOP) of operators and the properties of the inverses of q-deformed annihilation and creation operators, a normalizable q-analogue of the squeezed one-photon state, which is quite different from the one introduced by Song and Fan [Int. J. Theor. Phys. 41 (2002) 695], is constructed. Moreover, the Wigner function and phase probability distribution of the q-analogue of the squeezed one-photon state are examined.

  13. Optimal design of unit hydrographs using probability distribution and genetic algorithms

    Indian Academy of Sciences (India)

    Rajib Kumar Bhattacharjya

    2004-10-01

    A nonlinear optimization model is developed to transmute a unit hydrograph into a probability distribution function (PDF). The objective function is to minimize the sum of the square of the deviation between predicted and actual direct runoff hydrograph of a watershed. The predicted runoff hydrograph is estimated by using a PDF. In a unit hydrograph, the depth of rainfall excess must be unity and the ordinates must be positive. Incorporation of a PDF ensures that the depth of rainfall excess for the unit hydrograph is unity, and the ordinates are also positive. Unit hydrograph ordinates are in terms of intensity of rainfall excess on a discharge per unit catchment area basis, the unit area thus representing the unit rainfall excess. The proposed method does not have any constraint. The nonlinear optimization formulation is solved using binary-coded genetic algorithms. The number of variables to be estimated by optimization is the same as the number of probability distribution parameters; gamma and log-normal probability distributions are used. The existing nonlinear programming model for obtaining optimal unit hydrograph has also been solved using genetic algorithms, where the constrained nonlinear optimization problem is converted to an unconstrained problem using penalty parameter approach. The results obtained are compared with those obtained by the earlier LP model and are fairly similar.
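The key point above, that a PDF-shaped unit hydrograph enforces unit rainfall-excess depth and positive ordinates by construction, can be illustrated with the gamma distribution. The sketch below recovers the two gamma parameters from a synthetic hydrograph by a crude grid search standing in for the genetic algorithm; the objective (sum of squared deviations) matches the paper's, but the data and search are assumptions for illustration.

```python
import math

def gamma_pdf(t, k, theta):
    # Gamma PDF used as the unit hydrograph: ordinates are positive and
    # integrate to one, so unit rainfall-excess depth is automatic
    if t <= 0:
        return 0.0
    return t ** (k - 1) * math.exp(-t / theta) / (math.gamma(k) * theta ** k)

# Hypothetical "observed" unit hydrograph ordinates at 1 h spacing,
# generated from known parameters so the fit can be checked
target = [gamma_pdf(t, 3.0, 2.0) for t in range(1, 25)]

# Crude grid search over (k, theta); the paper uses a genetic algorithm,
# but the least-squares objective is the same
best = None
for k10 in range(10, 61):
    for th10 in range(5, 41):
        k, th = k10 / 10.0, th10 / 10.0
        sse = sum((gamma_pdf(t, k, th) - o) ** 2
                  for t, o in zip(range(1, 25), target))
        if best is None or sse < best[0]:
            best = (sse, k, th)

print(best[1], best[2])  # recovers k=3.0, theta=2.0 on this synthetic data
```

Because only the two distribution parameters are searched, no positivity or unit-depth constraints are needed, which is exactly the advantage claimed for the PDF formulation.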

  14. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    International Nuclear Information System (INIS)

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.
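A minimal Monte Carlo sketch of the approach: simulate maximum pit depths under an assumed power-law growth model with a random soil-dependent coefficient, then fit a Gumbel (maximum) distribution by the method of moments. The growth exponent, coefficient distribution, pit count and pipeline age are illustrative assumptions, not the paper's calibrated model.

```python
import math, random

random.seed(6)

def simulate_max_pit_depth(age_years, n_pits=50):
    # Power-law pit growth d = k * t**0.5 with lognormal coefficient k
    # (an illustrative stand-in for the soil-dependent model variables);
    # the pipeline's damage is governed by the deepest of n_pits defects
    return max(math.exp(random.gauss(-1.0, 0.5)) * age_years ** 0.5
               for _ in range(n_pits))

depths = [simulate_max_pit_depth(30.0) for _ in range(5000)]

# Fit a Gumbel (maximum) distribution by the method of moments
m = sum(depths) / len(depths)
s = math.sqrt(sum((d - m) ** 2 for d in depths) / len(depths))
beta = s * math.sqrt(6.0) / math.pi          # scale parameter
mu = m - 0.5772 * beta                       # location (Euler-Mascheroni)
print(round(mu, 2), round(beta, 2))
```

Repeating the fit at different pipeline ages, and comparing Gumbel against Frechet and Weibull fits, mirrors the paper's finding that the best-fitting extreme value family changes with exposure time.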

  15. Probability distribution of pitting corrosion depth and rate in underground pipelines: A Monte Carlo study

    Energy Technology Data Exchange (ETDEWEB)

    Caleyo, F. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)], E-mail: fcaleyo@gmail.com; Velazquez, J.C. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico); Valor, A. [Facultad de Fisica, Universidad de La Habana, San Lazaro y L, Vedado, 10400, La Habana (Cuba); Hallen, J.M. [Departamento de Ingenieria Metalurgica, ESIQIE, IPN, UPALM Edif. 7, Zacatenco, 07738 Mexico, D.F. (Mexico)

    2009-09-15

    The probability distributions of external-corrosion pit depth and pit growth rate were investigated in underground pipelines using Monte Carlo simulations. The study combines a predictive pit growth model developed by the authors with the observed distributions of the model variables in a range of soils. Depending on the pipeline age, any of the three maximal extreme value distributions, i.e. Weibull, Frechet or Gumbel, can arise as the best fit to the pitting depth and rate data. The Frechet distribution best fits the corrosion data for long exposure periods. This can be explained by considering the long-term stabilization of the diffusion-controlled pit growth. The findings of the study provide reliability analysts with accurate information regarding the stochastic characteristics of the pitting damage in underground pipelines.

  16. Probability distributions in a two-parameter scaling theory of localization

    Science.gov (United States)

    Heinrichs, J.

    1988-06-01

    Probability distributions for the resistance of two- and three-dimensional disordered conductors are studied using a Migdal-Kadanoff-type scaling transformation together with the author's previously derived distributions in one dimension. The present treatment differs from earlier work in two respects: On one hand, it includes the effect of an average potential barrier V experienced by an electron originating from the perfect leads which connect the conductor to a constant-voltage source; on the other hand, the input distribution for one-dimensional systems is based on an exact solution for the effect of the random potential on the complex reflection amplitude of an electron at a certain energy. The scaling equation for probability distributions and for their successive moments are parametrized in terms of the mean resistance, ρ¯, and of a fixed parameter γ related to V. Hence they correspond to a special form of two-parameter scaling. A mobility edge, ρ¯≡ρc, exists only for d>2 and, for d=3, detailed results for ρc, for the conductivity exponent ν, and for the fixed resistance distribution at ρc as a function of γ are presented. The asymptotic distribution of resistance away from the mobility edge for d=3, and in both small- and large-resistance regimes for d=2 are also studied. In the metallic regime for d>2 our treatment yields two distinct distributions, one of which is characterized by Ohm's law for the mean resistance and the other one by Ohm's law for the mean conductance. In the latter case the fluctuations of conductivity are independent of sample size for large samples. The calculated distributions are generally broad and in the localized regime, for d=3 and d=2, the rms values of resistance dominate the mean values in the infinite-sample limit.

  17. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  18. Photometric Redshift Probability Distributions for Galaxies in the SDSS DR8

    CERN Document Server

    Sheldon, Erin S; Mandelbaum, Rachel; Brinkmann, J; Weaver, Benjamin A

    2011-01-01

    We present redshift probability distributions for galaxies in the SDSS DR8 imaging data. We used the nearest-neighbor weighting algorithm presented in Lima et al. 2008 and Cunha et al. 2009 to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We then estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training set redshifts. We derived P(z)s for individual objects using the same technique, but limiting to training set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy, rather than an ensemble N(z), can reduce the statistical error in measurements t...

  19. Criticality of the net-baryon number probability distribution at finite density

    CERN Document Server

    Morita, Kenji; Redlich, Krzysztof

    2014-01-01

    We compute the probability distribution $P(N)$ of the net-baryon number at finite temperature and quark-chemical potential, $\mu$, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For $\mu/T<1$, the model exhibits the chiral crossover transition which belongs to the universality class of the $O(4)$ spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, $P(N)$. By considering ratios of $P(N)$ to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to $O(4)$ criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine $O(4)$ criticality in the context of binomial and negative-binomial distributions for the net proton number.
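The Skellam baseline used above is the distribution of the difference of two independent Poisson counts; computing it directly by discrete convolution shows the mean λ1 − λ2 and variance λ1 + λ2 against which the O(4)-critical deviations of P(N) are measured. The λ values below are arbitrary illustrations, not fitted to any lattice or STAR data.

```python
import math

def poisson_pmf(n, lam):
    return math.exp(-lam) * lam ** n / math.factorial(n)

def skellam_pmf(k, lam1, lam2, jmax=60):
    # Net number N = N_plus - N_minus with independent Poisson counts:
    # P(N = k) = sum_j Pois(j + k; lam1) * Pois(j; lam2)
    return sum(poisson_pmf(j + k, lam1) * poisson_pmf(j, lam2)
               for j in range(max(0, -k), jmax))

# Skellam has mean lam1 - lam2 and variance lam1 + lam2: the baseline
# against which criticality shows up as structure in P(N)/Skellam
lam1, lam2 = 12.0, 10.0
probs = {k: skellam_pmf(k, lam1, lam2) for k in range(-30, 41)}
mean = sum(k * p for k, p in probs.items())
var = sum((k - mean) ** 2 * p for k, p in probs.items())
print(round(mean, 2), round(var, 2))  # mean ~2.0, variance ~22.0
```

Matching the first two moments is what makes the ratio P(N)/Skellam sensitive purely to higher-order (non-Poissonian) structure of the measured distribution.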

  20. Research on Energy-Saving Design of Overhead Travelling Crane Camber Based on Probability Load Distribution

    Directory of Open Access Journals (Sweden)

    Tong Yifei

    2014-01-01

    Full Text Available A crane is a mechanical device used widely to move materials in modern production. It is reported that the energy consumption of China is at least 5–8 times that of other developing countries, so energy consumption has become an unavoidable topic. Several factors influence the energy loss, and the camber of the girder is one not to be neglected. In this paper, the problem of the deflections induced by the moving payload in the girder of an overhead travelling crane is examined. The evaluation of a camber giving a counterdeflection of the girder is proposed in order to obtain minimum energy consumption for the trolley moving along a nonstraight support. To this aim, probabilistic payload distributions are considered instead of the fixed or rated loads involved in other research. Taking a 50/10 t bridge crane as the research object, the probability loads are determined by analysis of load distribution density functions. According to the load distribution, camber design under different probability loads is discussed in detail, as well as the energy consumption distribution. The research results provide a design reference for reasonable camber to obtain the least energy consumption for climbing corresponding to different P0; thus an energy-saving design can be achieved.

  1. Criticality of the net-baryon number probability distribution at finite density

    Directory of Open Access Journals (Sweden)

    Kenji Morita

    2015-02-01

    Full Text Available We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T<1, the model exhibits the chiral crossover transition which belongs to the universality class of the O(4) spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net baryon number probability distribution, P(N). By considering ratios of P(N) to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to O(4) criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine O(4) criticality in the context of binomial and negative-binomial distributions for the net proton number.

  2. Analytical models of probability distribution and excess noise factor of solid state photomultiplier signals with crosstalk

    International Nuclear Information System (INIS)

    Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown that is limited by a strong negative feedback. An SSPM can detect and resolve single photons due to the high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their significance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in good agreement with the experimental probability distributions for dark counts and few-photon spectra over a wide range of fired pixel numbers, as well as with the observed super-linear behavior of the crosstalk ENF.
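The Borel distribution mentioned above has pmf P(n) = e^(−μn)(μn)^(n−1)/n! for n ≥ 1, with mean 1/(1−μ), where μ is the mean crosstalk multiplicity. A short numerical check of the mean and of the resulting excess noise factor, using the common definition ENF = ⟨n²⟩/⟨n⟩²; the value μ = 0.2 is an arbitrary example, not a fitted device parameter.

```python
import math

def borel_pmf(n, mu):
    # Borel distribution: total avalanche size of a branching Poisson
    # process with mean crosstalk multiplicity mu (0 <= mu < 1)
    return math.exp(-mu * n) * (mu * n) ** (n - 1) / math.factorial(n)

mu = 0.2
ns = range(1, 151)                       # tail beyond 150 is negligible
probs = [borel_pmf(n, mu) for n in ns]
m1 = sum(n * p for n, p in zip(ns, probs))
m2 = sum(n * n * p for n, p in zip(ns, probs))

enf = m2 / m1 ** 2   # excess noise factor of the crosstalk-multiplied signal
print(round(m1, 3), round(enf, 3))
# mean should match the analytic 1/(1 - mu) = 1.25
```

The numerics reproduce the analytic mean 1/(1−μ), and show how even a modest crosstalk probability inflates the ENF above the noiseless value of 1.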

  3. Measurement of angle distribution in multiple scattering by track digitization method

    International Nuclear Information System (INIS)

    The multiple scattering of β-rays by neon gas atoms is studied by use of a projection spark chamber. The tracks are digitized and analysed on-line to give the projected angle distribution. The present data are compared with the Moliere theory. (author)

  4. Numerical simulation of alpha hit probability distributions in sensitive bronchial epithelial cells by inhaling radon progenies

    International Nuclear Information System (INIS)

    The general objective of our research is the modelling of physical and biological processes related to the development of adverse health effects following the inhalation of radioaerosols, especially the initiation of lung cancer in the central human airways by the inspiration of radon progenies. There is experimental evidence that bronchogenic carcinomas originate mainly in the vicinity of the carinal ridge of the large bronchial airways, where primary hot spots of deposition have been found. In the case of uranium miners, more than ninety percent of the registered lung cancer formations have occurred in this region of the lung. However, current lung deposition models do not take into consideration the inhomogeneity of deposition within the airways. In the present study, cellular deposition patterns, alpha-track and DNA hit probability distributions of inhaled radon progenies in the upper and central human airway epithelial cells are computed with a computational fluid particle dynamics model. Our computer programme generates the three-dimensional, morphologically realistic geometry of the upper and central airways. The flow fields within these airways are simulated by the FLUENT CFD (computational fluid dynamics) code over a wide range of flow rates. A large number of attached and unattached radon progeny trajectories are simulated by our particle trajectory code to determine the deposition and activity patterns and alpha-track distributions on the surface of the airways. Three-dimensional distributions of secretory and basal cells are constructed. Finally, the number of DNA hits and the hit probability distributions are quantified. The computed deposition, activity and hit probability patterns are strongly inhomogeneous for all realistic parameter selections and are sensitive to the shape of the geometry.
Hot spots of alpha hits are found at the carinal region and at the inner sides of the daughter airways during inhalation and, with lower intensity, at the top and bottom sides of the

  5. Analytical models of probability distribution and excess noise factor of Solid State Photomultiplier signals with crosstalk

    CERN Document Server

    Vinogradov, S

    2011-01-01

    Silicon Photomultipliers (SiPM), also called Solid State Photomultipliers (SSPM), are based on Geiger-mode avalanche breakdown limited by strong negative feedback. An SSPM can detect and resolve single photons due to high gain and ultra-low excess noise of avalanche multiplication in this mode. Crosstalk and afterpulsing processes associated with the high gain introduce specific excess noise and deteriorate the photon number resolution of the SSPM. The probabilistic features of these processes are widely studied because of their high importance for SSPM design, characterization, optimization and application, but the process modeling is mostly based on Monte Carlo simulations and numerical methods. In this study, crosstalk is considered to be a branching Poisson process, and analytical models of the probability distribution and excess noise factor (ENF) of SSPM signals based on the Borel distribution, as an advance on the geometric distribution models, are presented and discussed. The models are found to be in a good agre...

  6. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor

    2015-01-01

    SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of photon detection probability (PDP) of SPADnet-I was not known as a function of angle of incidence, wavelength and polarization of the incident light. Our targeted application area of this sensor is next generation PET detector modules, where they will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP in a wide range of angle of incidence (0° to 80°), concentrating onto a 60 nm broad wavelength interval around the characteristic emission peak (λ=420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, where it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less expressed, it begins only from 50°.

  7. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    International Nuclear Information System (INIS)

    SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I was not known as a function of angle of incidence, wavelength and polarization of the incident light. Our targeted application area of this sensor is next-generation PET detector modules, where they will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm broad wavelength interval around the characteristic emission peak (λ = 420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it starts to drop rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced and begins only above 50°.

  8. The probability distribution functions of emission line flux measurements and their ratios

    CERN Document Server

    Wesson, R; Scicluna, P

    2016-01-01

    Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking thi...
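The skewness of a ratio of normally distributed quantities is easy to reproduce by simulation. A minimal sketch, with hypothetical flux values rather than the SDSS measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Hypothetical flux measurements with a true ratio A/B = 3: the numerator is
# measured precisely, the denominator with a large Gaussian uncertainty.
A = rng.normal(3.0, 0.05, n)
B = rng.normal(1.0, 0.2, n)
keep = B > 0.2                    # crude positivity cut; fluxes are physical
r = A[keep] / B[keep]
# With a noisy denominator the ratio distribution is right-skewed: its mean
# exceeds both its median and the ratio of the means (Jensen's inequality).
assert r.mean() > np.median(r)
assert r.mean() > A[keep].mean() / B[keep].mean()
```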

  9. Impact of spike train autostructure on probability distribution of joint spike events.

    Science.gov (United States)

    Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl

    2013-05-01

    The discussion of whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability for false positives if a different process type is assumed than was actually present. We also determine how manipulations of such spike trains, here dithering, as used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the structure of the spike trains as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing. PMID:23470124
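A gamma renewal process of the kind used here as a non-Poisson spike-train model is easy to simulate; the sketch below checks that its interspike-interval CV is 1/sqrt(shape) and that the count Fano factor over long windows falls well below the Poisson value of 1 (shape, window length and trial count are illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(4)
# Gamma renewal process: ISIs ~ Gamma(shape k, scale 1/k), i.e. unit mean rate.
# CV = 1/sqrt(k); for long windows the count Fano factor tends toward CV**2 < 1,
# so the process is more regular than a Poisson process.
k, T, trials = 4.0, 50.0, 2000
counts = np.empty(trials)
for i in range(trials):
    isis = rng.gamma(k, 1.0 / k, size=int(3 * T))   # enough unit-mean ISIs to cover T
    counts[i] = np.searchsorted(np.cumsum(isis), T)  # spikes before time T
ff = counts.var() / counts.mean()
cv = 1.0 / np.sqrt(k)
assert abs(cv - 0.5) < 1e-12
assert 0.1 < ff < 0.6   # far below the Poisson value of 1
```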

  10. Spatially-constrained probability distribution model of incoherent motion (SPIM) for abdominal diffusion-weighted MRI.

    Science.gov (United States)

    Kurugol, Sila; Freiman, Moti; Afacan, Onur; Perez-Rossello, Jeannette M; Callahan, Michael J; Warfield, Simon K

    2016-08-01

    Quantitative diffusion-weighted MR imaging (DW-MRI) of the body enables characterization of the tissue microenvironment by measuring variations in the mobility of water molecules. The diffusion signal decay model parameters are increasingly used to evaluate various diseases of abdominal organs such as the liver and spleen. However, previous signal decay models (i.e., mono-exponential, bi-exponential intra-voxel incoherent motion (IVIM) and stretched exponential models) only provide insight into the average of the distribution of the signal decay rather than explicitly describing the entire range of diffusion scales. In this work, we propose a probability distribution model of incoherent motion that uses a mixture of Gamma distributions to fully characterize the multi-scale nature of diffusion within a voxel. Further, we improve the robustness of the distribution parameter estimates by integrating a spatial homogeneity prior into the probability distribution model of incoherent motion (SPIM) and by using the fusion bootstrap solver (FBM) to estimate the model parameters. We evaluated the improvement in quantitative DW-MRI analysis achieved with the SPIM model in terms of accuracy, precision and reproducibility of parameter estimation in both simulated data and in 68 abdominal in-vivo DW-MRIs. Our results show that the SPIM model not only substantially reduced parameter estimation errors by up to 26%; it also significantly improved the robustness of the parameter estimates (paired Student's t-test, p < 0.0001) by reducing the coefficient of variation (CV) of estimated parameters compared to those produced by previous models. In addition, the SPIM model improves the reproducibility of the parameter estimates for both intra-session (up to 47%) and inter-session (up to 30%) estimates compared to those generated by previous models. Thus, the SPIM model has the potential to improve accuracy, precision and robustness of quantitative abdominal DW-MRI analysis for clinical applications.
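The voxel-level idea can be sketched with a single Gamma component: if diffusivities within a voxel follow a Gamma distribution, the diffusion-weighted signal is its Laplace transform and has a simple closed form. The parameter values below are illustrative, and this is a one-component caricature, not the SPIM mixture model itself:

```python
import numpy as np

rng = np.random.default_rng(6)
# If intra-voxel diffusivities D ~ Gamma(k, theta), then the DW signal is the
# Laplace transform of that distribution: S(b)/S0 = (1 + b*theta)**(-k).
k, theta = 2.0, 1.0e-3             # shape, scale (mm^2/s); illustrative values
b = 800.0                          # diffusion weighting (s/mm^2)
D = rng.gamma(k, theta, 1_000_000)
mc = np.exp(-b * D).mean()         # Monte Carlo average over the voxel
closed = (1.0 + b * theta) ** (-k)
assert abs(mc - closed) < 1e-3
```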

  11. Generalized Delta Functions and Their Use in Quasi-Probability Distributions

    OpenAIRE

    Brewster, R. A.; Franson, J. D.

    2016-01-01

    Quasi-probability distributions are an essential tool in analyzing the properties of quantum systems, especially in quantum optics. The Glauber-Sudarshan P-function P(α) is especially useful for calculating the density matrix of a system, but it is often assumed that P(α) may not exist for highly quantum-mechanical systems due to its singular nature. Here we define a generalized delta function with a complex argument and derive its properties, which are very different from those...

  12. Velocity-gradient probability distribution functions in a Lagrangian model of turbulence

    International Nuclear Information System (INIS)

    The Recent Fluid Deformation Closure (RFDC) model of Lagrangian turbulence is recast in path-integral language within the framework of the Martin–Siggia–Rose functional formalism. In order to derive analytical expressions for the velocity-gradient probability distribution functions (vgPDFs), we carry out noise renormalization in the low-frequency regime and find approximate extrema for the Martin–Siggia–Rose effective action. We verify, with the help of Monte Carlo simulations, that the vgPDFs so obtained yield a close description of the single-point statistical features implied by the original RFDC stochastic differential equations. (paper)

  13. Computationally Efficient Modulation Level Classification Based on Probability Distribution Distance Functions

    CERN Document Server

    Urriza, Paulo; Pawełczak, Przemysław; Čabrić, Danijela

    2010-01-01

    We present a novel modulation level classification (MLC) method based on probability distribution distance functions. The proposed method uses modified Kuiper and Kolmogorov-Smirnov (KS) distances to achieve low computational complexity and outperforms state-of-the-art methods based on cumulants and goodness-of-fit (GoF) tests. We derive the theoretical performance of the proposed MLC method and verify it via simulations. The best classification accuracy under AWGN with SNR mismatch and phase jitter is achieved with the proposed MLC method using Kuiper distances.
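The two empirical distances underlying such classifiers can be computed in a few lines. A sketch that distinguishes hypothetical 2-level and 4-level amplitude constellations (not the signal model or modified distances of the paper):

```python
import numpy as np

def ecdf_distances(x, y):
    """KS distance sup|Fx - Fy| and Kuiper distance D+ + D- between two samples."""
    grid = np.sort(np.concatenate([x, y]))
    Fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    Fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    d = Fx - Fy
    return np.max(np.abs(d)), np.max(d) + np.max(-d)

rng = np.random.default_rng(1)
# Hypothetical received amplitudes: a noisy 4-level signal tested against
# 2-level and 4-level reference constellations.
levels4 = rng.choice([-3, -1, 1, 3], 5000) + rng.normal(0, 0.3, 5000)
ref2 = rng.choice([-1, 1], 5000) + rng.normal(0, 0.3, 5000)
ref4 = rng.choice([-3, -1, 1, 3], 5000) + rng.normal(0, 0.3, 5000)
ks2, ku2 = ecdf_distances(levels4, ref2)
ks4, ku4 = ecdf_distances(levels4, ref4)
assert ks4 < ks2 and ku4 < ku2  # the matching modulation level minimizes both distances
```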

  14. Comparison of Lauritzen-Spiegelhalter and successive restrictions algorithms for computing probability distributions in Bayesian networks

    Science.gov (United States)

    Smail, Linda

    2016-06-01

    The basic task of any probabilistic inference system in Bayesian networks is computing the posterior probability distribution for a subset or subsets of random variables, given values or evidence for some other variables from the same Bayesian network. Many methods and algorithms have been developed for exact and approximate inference in Bayesian networks. This work compares two exact inference methods in Bayesian networks, Lauritzen-Spiegelhalter and the successive restrictions algorithm, from the perspective of computational efficiency. The two methods were applied for comparison to a Chest Clinic Bayesian Network. Results indicate that the successive restrictions algorithm shows more computational efficiency than the Lauritzen-Spiegelhalter algorithm.
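The chain-rule factorization that both algorithms exploit can be illustrated by brute-force enumeration on a toy chain A → B → C. The CPT values are made up, and this shows neither the Chest Clinic network nor either of the two algorithms, only the posterior computation they accelerate:

```python
from itertools import product

# Toy chain A -> B -> C with illustrative conditional probability tables.
pA = {1: 0.4, 0: 0.6}
pB_given_A = {(1, 1): 0.7, (0, 1): 0.3, (1, 0): 0.1, (0, 0): 0.9}  # (b, a): P(B=b|A=a)
pC_given_B = {(1, 1): 0.9, (0, 1): 0.1, (1, 0): 0.2, (0, 0): 0.8}  # (c, b): P(C=c|B=b)

def joint(a, b, c):
    # Chain rule: P(a, b, c) = P(a) P(b|a) P(c|b)
    return pA[a] * pB_given_A[(b, a)] * pC_given_B[(c, b)]

def posterior_C_given(a):
    # P(C=1 | A=a) by summing the joint over the hidden variable B
    num = sum(joint(a, b, 1) for b in (0, 1))
    den = sum(joint(a, b, c) for b, c in product((0, 1), repeat=2))
    return num / den

assert abs(posterior_C_given(1) - 0.69) < 1e-12   # 0.7*0.9 + 0.3*0.2
```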

  15. Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.

    2013-04-01

    Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions are then used to fit the long-term statistical distribution at different locations along the raceways. The long-term distribution of the bearing raceway loads is estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.

  16. Density probability distribution functions of diffuse gas in the Milky Way

    CERN Document Server

    Berkhuijsen, E M

    2008-01-01

    In a search for the signature of turbulence in the diffuse interstellar medium in gas density distributions, we determined the probability distribution functions (PDFs) of the average volume densities of the diffuse gas. The densities were derived from dispersion measures and HI column densities towards pulsars and stars at known distances. The PDFs of the average densities of the diffuse ionized gas (DIG) and the diffuse atomic gas are close to lognormal, especially when lines of sight at |b| < 5° and |b| > 5° are considered separately. The PDF of the average density at high |b| is twice as wide as that at low |b|. The width of the PDF of the DIG is about 30 per cent smaller than that of the warm HI at the same latitudes. The results reported here provide strong support for the existence of a lognormal density PDF in the diffuse ISM, consistent with a turbulent origin of density structure in the diffuse gas.

  17. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    WANG Qingyuan (王清远); N. KAWAGOISHI; Q. CHEN; R. M. PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e., the pit nucleation and growth, pit-crack transition, and short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.

  18. PHOTOMETRIC REDSHIFT PROBABILITY DISTRIBUTIONS FOR GALAXIES IN THE SDSS DR8

    International Nuclear Information System (INIS)

    We present redshift probability distributions for galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 8 imaging data. We used the nearest-neighbor weighting algorithm to derive the ensemble redshift distribution N(z), and individual redshift probability distributions P(z) for galaxies with r < 21.8 and u < 29.0. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training-set redshifts. We derived P(z)'s for individual objects by using training-set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy can reduce the statistical error in measurements that depend on the redshifts of individual galaxies. The spectroscopic training sample is substantially larger than that used for the DR7 release. The newly added PRIMUS catalog is now the most important training set used in this analysis by a wide margin. We expect the primary sources of error in the N(z) reconstruction to be sample variance and spectroscopic failures: The training sets are drawn from relatively small volumes of space, and some samples have large incompleteness. Using simulations we estimated the uncertainty in N(z) due to sample variance at a given redshift to be ∼10%-15%. The uncertainty on calculations incorporating N(z) or P(z) depends on how they are used; we discuss the case of weak lensing measurements. The P(z) catalog is publicly available from the SDSS Web site.

  19. SEP distribution function and probability of the maximum magnitudes of events

    Science.gov (United States)

    Nymmik, Rikho

    Based on current knowledge, the magnitude of a specific anticipated SEP event is a random variable taken from a large array of expected values. This set of expected values can be determined in terms of the distribution function. The form of the distribution function of SEP events is usually determined from the data of continuous satellite measurements. Sometimes, though without much effect, indirect evidence such as isotopes in samples of lunar rocks and data on the density of radioactive isotopes in the annual rings of ancient trees is used to determine the SEP event distribution function. The most successful was the attempt to describe the distribution function for solar cycles 21-23 by a power-law function with an exponential cutoff in the region of large events. A significant addition to the available information is the relatively new data (McCracken et al., JGR 106(A10), 21585-21498, 2001) on radioactive isotopes in the Greenland ice, which give additional information about extreme SEP events since 1561. However, the lack of information about the full set of events (mainly small events) does not allow these data to be used directly to determine the distribution function. However, using the correlation between the number of sunspots and the corresponding mean number of SEP events, one can determine the distribution function since 1561 based on the Greenland data. Surprisingly, the parameter values of this function coincide with those calculated from satellite data. Analysis of the obtained parameters of the distribution function shows that the maximum fluence of protons with energies above 30 MeV does not exceed 10^11 cm^-2, with a probability of about 10^-11.

  20. The probability distribution of returns in the exponential Ornstein–Uhlenbeck model

    International Nuclear Information System (INIS)

    We analyze the problem of the analytical characterization of the probability distribution of financial returns in the exponential Ornstein–Uhlenbeck model with stochastic volatility. In this model the prices are driven by a geometric Brownian motion, whose diffusion coefficient is expressed through an exponential function of a hidden variable Y governed by a mean-reverting process. We derive closed-form expressions for the probability distribution and its characteristic function in two limit cases. In the first one the fluctuations of Y are larger than the normal level of volatility, while the second one corresponds to the assumption of a small stationary value for the variance of Y. Theoretical results are tested numerically by intensive use of Monte Carlo simulations. The effectiveness of the analytical predictions is checked via a careful analysis of the parameters involved in the numerical implementation of the Euler–Maruyama scheme and is tested on a data set of financial indexes. In particular, we discuss results for the German DAX30 and Dow Jones Euro Stoxx 50, finding a good agreement between the empirical data and the theoretical description.
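An Euler–Maruyama discretization of this kind of model can be sketched in a few lines; the parameter values below are illustrative, not fitted to the DAX30 or Euro Stoxx 50, and the check only confirms the qualitative heavy-tail effect of stochastic volatility:

```python
import numpy as np

rng = np.random.default_rng(2)
# Exponential Ornstein-Uhlenbeck volatility: dY = -alpha*Y dt + k dW1,
# returns dX = m*exp(Y) dW2 (independent Wiener processes).
alpha, k, m = 0.05, 0.3, 0.01     # mean reversion, vol of vol, normal volatility level
dt, n = 0.1, 400_000
dW1 = rng.normal(0.0, np.sqrt(dt), n)
dW2 = rng.normal(0.0, np.sqrt(dt), n)
returns = np.empty(n)
Y = 0.0
for i in range(n):
    returns[i] = m * np.exp(Y) * dW2[i]   # volatility driven by the hidden variable Y
    Y += -alpha * Y * dt + k * dW1[i]     # Euler-Maruyama step for Y
z = returns / returns.std()
kurtosis = np.mean(z ** 4)                # equals 3 for a Gaussian
assert kurtosis > 4.0                     # stochastic volatility fattens the tails
```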

  1. A Voting Based Approach to Detect Recursive Order Number of Photocopy Documents Using Probability Distributions

    Directory of Open Access Journals (Sweden)

    Rani K

    2014-08-01

    Full Text Available Photocopy documents are very common in our normal life. People are permitted to carry and present photocopied documents to avoid damage to the original documents. But this provision is misused for temporary benefits by fabricating fake photocopied documents. Fabrication of a fake photocopied document is possible only at the 2nd and higher recursive orders of photocopy. Whenever a photocopied document is submitted, it may be required to check its originality. When the document is a 1st-order photocopy, chances of fabrication may be ignored. On the other hand, when the photocopy order is 2nd or higher, the probability of fabrication may be suspected. Hence, when a photocopy document is presented, the recursive order number of the photocopy is to be estimated to ascertain its originality. This requirement demands methods to estimate the order number of a photocopy. In this work, a voting based approach to detect the recursive order number of a photocopy document using the exponential, extreme value, and lognormal probability distributions is proposed. A detailed experimentation is performed on a generated data set, and the method exhibits efficiency close to 89%.

  2. Understanding the distinctively skewed and heavy tailed character of atmospheric and oceanic probability distributions

    International Nuclear Information System (INIS)

    The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework

  3. A one-parameter family of transforms, linearizing convolution laws for probability distributions

    Science.gov (United States)

    Nica, Alexandru

    1995-03-01

    We study a family of transforms, depending on a parameter q ∈ [0,1], which interpolate (in an algebraic framework) between a relative (namely −iz(log ℱ(·))′(−iz)) of the logarithm of the Fourier transform for probability distributions, and its free analogue constructed by D. Voiculescu ([16, 17]). The classical case corresponds to q=1, and the free one to q=0. We describe these interpolated transforms: (a) in terms of partitions of finite sets, and their crossings; (b) in terms of weighted shifts; (c) by a matrix equation related to the method of Stieltjes for expanding continued J-fractions as power series. The main result of the paper is that all these descriptions, which extend basic approaches used for q=0 and/or q=1, remain equivalent for arbitrary q ∈ [0,1]. We discuss a couple of basic properties of the convolution laws (for probability distributions) which are linearized by the considered family of transforms (these convolution laws interpolate between the usual convolution at q=1 and the free convolution introduced by Voiculescu at q=0). In particular, we note that description (c) mentioned above gives an insight into why the central limit law for the interpolated convolution has to do with the q-continuous Hermite orthogonal polynomials.

  4. Understanding the distinctively skewed and heavy tailed character of atmospheric and oceanic probability distributions

    Science.gov (United States)

    Sardeshmukh, Prashant D.; Penland, Cécile

    2015-03-01

    The probability distributions of large-scale atmospheric and oceanic variables are generally skewed and heavy-tailed. We argue that their distinctive departures from Gaussianity arise fundamentally from the fact that in a quadratically nonlinear system with a quadratic invariant, the coupling coefficients between system components are not constant but depend linearly on the system state in a distinctive way. In particular, the skewness arises from a tendency of the system trajectory to linger near states of weak coupling. We show that the salient features of the observed non-Gaussianity can be captured in the simplest such nonlinear 2-component system. If the system is stochastically forced and linearly damped, with one component damped much more strongly than the other, then the strongly damped fast component becomes effectively decoupled from the weakly damped slow component, and its impact on the slow component can be approximated as a stochastic noise forcing plus an augmented nonlinear damping. In the limit of large time-scale separation, the nonlinear augmentation of the damping becomes small, and the noise forcing can be approximated as an additive noise plus a correlated additive and multiplicative noise (CAM noise) forcing. Much of the diversity of observed large-scale atmospheric and oceanic probability distributions can be interpreted in this minimal framework.
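The mechanism described in this abstract can be caricatured with a single damped component driven by correlated additive and multiplicative noise. The sketch below is a minimal one-dimensional illustration of how CAM noise produces skewed, heavy-tailed PDFs, not the authors' two-component system; parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
# Linear damping plus correlated additive and multiplicative (CAM) noise:
# dx = -lam*x dt + (g + E*x) dW, integrated by Euler-Maruyama.
lam, E, g = 1.0, 0.5, 1.0
dt, n = 0.05, 400_000
dW = rng.normal(0.0, np.sqrt(dt), n)
x = np.empty(n)
xi = 0.0
for i in range(n):
    xi += -lam * xi * dt + (g + E * xi) * dW[i]
    x[i] = xi
x = x[5000:]                              # discard the transient
z = (x - x.mean()) / x.std()
assert np.mean(z ** 3) > 0.3              # multiplicative noise with E > 0 skews the PDF
assert np.mean(z ** 4) > 3.5              # and makes it heavy-tailed relative to a Gaussian
```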

  5. Learning a Flexible K-Dependence Bayesian Classifier from the Chain Rule of Joint Probability Distribution

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions once proposed and widely used for a Bayesian network are not appropriate for a Bayesian classifier, in which the class variable C is considered a distinguished one. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of the joint probability distribution. By establishing the mapping relationship between conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of the Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically, rather than rigidly. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.
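The mutual-information quantity at the core of such scoring functions can be estimated directly from counts. A toy sketch (plain pairwise MI from paired samples, not the paper's Sum_MI function):

```python
import math
from collections import Counter

def mutual_information(pairs):
    # Empirical I(X; Y) in nats from a list of (x, y) samples:
    # sum over (x, y) of p(x, y) * log(p(x, y) / (p(x) p(y)))
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Strongly dependent vs. exactly independent toy attribute pairs
dependent = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
independent = [(x, y) for x in (0, 1) for y in (0, 1) for _ in range(25)]
assert mutual_information(independent) < 1e-12
assert mutual_information(dependent) > 0.15
```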

  6. Probability distribution of primordial angular momentum and formation of massive black holes

    CERN Document Server

    Susa, Hajime; Sasaki, Misao; Tanaka, Takahiro

    1994-01-01

    We consider the joint probability distribution function for the mass contrast and angular momentum of over-density regions on the proto-galactic scale and investigate the formation of massive black holes at redshift z ≳ 10. We estimate the growth rate of the angular momentum by linear perturbation theory and the decay rate by the Compton drag, and apply the Press-Schechter theory to obtain the formation rate of massive black holes, assuming full reionization of the universe at z = z_ion ≫ 10. We find the correlation between the mass contrast and angular momentum vanishes in the linear theory. However, application of the Press-Schechter theory introduces a correlation between the mass contrast and angular momentum of bound objects. Using the probability distribution thus obtained, we calculate the mass fraction of black holes with M ~ 10^6-10^8 M_⊙ in the universe. We find that it crucially depends on the reionization epoch z_ion. Specifically, for the standard CDM power spectrum with ...

  7. Stress distributions in peri-miniscrew areas from cylindrical and tapered miniscrews inserted at different angles

    Science.gov (United States)

    Choi, Sung-Hwan; Kim, Seong-Jin; Lee, Kee-Joon; Sung, Sang-Jin; Chun, Youn-Sic

    2016-01-01

    Objective The purpose of this study was to analyze stress distributions in the roots, periodontal ligaments (PDLs), and bones around cylindrical and tapered miniscrews inserted at different angles using a finite element analysis. Methods We created a three-dimensional (3D) maxilla model of a dentition with extracted first premolars and used 2 types of miniscrews (tapered and cylindrical) with 1.45-mm diameters and 8-mm lengths. The miniscrews were inserted at 30°, 60°, and 90° angles with respect to the bone surface. A simulated horizontal orthodontic force of 2 N was applied to the miniscrew heads. Then, the stress distributions, magnitudes during miniscrew placement, and force applications were analyzed with a 3D finite element analysis. Results Stresses were primarily absorbed by cortical bone. Moreover, very little stress was transmitted to the roots, PDLs, and cancellous bone. During cylindrical miniscrew insertion, the maximum von Mises stress increased as insertion angle decreased. Tapered miniscrews exhibited greater maximum von Mises stress than cylindrical miniscrews. During force application, maximum von Mises stresses increased in both groups as insertion angles decreased. Conclusions For both cylindrical and tapered miniscrew designs, placement as perpendicular to the bone surface as possible is recommended to reduce stress in the surrounding bone. PMID:27478796

  8. Probability distribution of the index in gauge theory on 2d non-commutative geometry

    Science.gov (United States)

    Aoki, Hajime; Nishimura, Jun; Susaki, Yoshiaki

    2007-10-01

    We investigate the effects of non-commutative geometry on the topological aspects of gauge theory using a non-perturbative formulation based on the twisted reduced model. The configuration space is decomposed into topological sectors labeled by the index ν of the overlap Dirac operator satisfying the Ginsparg-Wilson relation. We study the probability distribution of ν by Monte Carlo simulation of the U(1) gauge theory on 2d non-commutative space with periodic boundary conditions. In general the distribution is asymmetric under ν ↦ −ν, reflecting the parity violation due to non-commutative geometry. In the continuum and infinite-volume limits, however, the distribution turns out to be dominated by the topologically trivial sector. This conclusion is consistent with the instanton calculus in the continuum theory. However, it is in striking contrast to the known results in the commutative case obtained from lattice simulation, where the distribution is Gaussian in a finite volume, but the width diverges in the infinite-volume limit. We also calculate the average action in each topological sector, and provide deeper understanding of the observed phenomenon.

  9. Difficulties arising from the representation of the measurand by a probability distribution

    International Nuclear Information System (INIS)

    This paper identifies difficulties associated with the concept of representing fixed unknown quantities by probability distributions. This concept, which we refer to as the distributed-measurand concept, is at the heart of the approach to the evaluation of measurement uncertainty described in Supplement 1 to the Guide to the Expression of Uncertainty in Measurement. The paper notes (i) the resulting lack of invariance of measurement results to nonlinear reparametrizations of the measurement problem, (ii) the potential undetected divergence of measurement estimates obtained by Monte Carlo evaluation, (iii) the potential failure of the methodology to give uncertainty intervals enclosing the values of the measurands with an acceptable frequency and (iv) the potential loss of measurement precision. The distributed-measurand concept is gaining popularity partly because of its association with analysis using the Monte Carlo principle. However, the Monte Carlo principle is also applicable without adopting the distributed-measurand concept. Accordingly, an alternative approach to the evaluation of measurement uncertainty is briefly described
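Point (i), the lack of invariance under nonlinear reparametrization, has a one-line numerical illustration: the mean of a distributed measurand does not commute with a nonlinear function of it. The numbers below are arbitrary, chosen only to make the Jensen gap visible:

```python
import numpy as np

rng = np.random.default_rng(5)
# Represent a measurand R by a probability distribution, then estimate the
# reparametrized quantity 1/R: averaging and reparametrizing do not commute.
R = rng.normal(10.0, 1.0, 1_000_000)   # distribution assigned to the measurand
est_R = R.mean()
est_invR = (1.0 / R).mean()
# E[1/R] != 1/E[R] for a nonlinear reparametrization (Jensen's inequality),
# so the two routes to an estimate of 1/R disagree.
assert est_invR > 1.0 / est_R
assert abs(est_invR - 1.0 / est_R) > 5e-4
```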

  10. Liquid-crystal variable-focus lenses with spatially distributed tilt angles.

    Science.gov (United States)

    Honma, Michinori; Nose, Toshiaki; Yanase, Satoshi; Yamaguchi, Rumiko; Sato, Susumu

    2009-06-22

    A pretilt angle controlling method based on the density of rubbings made with a tiny stylus is proposed. Control of the surface pretilt angle is achieved by rubbing a side-chain-type polyimide film for homeotropic alignment. A smooth liquid crystal (LC) director distribution in the bulk layer is successfully obtained despite the rough surface orientation. This approach is applied to LC cylindrical and rectangular lenses with a variable-focusing function. The distribution profile of the rubbing pitch (the reciprocal of the rubbing density) for small aberration is determined to be quadratic. The variable-focusing function is successfully achieved in the LC rectangular lens, and an explanation of the voltage dependence of the focal length is attempted in terms of the LC molecular reorientation behavior. PMID:19550499

  11. Microwave field distribution in a magic angle spinning dynamic nuclear polarization NMR probe

    OpenAIRE

    Nanni, Emilio A.; Barnes, Alexander B.; Matsuki, Yoh; Woskov, Paul P.; Corzilius, Björn; Griffin, Robert G.; Temkin, Richard J.

    2011-01-01

    We present a calculation of the microwave field distribution in a magic angle spinning (MAS) probe utilized in dynamic nuclear polarization (DNP) experiments. The microwave magnetic field (B_{1S}) profile was obtained from simulations performed with the High Frequency Structure Simulator (HFSS) software suite, using a model that includes the launching antenna, the outer Kel-F stator housing coated with Ag, the RF coil, and the 4 mm diameter sapphire rotor containing the sample. The p...

  12. SIMULATING THE EFFECTS OF INITIAL PITCH-ANGLE DISTRIBUTIONS ON SOLAR FLARES

    International Nuclear Information System (INIS)

    In this work, we model both the thermal and non-thermal components of solar flares. The model we use, HYLOOP, combines a hydrodynamic equation solver with a non-thermal particle tracking code to simulate the thermal and non-thermal dynamics and emission of solar flares. In order to test the effects of pitch-angle distribution on flare dynamics and emission, a series of flares is simulated with non-thermal electron beams injected at the loop apex. The pitch-angle distribution of each beam is described by a single parameter and allowed to vary from flare to flare. We use the results of these simulations to generate synthetic hard and soft X-ray emissions (HXR and SXR). The light curves of the flares in Hinode's X-ray Telescope passbands show a distinct signal that is highly dependent on pitch-angle distribution. The simulated HXR emission in the 3-6 keV bandpass shows the formation and evolution of emission sources that correspond well to the observations of pre-impulsive flares. This ability to test theoretical models of thermal and non-thermal flare dynamics directly with observations allows for the investigation of a wide range of physical processes governing the evolution of solar flares. We find that the initial pitch-angle distribution of non-thermal particle populations has a profound effect on loop top HXR and SXR emission and that apparent motion of HXR sources is a natural consequence of non-thermal particle evolution in a magnetic trap.

  13. The probability distribution functions of emission line flux measurements and their ratios

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2016-04-01

    Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking this effect into account, we derive an improved estimate of the intrinsic 5007/4959 ratio. We obtain a value of 3.012±0.008, which is slightly but statistically significantly higher than the theoretical value of 2.98. We further investigate the suggestion that fluxes measured from emission lines at low signal to noise are strongly biased upwards. We were unable to detect this effect in the SDSS line flux measurements, and we could not reproduce the results of Rola and Pelat who first described this bias. We suggest that the magnitude of this effect may depend strongly on the specific fitting algorithm used.
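    The skew of the ratio of two normally distributed quantities is easy to reproduce numerically. The sketch below uses hypothetical flux values and uncertainties (not the SDSS data) and a plain Monte Carlo draw; the denominator carries the larger relative uncertainty, where the effect is strongest:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical line fluxes with Gaussian uncertainties; the denominator B
# has the larger relative uncertainty, so the ratio A/B is visibly skewed.
A = rng.normal(3.0, 0.1, 500_000)
B = rng.normal(1.0, 0.2, 500_000)
r = A / B

mean_r = float(r.mean())
skew_r = float(np.mean((r - mean_r) ** 3) / r.std() ** 3)

# The ratio distribution is right-skewed: its mean exceeds the naive
# value 3.0/1.0, and the modal value lies below the mean.
print(f"mean = {mean_r:.3f}, skewness = {skew_r:.2f}")
```

    With these toy numbers the sample mean comes out noticeably above 3.0 and the skewness is clearly positive, mirroring the non-Gaussian behaviour described in the abstract.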

  14. The probability distribution functions of emission line flux measurements and their ratios

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2016-07-01

    Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12 126 spectra from the Sloan Digital Sky Survey (SDSS). The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking this effect into account, we derive an improved estimate of the intrinsic 5007/4959 ratio. We obtain a value of 3.012 ± 0.008, which is slightly but statistically significantly higher than the theoretical value of 2.98. We further investigate the suggestion that fluxes measured from emission lines in noisy spectra are strongly biased upwards. We were unable to detect this effect in the SDSS line flux measurements, and we could not reproduce the results of Rola and Pelat who first described this bias. We suggest that the magnitude of this effect may depend strongly on the specific fitting algorithm used.

  15. Pitch angle distributions of energetic ions in the lobes of the distant geomagnetic tail

    International Nuclear Information System (INIS)

    Analysis of energetic (> 35 keV) ion data from the ISEE-3 spacecraft obtained during 1982-1983, when the spacecraft made a series of traversals of the distant geomagnetic tail (X_GSE > -230 R_E), indicates that the pitch angle distribution of energetic ions in the distant tail lobes is usually highly anisotropic, being peaked closely perpendicular to the magnetic field direction, but with a small net flow in the antisunward direction. In this paper we present a model, based on the motion of single particles into and within the tail lobes, which accounts for these observed distributions. This model assumes that the lobe ions originate in the magnetosheath, where the energetic ion population consists of two components; a spatially uniform ''solar'' population, and a population of ''terrestrial'' origin, which decreases in strength with downtail distance. The pitch angle distribution at any point within the lobe may be constructed, assuming that the value of the distribution function along the particle trajectory is conserved. In general, those ions with a large field-aligned component to their motion enter the lobes in the deep tail, where the ''terrestrial'' source is weak, whilst those moving closely perpendicular to the field enter the lobes at positions much closer to the Earth, where the source is strong. The fluxes of these latter ions are therefore much enhanced above the rest of the pitch angle distribution, and are shown to account for the form of the observed distributions. The model also accounts for the more isotropic ion population observed in the lobe during solar particle events, when the ''terrestrial'' component of the magnetosheath source may be considered negligible in comparison to the enhanced ''solar'' component. (author)

  16. Thickness distribution of multi-stage incremental forming with different forming stages and angle intervals

    Institute of Scientific and Technical Information of China (English)

    李军超; 杨芬芬; 周志强

    2015-01-01

    Although multi-stage incremental sheet forming has always been adopted instead of single-stage forming to form parts with a steep wall angle or to achieve a high forming performance, it is largely dependent on empirical designs. In order to research multi-stage forming further, the effect of the number of forming stages (n) and the angle interval between two adjacent stages (Δα) on thickness distribution was investigated. Firstly, a finite element method (FEM) model of multi-stage incremental forming was established and experimentally verified. Then, based on the proposed simulation model, different strategies were adopted to form a frustum of cone with wall angle of 30° to research the thickness distribution of multi-pass forming. It is proved that the minimum thickness increases considerably and the variance of sheet thickness decreases significantly as the value of n grows. Further, with the increase of Δα, the minimum thickness increases initially and then decreases, and the optimal thickness distribution is achieved with Δα of 10°. Additionally, a formula is deduced to estimate the sheet thickness after multi-stage forming and proved to be effective. The simulation results fit well with the experimental results.
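    The abstract's deduced formula is not reproduced in the record, but the direction of the effect can be illustrated with the standard single-stage sine law, t = t0·sin(90° − α) = t0·cos(α), plus a hypothetical multi-stage estimate (an assumption for illustration only) that applies the sine law to each stage's angle increment:

```python
import math

# Single-stage sine law: t = t0 * sin(90 deg - alpha) = t0 * cos(alpha).
def sine_law_thickness(t0, wall_angle_deg):
    return t0 * math.cos(math.radians(wall_angle_deg))

# Hypothetical multi-stage estimate (an illustrative assumption, not the
# formula deduced in the paper): apply the sine law to the angle
# increment of each successive stage.
def multistage_thickness(t0, stage_angles_deg):
    t, prev = t0, 0.0
    for a in stage_angles_deg:
        t *= math.cos(math.radians(a - prev))
        prev = a
    return t

t0 = 1.0  # initial sheet thickness, mm
single = sine_law_thickness(t0, 30.0)                  # one stage to 30 deg
staged = multistage_thickness(t0, [10.0, 20.0, 30.0])  # three stages, delta-alpha = 10 deg
print(single, staged)
```

    Under this toy estimate the staged schedule retains more thickness than the single-stage sine law, consistent with the abstract's finding that the minimum thickness increases with the number of stages.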

  17. Detection of two power-law tails in the probability distribution functions of massive GMCs

    CERN Document Server

    Schneider, N; Girichidis, P; Rayner, T; Motte, F; Andre, P; Russeil, D; Abergel, A; Anderson, L; Arzoumanian, D; Benedettini, M; Csengeri, T; Didelon, P; Francesco, J D; Griffin, M; Hill, T; Klessen, R S; Ossenkopf, V; Pezzuto, S; Rivera-Ingraham, A; Spinoglio, L; Tremblin, P; Zavagno, A

    2015-01-01

    We report the novel detection of complex high-column density tails in the probability distribution functions (PDFs) for three high-mass star-forming regions (CepOB3, MonR2, NGC6334), obtained from dust emission observed with Herschel. The low column density range can be fit with a lognormal distribution. A first power-law tail starts above an extinction (Av) of ~6-14. It has a slope of alpha=1.3-2 for the rho~r^-alpha profile for an equivalent density distribution (spherical or cylindrical geometry), and is thus consistent with free-fall gravitational collapse. Above Av~40, 60, and 140, we detect an excess that can be fitted by a flatter power law tail with alpha>2. It correlates with the central regions of the cloud (ridges/hubs) of size ~1 pc and densities above 10^4 cm^-3. This excess may be caused by physical processes that slow down collapse and reduce the flow of mass towards higher densities. Possible causes are: 1. rotation, which introduces an angular momentum barrier, 2. increasing optical depth and weaker...

  18. Information theoretic measures of dependence, compactness, and non-gaussianity for multivariate probability distributions

    Science.gov (United States)

    Monahan, A. H.; Delsole, T.

    2009-02-01

    A basic task of exploratory data analysis is the characterisation of "structure" in multivariate datasets. For bivariate Gaussian distributions, natural measures of dependence (the predictive relationship between individual variables) and compactness (the degree of concentration of the probability density function (pdf) around a low-dimensional axis) are respectively provided by ordinary least-squares regression and Principal Component Analysis. This study considers general measures of structure for non-Gaussian distributions and demonstrates that these can be defined in terms of the information theoretic "distance" (as measured by relative entropy) between the given pdf and an appropriate "unstructured" pdf. The measure of dependence, mutual information, is well-known; it is shown that this is not a useful measure of compactness because it is not invariant under an orthogonal rotation of the variables. An appropriate rotationally invariant compactness measure is defined and shown to reduce to the equivalent PCA measure for bivariate Gaussian distributions. This compactness measure is shown to be naturally related to a standard information theoretic measure of non-Gaussianity. Finally, straightforward geometric interpretations of each of these measures in terms of "effective volume" of the pdf are presented.

  19. Diffraction in time in terms of Wigner distributions and tomographic probabilities

    CERN Document Server

    Man'ko, V I; Sharma, A; Man'ko, Vladimir; Moshinsky, Marcos; Sharma, Anju

    1999-01-01

    Long ago, a discussion appeared in quantum mechanics of the problem of opening a completely absorbing shutter on which a stream of particles of definite velocity was impinging. The solution of the problem was obtained in a form entirely analogous to the optical one of diffraction by a straight edge. The argument of the Fresnel integrals was, however, time dependent, hence the first part of the title of this article. In section 1 we briefly review the original formulation of the problem of diffraction in time. In sections 2 and 3 we reformulate this problem in terms of Wigner distributions and tomographic probabilities, respectively. In the former case the probability in phase space is very simple but, as it takes positive and negative values, its interpretation is ambiguous; nevertheless, it gives a classical limit that agrees entirely with our intuition. In the latter case we can start with our initial conditions in a given reference frame but obtain our final solution in an arbitrary frame of reference.

  20. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitutes a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model. So the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. Then an improved algorithm of cloud probability distribution density based on a backward cloud generator was proposed. This was used to effectively convert certain parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such qualitative description was expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm was feasible; through it, the changing regularity of the piezometric tube's water level could be revealed, and damage due to seepage in the dam body could be detected.
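    A commonly cited form of the backward cloud generator estimates {Ex, En, He} from raw samples via the mean, the first absolute central moment, and the sample variance. This is a sketch of that generic algorithm, not the improved version proposed in the paper:

```python
import math
import random

def backward_cloud(samples):
    """Generic backward cloud generator: estimate the cloud numerical
    characteristics {Ex, En, He} from a batch of cloud drops."""
    n = len(samples)
    ex = sum(samples) / n                                   # expectation Ex
    # entropy En from the first absolute central moment
    en = math.sqrt(math.pi / 2.0) * sum(abs(x - ex) for x in samples) / n
    s2 = sum((x - ex) ** 2 for x in samples) / (n - 1)      # sample variance
    he2 = s2 - en ** 2                                      # hyper-entropy He
    he = math.sqrt(he2) if he2 > 0.0 else 0.0
    return ex, en, he

# Synthetic drops from a plain Gaussian cloud (Ex = 0, En = 2, He = 0):
rng = random.Random(1)
ex, en, he = backward_cloud([rng.gauss(0.0, 2.0) for _ in range(20_000)])
print(ex, en, he)
```

    For a pure Gaussian sample the estimator recovers Ex near 0, En near 2, and a hyper-entropy He near 0, as expected.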

  1. A New Class of Probability Distributions for Describing the Spatial Statistics of Area-averaged Rainfall

    CERN Document Server

    Kundu, Prasun K

    2015-01-01

    Rainfall exhibits extreme variability at many space and time scales and calls for a statistical description. Based on an analysis of radar measurements of precipitation over the tropical oceans, we introduce a new probability law for the area-averaged rain rate constructed from the class of log-infinitely divisible distributions that accurately describes the frequency of the most intense rain events. The dependence of its parameters on the spatial averaging length L allows one to relate spatial statistics at different scales. In particular, it enables us to explain the observed power law scaling of the moments of the data and successfully predicts the continuous spectrum of scaling exponents expressing multiscaling characteristics of the rain intensity field.

  2. A unified optical damage criterion based on the probability density distribution of detector signals

    Science.gov (United States)

    Somoskoi, T.; Vass, Cs.; Mero, M.; Mingesz, R.; Bozoki, Z.; Osvay, K.

    2013-11-01

    Various methods and procedures have been developed so far to test laser induced optical damage. The question naturally arises of what the respective sensitivities of these diverse methods are. To make a suitable comparison, the processing of the measured primary signal has to be at least similar across the various methods, and one needs to establish a proper damage criterion that is universally applicable to every method. We defined damage criteria based on the probability density distribution of the obtained detector signals. This was determined by the kernel density estimation procedure. We have tested the entire evaluation procedure in four well-known detection techniques: direct observation of the sample by optical microscopy; monitoring of the change in the light scattering power of the target surface; and the detection of the generated photoacoustic waves, both in the bulk of the sample and in the surrounding air.
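    The core step, estimating the probability density of detector signals with a kernel density estimate and placing a threshold at the minimum between the "intact" and "damaged" modes, can be sketched as follows. The signal values and the threshold rule are illustrative assumptions, not the paper's exact estimator or criterion:

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth=None):
    """Minimal Gaussian kernel density estimate."""
    samples = np.asarray(samples, dtype=float)
    if bandwidth is None:
        # Silverman's rule of thumb
        bandwidth = 1.06 * samples.std() * len(samples) ** (-0.2)
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
# Hypothetical detector signals: an intact population plus a smaller
# damaged population at higher signal level, giving a bimodal density.
signals = np.concatenate([rng.normal(1.0, 0.1, 800), rng.normal(2.0, 0.2, 200)])
grid = np.linspace(0.0, 3.0, 301)
density = gaussian_kde(signals, grid)

# A simple criterion: call a shot "damage" if its signal lies beyond the
# density minimum separating the two modes.
window = slice(100, 201)  # search the valley between signal 1.0 and 2.0
threshold = grid[window][np.argmin(density[window])]
print(f"damage threshold ~ {threshold:.2f}")
```

    The estimated density integrates to unity and the valley between the two modes gives a natural, method-independent cut between intact and damaged shots.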

  3. Irreversible models with Boltzmann–Gibbs probability distribution and entropy production

    International Nuclear Information System (INIS)

    We analyze irreversible interacting spin models evolving according to a master equation with spin flip transition rates that do not obey detailed balance but obey global balance with a Boltzmann–Gibbs probability distribution. Spin flip transition rates with up–down symmetry are obtained for a linear chain, a square lattice, and a cubic lattice with a stationary state corresponding to the Ising model with nearest neighbor interactions. We show that these irreversible dynamics describes the contact of the system with particle reservoirs that cause a flux of particles through the system. Using a microscopic definition, we determine the entropy production rate of these irreversible models and show that it can be written as a macroscopic bilinear form in the forces and fluxes. Exact expressions for this property are obtained for the linear chain and the square lattice. In this last case the entropy production rate displays a singularity at the phase transition point of the same type as the entropy itself

  4. The Density Probability Distribution in Compressible Isothermal Turbulence: Solenoidal vs Compressive Forcing

    CERN Document Server

    Federrath, Christoph; Schmidt, Wolfram

    2008-01-01

    The probability density function (PDF) of the gas density in turbulent supersonic flows is investigated with high-resolution numerical simulations. In a systematic study, we compare the density statistics of compressible turbulence driven by the usually adopted solenoidal forcing (divergence-free) and by compressive forcing (curl-free). Our results are in agreement with studies using solenoidal forcing. However, compressive forcing yields a significantly broader density distribution with standard deviation ~3 times larger at the same rms Mach number. The standard deviation-Mach number relation used in analytical models of star formation is reviewed and a modification of the existing expression is proposed, which takes into account the ratio of solenoidal and compressive modes of the turbulence forcing.
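    The standard deviation-Mach number relation reviewed in the abstract is commonly written as sigma_rho/rho_0 = b·M for the linear density contrast, and sigma_s² = ln(1 + b²M²) for the logarithmic density s = ln(rho/rho_0), with b ≈ 1/3 for solenoidal and b ≈ 1 for compressive forcing. These are the widely used pre-existing relations; the paper's proposed modification is not reproduced here:

```python
import math

# sigma_rho/rho_0 = b * M (linear density contrast);
# sigma_s^2 = ln(1 + b^2 M^2) for s = ln(rho/rho_0);
# b ~ 1/3 for solenoidal forcing, b ~ 1 for compressive forcing.
def sigma_linear(mach, b):
    return b * mach

def sigma_log(mach, b):
    return math.sqrt(math.log(1.0 + b**2 * mach**2))

M = 5.0
ratio = sigma_linear(M, 1.0) / sigma_linear(M, 1.0 / 3.0)
print(ratio)  # compressive/solenoidal width ratio at fixed Mach number
print(sigma_log(M, 1.0 / 3.0), sigma_log(M, 1.0))
```

    At fixed Mach number the linear-density width for compressive forcing is three times the solenoidal one, consistent with the ~3x broader distribution reported in the abstract.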

  5. Generating function for particle-number probability distribution in directed percolation

    International Nuclear Information System (INIS)

    We derive a generic expression for the generating function (GF) of the particle-number probability distribution (PNPD) for a simple reaction diffusion model that belongs to the directed percolation universality class. Starting with a single particle on a lattice, we show that the GF of the PNPD can be written as an infinite series of cumulants taken at zero momentum. This series can be summed up into a complete form at the level of a mean-field approximation. Using the renormalization group techniques, we determine logarithmic corrections for the GF at the upper critical dimension. We also find the critical scaling form for the PNPD and check its universality numerically in one dimension. The critical scaling function is found to be universal up to two non-universal metric factors

  6. On the reliability of observational measurements of column density probability distribution functions

    CERN Document Server

    Ossenkopf, Volker; Schneider, Nicola; Federrath, Christoph; Klessen, Ralf S

    2016-01-01

    Probability distribution functions (PDFs) of column densities are an established tool to characterize the evolutionary state of interstellar clouds. Using simulations, we show to what degree their determination is affected by noise, line-of-sight contamination, field selection, and the incomplete sampling in interferometric measurements. We solve the integrals that describe the convolution of a cloud PDF with contaminating sources and study the impact of missing information on the measured column density PDF. The effect of observational noise can be easily estimated and corrected for if the root mean square (rms) of the noise is known. For $\sigma_{noise}$ values below 40% of the typical cloud column density, $N_{peak}$, this involves almost no degradation of the accuracy of the PDF parameters. For higher noise levels and narrow cloud PDFs the width of the PDF becomes increasingly uncertain. A contamination by turbulent foreground or background clouds can be removed as a constant shield if the PDF of the c...

  7. Sampling the probability distribution of Type Ia Supernova lightcurve parameters in cosmological analysis

    Science.gov (United States)

    Dai, Mi; Wang, Yun

    2016-06-01

    In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (pdf) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the 'joint lightcurve analysis' (JLA) data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter pdf's leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fitting values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.
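    The basic ingredient, drawing samples from a parameter pdf with MCMC rather than keeping only the best-fitting value, can be shown with a minimal random-walk Metropolis sampler. The Gaussian target below is a toy stand-in for a lightcurve-parameter pdf, not the JLA pipeline:

```python
import math
import random

def metropolis(logpdf, x0, n, step=0.5, seed=1):
    """Minimal random-walk Metropolis sampler for a 1-D log-density."""
    rng = random.Random(seed)
    x, lp = x0, logpdf(x0)
    chain = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lp_prop = logpdf(prop)
        # accept with probability min(1, pdf(prop) / pdf(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy target: a Gaussian "lightcurve parameter" pdf with mean 1.2, sd 0.3.
chain = metropolis(lambda x: -0.5 * ((x - 1.2) / 0.3) ** 2, 0.0, 50_000)
post_burn = chain[5_000:]  # discard burn-in
mean = sum(post_burn) / len(post_burn)
print(mean)
```

    The post-burn-in chain mean recovers the target mean; in a cosmological fit the whole chain, rather than the single best-fitting point, propagates the parameter uncertainty downstream.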

  8. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    王清远; N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. the pit nucleation and growth, pit-crack transition, short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.
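    The Monte Carlo idea, propagating random initial pit sizes and pitting-current factors through a growth law to get a life distribution, can be sketched with an illustrative-only growth model (the da/dt law, parameter values, and lognormal scatter below are assumptions, not the paper's model):

```python
import math
import random

# Illustrative growth law: pit depth grows as da/dt = C / a^2, so the
# time for a pit to grow from a0 to the critical size ac is
# t = (ac^3 - a0^3) / (3 C).
def pit_life(a0, c, ac=100.0):
    return (ac**3 - a0**3) / (3.0 * c)

rng = random.Random(3)
lives = []
for _ in range(20_000):
    a0 = rng.lognormvariate(math.log(5.0), 0.4)   # initial pit size (um)
    c = rng.lognormvariate(math.log(50.0), 0.6)   # pitting-current factor
    lives.append(pit_life(a0, c))

# Empirical failure-life distribution from the Monte Carlo sample.
lives.sort()
median_life = lives[len(lives) // 2]
life_10pct = lives[len(lives) // 10]
print(median_life, life_10pct)
```

    Sorting the simulated lives gives the empirical cumulative distribution function directly; the 10th percentile is the kind of low-probability-of-failure life a design would be checked against.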

  9. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions

    Science.gov (United States)

    Wenger, S.J.; Freeman, Mary C.

    2008-01-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data than other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
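    The zero-inflated abundance idea combines a structural-zero probability with a count distribution, and imperfect detection acts as binomial thinning on the counts. The parameter values below are hypothetical; this is a sketch of the modeling idea, not the authors' full occupancy-abundance model:

```python
import math

def zip_pmf(k, lam, pi0):
    """Zero-inflated Poisson pmf: with probability pi0 the site is
    unoccupied (a structural zero); otherwise abundance ~ Poisson(lam)."""
    pois = math.exp(-lam) * lam**k / math.factorial(k)
    return (pi0 if k == 0 else 0.0) + (1.0 - pi0) * pois

# With detection probability p, binomial thinning of a Poisson(lam)
# abundance yields Poisson(lam * p) observed counts, so observed counts
# remain zero-inflated Poisson with a rescaled rate.
lam, pi0, p_det = 4.0, 0.3, 0.5
pr_zero_observed = zip_pmf(0, lam * p_det, pi0)
print(pr_zero_observed)
```

    The observed zero probability mixes true absences (pi0) with occupied sites where nothing was counted, which is exactly the confounding that replicated sampling lets the full model untangle.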

  10. GENERALIZED FATIGUE CONSTANT LIFE CURVE AND TWO-DIMENSIONAL PROBABILITY DISTRIBUTION OF FATIGUE LIMIT

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 武哲; 高镇同

    2002-01-01

    According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed.Then, on the basis of the optimization method of the correlation coefficient, the parameter estimation formulas were induced and the generalized fatigue constant life curve with the reliability level p was given.From P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived.After then, three set of tests of LY11 CZ corresponding to the different average stress were carried out in terms of the two-dimensional up-down method.Finally, the methods are used to analyze the test results, and it is found that the analyzedresults with the high precision may be obtained.

  11. Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal tree and shrub species?

    Science.gov (United States)

    Pisek, J.

    2012-12-01

    Directional distribution of leaves is one primary parameter for determining the radiation transmission through the canopy. When inverting canopy transmittance measurements for estimating the leaf area index or foliage clumping, incorrect assumptions on leaf angles may lead to considerable errors. Often a spherical distribution of leaf normals is assumed, i.e. leaf normals are assumed to have no preferred direction, in situations where no measurement data are available. The goal of this study is to examine whether a spherical leaf angle distribution and the resulting isotropic G-function (G≡0.5) is indeed a valid assumption for temperate and boreal tree and shrub species. Leaf angle distributions were measured for over 80 deciduous broadleaf species commonly found in temperate and boreal ecoclimatic regions. The leaf inclination angles were obtained by sampling the complete vertical extent of trees and shrubs using a recently introduced technique based on digital photography. It is found that a spherical leaf angle distribution is not a valid assumption for either tree or shrub species in temperate and boreal ecoclimatic regions. Given the influence of the leaf angle distribution on clumping and LAI estimates inverted from canopy transmittance measurements, it is recommended to use a planophile or plagiophile leaf angle distribution as more appropriate for modeling radiation transmission in temperate and boreal ecoclimatic regions when no actual leaf inclination angle measurements are available.
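    The benchmark property being tested, that the spherical leaf angle distribution g(theta_l) = sin(theta_l) gives G ≡ 0.5 at every view angle, can be verified numerically with the standard projection kernel of canopy radiative transfer (a generic textbook formulation, not the paper's measurement method):

```python
import math

def proj_kernel(theta_v, theta_l):
    """Standard projection kernel A(theta_v, theta_l) for view zenith
    angle theta_v and leaf inclination angle theta_l."""
    if theta_l == 0.0:
        return math.cos(theta_v)
    c = 1.0 / (math.tan(theta_v) * math.tan(theta_l))
    if abs(c) >= 1.0:
        return math.cos(theta_v) * math.cos(theta_l)
    x = math.acos(c)
    return math.cos(theta_v) * math.cos(theta_l) * (
        1.0 + 2.0 * (math.tan(x) - x) / math.pi)

def g_function(theta_v, g, n=2000):
    """G-function: mean leaf projection for inclination density g on
    [0, pi/2], integrated with the midpoint rule."""
    h = (math.pi / 2.0) / n
    return sum(proj_kernel(theta_v, (i + 0.5) * h) * g((i + 0.5) * h) * h
               for i in range(n))

# For the spherical distribution g(theta_l) = sin(theta_l), G = 0.5
# independently of the view angle.
g30 = g_function(math.radians(30.0), math.sin)
g60 = g_function(math.radians(60.0), math.sin)
print(g30, g60)
```

    Replacing `math.sin` with a planophile or plagiophile density makes G vary with view angle, which is why the wrong leaf angle assumption biases LAI and clumping inversions.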

  12. Application of probability distribution functions in the ASTM RBCA framework for use in California

    International Nuclear Information System (INIS)

    Currently, Environmental Protection Agency (EPA, 1989b) and other conventional methodologies of risk assessment, such as the American Society for Testing and Materials--risk-based corrective action (ASTM/RBCA) format, make use of deterministic, or point numbers in making estimates of risk. The goal of risk assessment is to provide a systematic tool to evaluate hazards and exposures to assist in the management of society's activities. To properly do this, there must be an attempt by the regulator or the responsible party to use information as effectively as possible. The use of historical data and probability distribution functions is a suggested initial approach to dealing with LUFT sites in California, taking into account geophysical, societal, and health-based parameters particular to the State. These parameters may be based on results of the CalLUFT HCA, from California Census information, or from other sources, where appropriate. Because of the limitations involved with the use of point estimates in the ASTM/RBCA format, probability distribution functions can be used to give regulatory personnel and risk managers more understanding of the actual range of risks involved. Such information will allow the risk manager a higher comfort level in dealing with risks, and will, by detailing the residual risks involved, allow for the potential consequences of decisions to be better known. The above methodology effectively allows the risk manager to choose a level of health risk appropriate for the site, allows for a general prioritizing in regards to other sites, and removes some of the restrictions in applying remedial action necessitated by MCLs or deterministic risk estimates

  13. Innovative Meta-Heuristic Approach Application for Parameter Estimation of Probability Distribution Model

    Science.gov (United States)

    Lee, T. S.; Yoon, S.; Jeong, C.

    2012-12-01

    The primary purpose of frequency analysis in hydrology is to estimate the magnitude of an event with a given frequency of occurrence. The precision of frequency analysis depends on the selection of an appropriate probability distribution model (PDM) and parameter estimation techniques. A number of PDMs have been developed to describe the probability distribution of the hydrological variables. For each of the developed PDMs, estimated parameters are provided based on alternative estimation techniques, such as the method of moments (MOM), probability weighted moments (PWM), linear function of ranked observations (L-moments), and maximum likelihood (ML). Generally, the results using ML are more reliable than the other methods. However, the ML technique is more laborious than the other methods because an iterative numerical solution, such as the Newton-Raphson method, must be used for the parameter estimation of PDMs. In the meantime, meta-heuristic approaches have been developed to solve various engineering optimization problems (e.g., linear and stochastic, dynamic, nonlinear). These approaches include genetic algorithms, ant colony optimization, simulated annealing, tabu searches, and evolutionary computation methods. Meta-heuristic approaches use a stochastic random search instead of a gradient search so that intricate derivative information is unnecessary. Therefore, the meta-heuristic approaches have been shown to be a useful strategy to solve optimization problems in hydrology. A number of studies focus on using meta-heuristic approaches for estimation of hydrological variables with parameter estimation of PDMs. Applied meta-heuristic approaches offer reliable solutions but use more computation time than derivative-based methods. Therefore, the purpose of this study is to enhance the meta-heuristic approach for the parameter estimation of PDMs by using a recently developed algorithm known as a harmony search (HS). The performance of the HS is compared to the
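    A minimal harmony search can be written in a few lines: candidate "harmonies" are drawn from memory (HMCR), pitch-adjusted (PAR), or sampled at random, and replace the worst memory member when they improve the objective. This is a generic sketch of the meta-heuristic applied to a toy PDM fit (a Gumbel distribution fitted to synthetic data by minimizing the negative log-likelihood), not the study's tuned implementation:

```python
import math
import random

def harmony_search(obj, bounds, iters=4000, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, seed=7):
    """Minimal harmony search minimizer over box bounds."""
    rng = random.Random(seed)
    mem = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [obj(h) for h in mem]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # memory consideration
                v = mem[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    v += rng.uniform(-bw, bw) * (hi - lo)
            else:                                   # random selection
                v = rng.uniform(lo, hi)
            new.append(min(max(v, lo), hi))
        s = obj(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:                       # replace worst harmony
            mem[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return mem[best]

# Toy PDM fit: synthetic Gumbel(mu=10, beta=2) data, fitted by minimizing
# the negative log-likelihood over (mu, beta).
data_rng = random.Random(0)
data = [10.0 - 2.0 * math.log(-math.log(data_rng.random()))
        for _ in range(500)]

def neg_log_likelihood(p):
    mu, beta = p
    if beta <= 0.0:
        return float("inf")
    total = len(data) * math.log(beta)
    for x in data:
        z = (x - mu) / beta
        if z < -700.0:  # exp(-z) would overflow: hopeless parameter set
            return float("inf")
        total += z + math.exp(-z)
    return total

mu_hat, beta_hat = harmony_search(neg_log_likelihood, [(0.0, 20.0), (0.1, 10.0)])
print(mu_hat, beta_hat)
```

    Because the search uses only objective evaluations, no derivatives of the likelihood are needed, which is the practical advantage over Newton-Raphson-style ML estimation noted in the abstract.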

  14. Size effect on strength and lifetime probability distributions of quasibrittle structures

    Indian Academy of Sciences (India)

    Zdeněk P Bažant; Jia-Liang Le

    2012-02-01

    Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for failure probability < $10^{-6}-10^{-7}$ per lifetime. The safety factors required to ensure this are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation energy controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.
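    The Weibull baseline from which the observed histograms are said to deviate can be sketched with a standard Weibull probability plot; the modulus, scale, and sample size below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(9)
m_true, s0_true = 10.0, 400.0  # Weibull modulus and scale in MPa (assumed)
strength = s0_true * rng.weibull(m_true, size=2000)

# Weibull plot: ln(-ln(1-F)) is linear in ln(strength) with slope m,
# so the modulus falls out of a least-squares fit
s = np.sort(strength)
F = (np.arange(1, s.size + 1) - 0.5) / s.size  # plotting positions
y = np.log(-np.log(1.0 - F))
m_est, c = np.polyfit(np.log(s), y, 1)
s0_est = np.exp(-c / m_est)  # intercept c = -m*ln(s0)
```

    Systematic curvature of real strength data away from this straight line is exactly the kind of deviation from the two-parameter Weibull cdf that the quasibrittle theory predicts.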

  15. Applications of statistical mechanics to economics: Entropic origin of the probability distributions of money, income, and energy consumption

    OpenAIRE

    Yakovenko, Victor M.

    2012-01-01

    This Chapter is written for the Festschrift celebrating the 70th birthday of the distinguished economist Duncan Foley from the New School for Social Research in New York. This Chapter reviews applications of statistical physics methods, such as the principle of entropy maximization, to the probability distributions of money, income, and global energy consumption per capita. The exponential probability distribution of wages, predicted by the statistical equilibrium theory of a labor market dev...
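    The entropic mechanism reviewed here can be reproduced with a toy conserved-money exchange simulation in the spirit of this literature; the exchange rule (random repartition of the pooled money of a random pair) and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n_agents, n_steps = 1000, 200_000
money = np.full(n_agents, 100.0)  # everyone starts with the same amount

for _ in range(n_steps):
    i, j = rng.integers(0, n_agents, size=2)
    if i == j:
        continue
    pool = money[i] + money[j]          # total money is conserved
    eps = rng.random()
    money[i], money[j] = eps * pool, (1.0 - eps) * pool

# Boltzmann-Gibbs prediction: P(m) ~ exp(-m/T) with "temperature" T = mean money,
# for which the fraction of agents below the mean is 1 - 1/e ≈ 0.63
T = money.mean()
frac_below_mean = (money < T).mean()
```

    Despite the perfectly equal initial condition, the stationary state is the maximum-entropy exponential distribution rather than a narrow peak.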

  16. Application of probability distributions for quantifying uncertainty in radionuclide source terms for Seabrook risk assessment

    International Nuclear Information System (INIS)

    In some of the recent probabilistic safety assessments, discrete probability distributions (DPDs) have been developed to express, in a quantitative form, estimates of the uncertainty and conservatism in the point estimate source term values. In the DPD approach, probabilities are assigned to discrete factors that multiply the point estimate values; these assignments are made based on available data, calculations, and engineering judgment. Initial application of the DPD approach to source terms for risk analysis was based largely on engineering judgment after review of applicable data. However, in more recent applications of the DPD approach, results from an extensive review of existing experimental data and applied calculations have been factored into the estimates. Programs currently in progress, largely sponsored by NRC and EPRI, are beginning to yield significant new information upon which to base improved estimates of the magnitude of radionuclide source terms. The most extensive of the reviews of existing data for application to the DPD approach was that performed as part of the risk assessment for the proposed Sizewell B PWR. As part of the Seabrook risk study, DPD values specifically for that plant were developed based on the Sizewell approach; they represent a significant update to the Sizewell DPD values. In addition, DPD values were developed for associated release parameters which also affect the consequence calculations.

  17. Various Models for Pion Probability Distributions from Heavy-Ion Collisions

    CERN Document Server

    Mekjian, A Z; Strottman, D D

    1998-01-01

    Various models for pion multiplicity distributions produced in relativistic heavy ion collisions are discussed. The models include a relativistic hydrodynamic model, a thermodynamic description, an emitting source pion laser model, and a description which generates a negative binomial distribution. The approach developed can be used to discuss other cases, which will be mentioned. The pion probability distributions for these various cases are compared. Comparisons of the pion laser model with Bose-Einstein condensation in a laser trap and with the thermal model are made. The thermal model and hydrodynamic model are also used to illustrate why the number of pions never diverges and why the Bose-Einstein correction effects are relatively small. The pion emission strength $\eta$ of a Poisson emitter and a critical density $\eta_c$ are connected in a thermal model by $\eta/\eta_c = e^{-m/T} < 1$, and this fact reduces any Bose-Einstein correction effects in the number and number fluctuation of pions. Fluctuations can...

  18. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which denies the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. It is therefore considered that the new model can exactly reflect the Born perturbation theory. Simulated results prove the accuracy of this new model.

  19. Effect of slope angle of an artificial pool on distributions of turbulence

    Institute of Scientific and Technical Information of China (English)

    Atefeh Fazlollahi; Hossein Afzalimehr; Jueyi Sui

    2015-01-01

    Experiments were carried out over a two-dimensional pool with a constant length of 1.5 m and four different slopes. The distributions of velocity, Reynolds stress and turbulence intensities have been studied in this paper. Results show that as flow continues up the exit slope, the flow velocity increases near the channel bed and decreases near the water surface. Flow separation was not observed by ADV at the crest of the bed-form. In addition, the length of the separation zone increases with increasing entrance and exit slopes. The largest slope angle causes the maximum normalized shear stress. Based on the experiments, it is concluded that the shape of the Reynolds stress distribution is generally dependent on the entrance and exit slopes of the pool and is affected by both decelerating and accelerating flows. Additionally, with the increase in the slope angle, secondary currents develop and become more stable. Results of the quadrant analysis show that the momentum between flow and bed-form is mostly transferred by sweep and ejection events. © 2015 International Research and Training Centre on Erosion and Sedimentation/the World Association for Sedimentation and Erosion Research. Published by Elsevier B.V. All rights reserved.

  20. The Effects of Radiative Transfer on the Probability Distribution Functions of Molecular Magnetohydrodynamic Turbulence

    Science.gov (United States)

    Burkhart, Blakesley; Ossenkopf, V.; Lazarian, A.; Stutzki, J.

    2013-07-01

    We study the effects of radiative transfer on the probability distribution functions (PDFs) of simulations of magnetohydrodynamic turbulence in the widely studied 13CO 2-1 transition. We find that the integrated intensity maps generally follow a log-normal distribution, with the cases that have τ ≈ 1 best matching the PDF of the column density. We fit a two-dimensional variance-sonic Mach number relationship to our logarithmic PDFs of the form $\sigma^2_{\ln(\Sigma/\Sigma_0)} = A \times \ln(1+b^2\mathcal{M}_s^2)$ and find that, for parameter b = 1/3, parameter A depends on the radiative transfer environment. We also explore the variance, skewness, and kurtosis of the linear PDFs finding that higher moments reflect both higher sonic Mach number and lower optical depth. Finally, we apply the Tsallis incremental PDF function and find that the fit parameters depend on both Mach numbers, but also are sensitive to the radiative transfer parameter space, with the τ ≈ 1 case best fitting the incremental PDF of the true column density. We conclude that, for PDFs of low optical depth cases, part of the gas is always subthermally excited so that the spread of the line intensities exceeds the spread of the underlying column densities and hence the PDFs do not reflect the true column density. Similarly, PDFs of optically thick cases are dominated by the velocity dispersion and therefore do not represent the true column density PDF. Thus, in the case of molecules like carbon monoxide, the dynamic range of intensities, structures observed, and, consequently, the observable PDFs are less determined by turbulence and more often determined by radiative transfer effects.
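    The quoted variance-Mach relation can be exercised on a mock lognormal column-density sample; A = 1, the sonic Mach number, and the sample size are illustrative assumptions:

```python
import numpy as np

def sigma2_ln_sigma(mach_s, A=1.0, b=1.0 / 3.0):
    """Variance of ln(Sigma/Sigma_0) from sigma^2 = A*ln(1 + b^2*M_s^2)."""
    return A * np.log1p((b * mach_s) ** 2)

rng = np.random.default_rng(3)
mach_s = 6.0
s2 = sigma2_ln_sigma(mach_s)  # equals ln(1 + 4) for b = 1/3
# lognormal column density normalized so that <Sigma/Sigma_0> = 1,
# which fixes the mean of the log at -s2/2
ln_sigma = rng.normal(-0.5 * s2, np.sqrt(s2), size=200_000)
```

    Fitting the sample variance of such a log field and inverting the relation is the usual route from an observed PDF back to a sonic Mach number estimate.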

  1. THE EFFECTS OF RADIATIVE TRANSFER ON THE PROBABILITY DISTRIBUTION FUNCTIONS OF MOLECULAR MAGNETOHYDRODYNAMIC TURBULENCE

    International Nuclear Information System (INIS)

    We study the effects of radiative transfer on the probability distribution functions (PDFs) of simulations of magnetohydrodynamic turbulence in the widely studied 13CO 2-1 transition. We find that the integrated intensity maps generally follow a log-normal distribution, with the cases that have τ ≈ 1 best matching the PDF of the column density. We fit a two-dimensional variance-sonic Mach number relationship to our logarithmic PDFs of the form σ²_ln(Σ/Σ₀) = A × ln(1 + b²M_s²) and find that, for parameter b = 1/3, parameter A depends on the radiative transfer environment. We also explore the variance, skewness, and kurtosis of the linear PDFs finding that higher moments reflect both higher sonic Mach number and lower optical depth. Finally, we apply the Tsallis incremental PDF function and find that the fit parameters depend on both Mach numbers, but also are sensitive to the radiative transfer parameter space, with the τ ≈ 1 case best fitting the incremental PDF of the true column density. We conclude that, for PDFs of low optical depth cases, part of the gas is always subthermally excited so that the spread of the line intensities exceeds the spread of the underlying column densities and hence the PDFs do not reflect the true column density. Similarly, PDFs of optically thick cases are dominated by the velocity dispersion and therefore do not represent the true column density PDF. Thus, in the case of molecules like carbon monoxide, the dynamic range of intensities, structures observed, and, consequently, the observable PDFs are less determined by turbulence and more often determined by radiative transfer effects.

  2. Multiple-streaming and the Probability Distribution of Density in Redshift Space

    CERN Document Server

    Hui, L; Shandarin, S F; Hui, Lam; Kofman, Lev; Shandarin, Sergei F.

    1999-01-01

    We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple-streaming using the Zel'dovich approximation (ZA), and compute the average number of streams in real and redshift-space. It is found that multiple-streaming can be significant in redshift-space but negligible in real-space, even at moderate values of the linear fluctuation amplitude ($\sigma < 1$). Moreover, unlike their real-space counterparts, redshift-space multiple-streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which operate even when the real-space density field is quite linear, could suppress the classic compression of redshift-structures predicted by linear theory (Kaiser 1987). We also compute using the ZA the probability distribution function (PDF) of density, as well as $S_3$, in real and redshift-space, and compare it with the PD...

  3. Probability distribution functions of gas in M31 and M51

    CERN Document Server

    Berkhuijsen, Elly M

    2015-01-01

    We present probability distribution functions (PDFs) of the surface densities of ionized and neutral gas in the nearby spiral galaxies M31 and M51, as well as of dust emission and extinction Av in M31. The PDFs are close to lognormal and those for HI and Av in M31 are nearly identical. However, the PDFs for H2 are wider than the HI PDFs and the M51 PDFs have larger dispersions than those for M31. We use a simple model to determine how the PDFs are changed by variations in the line-of-sight (LOS) pathlength L through the gas, telescope resolution and the volume filling factor of the gas, f_v. In each of these cases the dispersion sigma of the lognormal PDF depends on the variable with a negative power law. We also derive PDFs of mean LOS volume densities of gas components in M31 and M51. Combining these with the volume density PDFs for different components of the ISM in the Milky Way (MW), we find that sigma decreases with increasing length L with an exponent of -0.76 +/- 0.06, which is steeper than expected. ...

  4. Turbulence-Induced Relative Velocity of Dust Particles III: The Probability Distribution

    CERN Document Server

    Pan, Liubin; Scalo, John

    2014-01-01

    Motivated by its important role in the collisional growth of dust particles in protoplanetary disks, we investigate the probability distribution function (PDF) of the relative velocity of inertial particles suspended in turbulent flows. Using the simulation from our previous work, we compute the relative velocity PDF as a function of the friction timescales, tau_p1 and tau_p2, of two particles of arbitrary sizes. The friction time of particles included in the simulation ranges from 0.1 tau_eta to 54T_L, with tau_eta and T_L the Kolmogorov time and the Lagrangian correlation time of the flow, respectively. The relative velocity PDF is generically non-Gaussian, exhibiting fat tails. For a fixed value of tau_p1, the PDF is the fattest for equal-size particles (tau_p2~tau_p1), and becomes thinner when tau_p2 << tau_p1 or tau_p2 >> tau_p1. Defining f as the friction time ratio of the smaller particle to the larger one, we find that, at a given f in 1/2>T_L). These features are successfully explained by the Pan & Padoan model. Usin...

  5. Particle size distribution models of small angle neutron scattering pattern on ferro fluids

    International Nuclear Information System (INIS)

    The Fe3O4 ferrofluid samples were synthesized by a co-precipitation method. The investigation of ferrofluid microstructure is known to be one of the most important problems because the presence of aggregates and their internal structure greatly influences the properties of ferrofluids. The size and the size dispersion of particles in ferrofluids were determined assuming a log-normal distribution of particle radius. The scattering patterns measured by small angle neutron scattering were fitted by the theoretical scattering functions of two limiting models: a log-normal sphere distribution and a fractal aggregate. Two types of particle are detected, which are presumably primary particles of 30 angstrom in radius and secondary fractal aggregates of 200 angstrom with polydispersity of 0.47 up to 0.53. (author)
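    The log-normal size model underlying such a fit can be sketched directly; the median radius, log-width, and sample size below are illustrative choices (the log-width is picked so the polydispersity lands near the reported 0.47-0.53 range), not the fitted SANS parameters:

```python
import numpy as np

rng = np.random.default_rng(5)
r0, log_width = 30.0, 0.5  # median radius (angstrom) and log-normal width (assumed)
radii = r0 * np.exp(log_width * rng.standard_normal(500_000))

# log-normal identities: mean radius = r0*exp(w^2/2),
# polydispersity = std/mean = sqrt(exp(w^2) - 1)
mean_r = radii.mean()
poly = radii.std() / mean_r
```

    In a SANS analysis these moments enter through the sphere form factor averaged over the size distribution; here they just show how the log-width maps onto the quoted polydispersity.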

  6. Pitch angle distributions of electrons at dipolarization sites during geomagnetic activity: THEMIS observations

    Science.gov (United States)

    Wang, Kaiti; Lin, Ching-Huei; Wang, Lu-Yin; Hada, Tohru; Nishimura, Yukitoshi; Turner, Drew L.; Angelopoulos, Vassilis

    2014-12-01

    Changes in pitch angle distributions of electrons with energies from a few eV to 1 MeV at dipolarization sites in Earth's magnetotail are investigated statistically to determine the extent to which adiabatic acceleration may contribute to these changes. Forty-two dipolarization events from 2008 and 2009 observed by Time History of Events and Macroscale Interactions during Substorms probes covering the inner plasma sheet from 8 RE to 12 RE during geomagnetic activity identified by the AL index are analyzed. The number of observed events with cigar-type distributions (peaks at 0° and 180°) decreases sharply below 1 keV after dipolarization because in many of these events, electron distributions became more isotropized. From above 1 keV to a few tens of keV, however, the observed number of cigar-type events increases after dipolarization and the number of isotropic events decreases. These changes can be related to the ineffectiveness of Fermi acceleration below 1 keV (at those energies, dipolarization time becomes comparable to electron bounce time). Model-calculated pitch angle distributions after dipolarization with the effect of betatron and Fermi acceleration tested indicate that these adiabatic acceleration mechanisms can explain the observed patterns of event number changes over a large range of energies for cigar events and isotropic events. Other factors still need to be considered to assess the observed increase in cigar events around 2 keV. Indeed, preferential directional increase/loss of electron fluxes, which may contribute to the formation of cigar events, was observed. Nonadiabatic processes to accelerate electrons in a parallel direction may also be important for future study.

  7. Evolution of electron pitch angle distributions across Saturn's middle magnetospheric region from MIMI/LEMMS

    Science.gov (United States)

    Clark, G.; Paranicas, C.; Santos-Costa, D.; Livi, S.; Krupp, N.; Mitchell, D. G.; Roussos, E.; Tseng, W.-L.

    2014-12-01

    We provide a global view of ~20 to 800 keV electron pitch angle distributions (PADs) close to Saturn's current sheet using observations from the Cassini MIMI/LEMMS instrument. Previous work indicated that the nature of pitch angle distributions in Saturn's inner to middle magnetosphere changes near the radial distance of 10RS. This work confirms the existence of a PAD transition region. Here we go further and develop a new technique to statistically quantify the spatial profile of butterfly PADs as well as present new spatial trends on the isotropic PAD. Additionally, we perform a case study analysis and show the PADs exhibit strong energy dependent features throughout this transition region. We also present a diffusion theory model based on adiabatic transport, Coulomb interactions with Saturn's neutral gas torus, and an energy dependent radial diffusion coefficient. A data-model comparison reveals that adiabatic transport is the dominant transport mechanism between ~8 to 12RS, however interactions with Saturn's neutral gas torus become dominant inside ~7RS and govern the flux level of ~20 to 800 keV electrons. We have also found that field-aligned fluxes were not well reproduced by our modeling approach. We suggest that wave-particle interactions and/or a polar source of the energetic particles needs further investigation.

  8. Derivation of the Molière simultaneous distribution between the deflection angle and the lateral displacement

    Science.gov (United States)

    Okei, K.; Takahashi, N.; Nakatsuka, T.

    The Molière simultaneous distribution between the deflection angle and the lateral displacement is derived by applying numerical Fourier transforms to the solution for the frequency distribution obtained through the Kamata-Nishimura formulation of Molière theory. The differences of our result from that under the Gaussian approximation, as well as the basic properties of our distribution, are investigated closely.

  9. The probability distribution of maintenance cost of a system affected by the gamma process of degradation: Finite time solution

    International Nuclear Information System (INIS)

    The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known. In this context, the asymptotic cost rate has a limited utility. This paper presents the derivation of the probability distribution of maintenance cost, when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, then the discrete Fourier transform of the characteristic function leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
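    Alongside the characteristic-function/Fourier route described above, the finite-time cost distribution can be approximated by brute-force Monte Carlo; the gamma-process parameters, failure threshold, horizon, and unit replacement cost below are illustrative assumptions, and the overshoot at each replacement is neglected:

```python
import numpy as np

rng = np.random.default_rng(11)
alpha, beta = 1.0, 0.5   # gamma process: shape rate per unit time and scale (assumed)
L = 10.0                 # degradation level triggering replacement
T_h, dt = 50.0, 0.1      # planning horizon and time step
c_r = 1.0                # cost per replacement
n_paths = 10_000
steps = int(T_h / dt)

# stationary gamma process: independent Gamma(alpha*dt, beta) increments
incr = rng.gamma(alpha * dt, beta, size=(n_paths, steps))
total_degradation = incr.sum(axis=1)

# with instantaneous replacement and overshoot neglected, the renewals in
# [0, T_h] are the multiples of L crossed by the concatenated degradation path
renewals = np.floor(total_degradation / L)
cost = c_r * renewals    # empirical finite-time maintenance cost distribution
```

    The empirical quantiles of `cost` then play the role of the prediction limits that the asymptotic cost rate alone cannot provide.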

  10. Simulating Tail Probabilities in GI/GI/1 Queues and Insurance Risk Processes with Subexponential Distributions

    NARCIS (Netherlands)

    Boots, Nam Kyoo; Shahabuddin, Perwez

    2001-01-01

    This paper deals with estimating small tail probabilities of thesteady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into th

  11. Anderson transition on the Cayley tree as a traveling wave critical point for various probability distributions

    International Nuclear Information System (INIS)

    For Anderson localization on the Cayley tree, we study the statistics of various observables as a function of the disorder strength W and the number N of generations. We first consider the Landauer transmission T_N. In the localized phase, its logarithm follows the traveling-wave form ln T_N ≅ (ln T_N)-bar + ln t*, where (i) the disorder-averaged value moves linearly, (ln T_N)-bar ≅ -N/ξ_loc, and the localization length diverges as ξ_loc ∼ (W-W_c)^(-ν_loc) with ν_loc = 1, and (ii) the variable t* is a fixed random variable with a power-law tail P*(t*) ∼ 1/(t*)^(1+β(W)) for large t*, with 0 < β(W) < 1, so that the moments of T_N are governed by rare events. In the delocalized phase, the transmission T_N remains a finite random variable as N → ∞, and we measure near criticality the essential singularity (ln T_∞)-bar ∼ -|W_c-W|^(-κ_T) with κ_T ∼ 0.25. We then consider the statistical properties of normalized eigenstates Σ_x|ψ(x)|² = 1, in particular the entropy S = -Σ_x|ψ(x)|² ln |ψ(x)|² and the inverse participation ratios (IPR) I_q = Σ_x|ψ(x)|^(2q). In the localized phase, the typical entropy diverges as S_typ ∼ (W-W_c)^(-ν_S) with ν_S ≅ 1.5, whereas it grows linearly as S_typ(N) ∼ N in the delocalized phase. Finally, for the IPR, we explain how closely related variables propagate as traveling waves in the delocalized phase. In conclusion, both the localized phase and the delocalized phase are characterized by the traveling-wave propagation of some probability distributions, and the Anderson localization/delocalization transition then corresponds to a traveling/non-traveling critical point. Moreover, our results point toward the existence of several length scales that diverge with different exponents ν at criticality

  12. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that distinguish between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region, the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that information about parallel events based only on their maximum values is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
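    For independent seasonal populations, the annual-maximum cdf is the product of the seasonal cdfs; the sketch below checks this product rule with two invented Gumbel components (the paper's mixing model additionally amends the parameters to account for overlaid parallel events):

```python
import numpy as np

rng = np.random.default_rng(2)

def gumbel_cdf(x, loc, scale):
    return np.exp(-np.exp(-(x - loc) / scale))

# seasonal flood-peak maxima, assumed independent (parameters invented)
n = 200_000
winter = 100.0 - 20.0 * np.log(-np.log(rng.random(n)))
summer = 80.0 - 35.0 * np.log(-np.log(rng.random(n)))
annual = np.maximum(winter, summer)  # annual maximum series (AMS)

x0 = 200.0
F_mixed = gumbel_cdf(x0, 100.0, 20.0) * gumbel_cdf(x0, 80.0, 35.0)  # product rule
F_empirical = (annual <= x0).mean()
```

    Extrapolating `F_mixed` instead of a single fitted pdf is what keeps the upper tail honest when one seasonal type dominates the extremes.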

  13. Probability distribution function-based classification of structural MRI for the detection of Alzheimer's disease.

    Science.gov (United States)

    Beheshti, I; Demirel, H

    2015-09-01

    High-dimensional classification methods have been a major target of machine learning for the automatic classification of patients who suffer from Alzheimer's disease (AD). One major issue of automatic classification is the feature-selection method from high-dimensional data. In this paper, a novel approach for statistical feature reduction and selection in high-dimensional magnetic resonance imaging (MRI) data based on the probability distribution function (PDF) is introduced. To develop an automatic computer-aided diagnosis (CAD) technique, this research explores the statistical patterns extracted from structural MRI (sMRI) data on four systematic levels. First, global and local differences of gray matter in patients with AD compared to healthy controls (HCs) using the voxel-based morphometric (VBM) technique with 3-Tesla 3D T1-weighted MRI are investigated. Second, feature extraction based on the voxel clusters detected by VBM on sMRI and voxel values as volume of interest (VOI) is used. Third, a novel statistical feature-selection process is employed, utilizing the PDF of the VOI to represent statistical patterns of the respective high-dimensional sMRI sample. Finally, the proposed feature-selection method for early detection of AD with support vector machine (SVM) classifiers is assessed against other standard feature-selection methods, such as partial least squares (PLS) techniques. The performance of the proposed technique is evaluated using 130 AD and 130 HC MRI data from the ADNI dataset with 10-fold cross-validation. The results show that the PDF-based feature selection approach is a reliable technique that is highly competitive with respect to the state-of-the-art techniques in classifying AD from high-dimensional sMRI samples. PMID:26226415

  14. Determining probability distributions of parameter performances for time-series model calibration: A river system trial

    Science.gov (United States)

    Kim, Shaun Sang Ho; Hughes, Justin Douglas; Chen, Jie; Dutta, Dushmanta; Vaze, Jai

    2015-11-01

    A calibration method is presented that uses a sub-period resampling method to estimate probability distributions of performance for different parameter sets. Where conventional calibration methods implicitly identify the best performing parameterisations on average, the new method looks at the consistency of performance during sub-periods. The method is implemented with the conceptual river reach algorithms within the Australian Water Resources Assessments River (AWRA-R) model in the Murray-Darling Basin, Australia. The new method is tested for 192 reaches in a cross-validation scheme and results are compared to a traditional split-sample calibration-validation implementation. This is done to evaluate the new technique's ability to predict daily streamflow outside the calibration period. The new calibration method produced parameterisations that performed better in validation periods than optimum calibration parameter sets for 103 reaches and produced the same parameterisations for 35 reaches. The method showed a statistically significant improvement in predictive performance and potentially provides more rational flux terms than traditional split-sample calibration methods. Particular strengths of the proposed calibration method are that it avoids extra weighting towards rare periods of good agreement and prevents compensating biases through time. The method can be used as a diagnostic tool to evaluate the stochasticity of modelled systems and to determine suitable structures for different time-series models. Although the method is demonstrated using a hydrological model, it is not limited to the field of hydrology and could be adopted for many different time-series modelling applications.

  15. Emergence of visual saliency from natural scenes via context-mediated probability distributions coding.

    Directory of Open Access Journals (Sweden)

    Jinhua Xu

    Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs) of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA) and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.

  16. Effects of Turbulent Aberrations on Probability Distribution of Orbital Angular Momentum for Optical Communication

    International Nuclear Information System (INIS)

    Effects of atmospheric turbulence tilt, defocus, astigmatism and coma aberrations on the orbital angular momentum measurement probability of photons propagating in weak turbulent regime are modeled with Rytov approximation. By considering the resulting wave as a superposition of angular momentum eigenstates, the orbital angular momentum measurement probabilities of the transmitted digit are presented. Our results show that the effect of turbulent tilt aberration on the orbital angular momentum measurement probabilities of photons is the maximum among these four kinds of aberrations. As the aberration order increases, the effects of turbulence aberrations on the measurement probabilities of orbital angular momentum generally decrease, whereas the effect of turbulence defocus can be ignored. For tilt aberration, as the difference between the measured orbital angular momentum and the original orbital angular momentum increases, the orbital angular momentum measurement probability decreases. (fundamental areas of phenomenology (including applications))
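The superposition-of-eigenstates step can be sketched numerically for the tilt case: a tilted wavefront adds a cos(theta) phase around a ring, and projecting onto exp(i*m*theta) gives the measurement probability of each OAM charge m. The ring sampling, launched charge l0 = 3, and tilt strength are illustrative assumptions, not the paper's Rytov model:

```python
import numpy as np

theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
l0 = 3                      # OAM charge of the launched beam (assumed)
a_tilt = 0.5                # tilt aberration strength in radians (assumed)

# Aberrated field on a ring: the original helical phase plus a tilt phase
# (an a*cos(theta) term is the azimuthal signature of a tilted wavefront).
field = np.exp(1j * (l0 * theta + a_tilt * np.cos(theta)))

# Project onto OAM eigenstates exp(i*m*theta): |c_m|^2 is the probability of
# measuring charge m after the aberration.
ms = np.arange(l0 - 5, l0 + 6)
probs = np.array([np.abs(np.mean(field * np.exp(-1j * m * theta))) ** 2
                  for m in ms])
```

The probabilities peak at the launched charge and fall off as the measured charge moves away from it, which is the qualitative behaviour the abstract describes for tilt.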

  17. Understanding star formation in molecular clouds. III. Probability distribution functions of molecular lines in Cygnus X

    Science.gov (United States)

    Schneider, N.; Bontemps, S.; Motte, F.; Ossenkopf, V.; Klessen, R. S.; Simon, R.; Fechtenbaum, S.; Herpin, F.; Tremblin, P.; Csengeri, T.; Myers, P. C.; Hill, T.; Cunningham, M.; Federrath, C.

    2016-03-01

    The probability distribution function of column density (N-PDF) serves as a powerful tool to characterise the various physical processes that influence the structure of molecular clouds. Studies that use extinction maps or H2 column-density maps (N) derived from dust show that star-forming clouds can best be characterised by lognormal PDFs for the lower N range and a power-law tail for higher N, which are commonly attributed to turbulence and to self-gravity and/or pressure, respectively. While PDFs from dust cover a large dynamic range (typically N ~ 10^20-10^24 cm^-2 or Av ~ 0.1-1000), PDFs obtained from molecular lines - converted into H2 column density - potentially trace different regimes of (column) densities and temperatures more selectively. They also enable us to distinguish different clouds along the line of sight by using the velocity information. We report here on PDFs obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region, and make a comparison to a PDF derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av ~ 1-30, but is cut off at higher Av because of optical-depth effects. The PDFs of C18O and 13CO are mostly lognormal up to Av ~ 1-15, followed by excess up to Av ~ 40. Above that value, all CO PDFs drop, most likely due to depletion. The high-density tracers CS and N2H+ exhibit only a power-law distribution between Av ~ 15 and 400. The PDF from dust is lognormal for Av ~ 3-15 and has a power-law tail up to Av ~ 500. Absolute values for the molecular-line column densities are, however, rather uncertain because of abundance and excitation-temperature variations. If we take the dust PDF at face value, we "calibrate" the molecular-line PDF of CS to that of the dust and determine an abundance [CS]/[H2] of 10^-9. The slopes of the power-law tails of the CS, N2H+, and dust PDFs are -1.6, -1.4, and -2.3, respectively, and are thus consistent
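How such an N-PDF is built can be sketched with synthetic column densities: a lognormal body with an attached power-law tail, histogrammed in the usual eta = ln(N/<N>) variable. All parameters below (sigma, the Pareto tail index, sample sizes) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic column-density "map": lognormal body (turbulence-like) plus a
# small Pareto tail (index 2) standing in for the self-gravity power law,
# attached roughly two sigma above the lognormal peak.
sigma = 0.5
body = rng.lognormal(mean=0.0, sigma=sigma, size=200_000)
tail = (1.0 - rng.random(5_000)) ** (-0.5) * np.exp(2.0 * sigma)
column = np.concatenate([body, tail])

# N-PDF convention: histogram of eta = ln(N / <N>).
eta = np.log(column / column.mean())
hist, edges = np.histogram(eta, bins=80, density=True)
peak_eta = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
```

The histogram peaks near the lognormal body and the tail shows up as excess at high eta; fitting the two regimes separately is then what yields the width and power-law slope quoted in such studies.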

  18. Finite Size Scaling of Probability Distributions in SU(2) Lattice Gauge Theory and Phi^4 Field Theory

    OpenAIRE

    Staniford-Chen, Stuart

    1992-01-01

    For a system near a second order phase transition, the probability distribution for the order parameter can be given a finite size scaling form. This fact is used to compare the finite temperature phase transition for the Wilson lines in d=3+1 SU(2) lattice gauge theory with the phase transition in d=3 phi^4 field theory. I exhibit the finite size scaled probability distributions in the form of a function of two variables (the reduced `temperature' and the magnetization) for both models. The ...

  19. PRINCIPLES FOR DEVELOPMENT OF OPTIMIZATION ALGORITHM FOR OPERATIONAL RELIABILITY OF DISTRIBUTIVE SYSTEMS WITH DUE ACCOUNT OF PROBABILITY IMPACT FACTORS

    Directory of Open Access Journals (Sweden)

    O. E. Gudkova

    2014-09-01

    Full Text Available Reliability of distributive systems for electric supply of consumers is considered as a multi-criteria function. For this reason, while developing an algorithm for determining the optimum reliability level of distributive networks, it is necessary to take into account the probabilistic character of changes in the corresponding indices. A mathematical model and algorithm have been developed for determining the optimum reliability level of electric supply systems with due account of probabilistic changes in the reliability indices of the component elements.

  20. Statistical distribution of foveal transverse chromatic aberration, pupil centration, and angle psi in a population of young adult eyes

    Science.gov (United States)

    Rynders, Maurice; Lidkea, Bruce; Chisholm, William; Thibos, Larry N.

    1995-10-01

    Subjective transverse chromatic aberration (sTCA) manifest at the fovea was determined for a population of 85 young adults (19-38 years old) by means of a two-dimensional, two-color, vernier alignment technique. The statistical distribution of sTCA was well fitted by a bivariate Gaussian function with mean values that were not significantly different from zero in either the horizontal or the vertical direction. We conclude from this result that a hypothetical, average eye representing the population mean of human eyes with medium-sized pupils is free of foveal sTCA. However, the absolute magnitude of sTCA for any given individual was often significantly greater than zero and ranged from 0.05 to 2.67 arcmin for the red and the blue lights of a computer monitor (mean wavelengths, 605 and 497 nm, respectively). The statistical distribution of the absolute magnitude of sTCA was well described by a Rayleigh probability distribution with a mean of 0.8 arcmin. A simple device useful for population screening in a clinical setting was also tested and gave concordant results. Assuming that sTCA at the fovea is due to decentering of the pupil with respect to the visual axis, we infer from these results that the pupil is, on average, well centered in human eyes. The average magnitude of pupil decentration in individual eyes is less than 0.5 mm, which corresponds to psi = 3 deg for the angle between the achromatic and the visual axes of the eye.
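The link between the two fitted distributions is a standard result: if the horizontal and vertical components are independent zero-mean Gaussians with a common sigma, their magnitude is Rayleigh with mean sigma * sqrt(pi/2). A quick numerical check, assuming only the reported 0.8 arcmin mean:

```python
import numpy as np

rng = np.random.default_rng(2)

# Back out the per-component sigma from the reported 0.8 arcmin Rayleigh mean.
mean_mag = 0.8
sigma = mean_mag / np.sqrt(np.pi / 2.0)   # ~0.64 arcmin per component

x = rng.normal(0.0, sigma, 100_000)       # horizontal sTCA component (arcmin)
y = rng.normal(0.0, sigma, 100_000)       # vertical sTCA component (arcmin)
magnitude = np.hypot(x, y)                # Rayleigh-distributed by construction
```

The simulated magnitudes reproduce the Rayleigh mean and median (sigma * sqrt(ln 4)), which is why a zero-mean bivariate Gaussian fit and a Rayleigh fit to |sTCA| are two views of the same model.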

  1. The VIMOS Public Extragalactic Redshift Survey (VIPERS). On the recovery of the count-in-cell probability distribution function

    Science.gov (United States)

    Bel, J.; Branchini, E.; Di Porto, C.; Cucciati, O.; Granett, B. R.; Iovino, A.; de la Torre, S.; Marinoni, C.; Guzzo, L.; Moscardini, L.; Cappi, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bolzonella, M.; Bottini, D.; Coupon, J.; Davidzon, I.; De Lucia, G.; Fritz, A.; Franzetti, P.; Fumana, M.; Garilli, B.; Ilbert, O.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Scodeggio, M.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Burden, A.; Marchetti, A.; Mellier, Y.; Nichol, R. C.; Peacock, J. A.; Percival, W. J.; Phleps, S.; Wolk, M.

    2016-04-01

    We compare three methods to measure the count-in-cell probability density function of galaxies in a spectroscopic redshift survey. From this comparison we found that, when the sampling is low (the average number of objects per cell is around unity), it is necessary to use a parametric method to model the galaxy distribution. We used a set of mock catalogues of VIPERS to verify whether we were able to reconstruct the cell-count probability distribution once the observational strategy is applied. We find that, in the simulated catalogues, the probability distribution of galaxies is better represented by a Gamma expansion than by a skewed log-normal distribution. Finally, we correct the cell-count probability distribution function for the angular selection effect of the VIMOS instrument and study the redshift and absolute-magnitude dependence of the underlying galaxy density function in VIPERS from redshift 0.5 to 1.1. We found a very weak evolution of the probability density distribution function and that it is well approximated by a Gamma distribution, independently of the chosen tracers. Based on observations collected at the European Southern Observatory, Cerro Paranal, Chile, using the Very Large Telescope under programmes 182.A-0886 and partly 070.A-9007. Also based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at TERAPIX and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS. The VIPERS web site is http://www.vipers.inaf.it/
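The count-in-cell measurement itself is straightforward to sketch on a toy clustered catalogue; the clustering recipe, grid size, and all numbers below are illustrative assumptions, not the VIPERS pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "galaxy" field: scatter members around Poisson-distributed halo centres
# in the unit square (a crude stand-in for a mock catalogue).
centres = rng.random((400, 2))
members = centres.repeat(rng.poisson(3.0, size=400), axis=0)
galaxies = (members + rng.normal(0.0, 0.01, size=members.shape)) % 1.0

# Count-in-cells on a grid: P(N) is the fraction of cells holding N galaxies.
cells = 20
ix = (galaxies[:, 0] * cells).astype(int)
iy = (galaxies[:, 1] * cells).astype(int)
counts = np.bincount(ix * cells + iy, minlength=cells * cells)
p_of_n = np.bincount(counts) / counts.size
```

For a clustered field the cell counts are super-Poisson (variance exceeds the mean), and it is this measured P(N) that one then fits with a Gamma expansion or a skewed log-normal.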

  2. A Neural Network Approach for Identifying Relativistic Electron Pitch Angle Distributions in Van Allen Probes Data

    Science.gov (United States)

    Souza, V. M. C. E. S.; Vieira, L.; Alves, L. R.; Da Silva, L. A.; Koga, D.; Sibeck, D. G.; Walsh, B.; Kanekal, S. G.; Silveira, M. D.; Medeiros, C.; Mendes, O., Jr.; Marchezi, J.; Rockenbach, M.; Jauer, P. R.; Gonzalez, W.; Baker, D. N.

    2015-12-01

    A myriad of physical phenomena occur in the inner magnetosphere, in particular at the Earth's radiation belts, which can result from the combination of both internal and external processes. However, the connection between physical processes occurring deep within the magnetosphere and external interplanetary drivers is not yet well understood. In this work we investigate whether a selected set of interplanetary structures affects the local time distribution of three different classes of high-energy electron pitch angle distributions (PADs), namely normal, isotropic, and butterfly. We split this work into two parts: initially we focus on the methodology, which employs a Self-Organized Feature Map (SOFM) neural network for identifying different classes of electron PAD shapes in the Van Allen Probes' Relativistic Electron Proton Telescope (REPT) data. The algorithm can categorize the input data into an arbitrary number of classes, of which three appear most often: normal, isotropic, and butterfly. Other classes related to these three also emerge and deserve to be addressed in detail in future works. We also discuss the uncertainties of the algorithm. We then move to the second part, where we describe in detail the criteria used for selecting the interplanetary events, and investigate the relation between key parameters characterizing such interplanetary structures and the local time distributions of electron PAD shapes.

  3. Simultaneous measurement of deuterium distribution and impurities by emission angle analysis of deuteron induced reaction products

    CERN Document Server

    Kubota, N; Furuyama, Y; Kitamura, A

    2002-01-01

    A novel analytical method of light element distribution in a thin film is presented. The method is based on deuteron-induced nuclear reactions. The emission angle of the lighter product, detected in coincidence with the heavier product, is analyzed to deduce the depth distribution of the target atoms, while conventional energy analysis is applied for impurities whose distributions are not of primary interest. Results of proof-of-principle experiments using the D(d,p)t reaction for a deuterated polyethylene (C2D4) film are described. The depth resolution is evaluated to be 0.66 ± 0.07 μm for 400 keV deuteron incidence in the C2D4 film. Factors limiting the resolution are discussed, and possible improvement even down to several tens of nm is concluded. The present method is applicable for microanalysis of some light elements other than deuterium contained in a film with thickness of several μm which cannot be reached by conventional heavy ion elastic recoil detection using ...

  4. Effects of Turbulent Aberrations on Probability Distribution of Orbital Angular Momentum for Optical Communication

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yi-Xin; CANG Ji

    2009-01-01

    Effects of atmospheric turbulence tilt, defocus, astigmatism and coma aberrations on the orbital angular momentum measurement probability of photons propagating in weak turbulent regime are modeled with Rytov approximation. By considering the resulting wave as a superposition of angular momentum eigenstates, the orbital angular momentum measurement probabilities of the transmitted digit are presented. Our results show that the effect of turbulent tilt aberration on the orbital angular momentum measurement probabilities of photons is the maximum among these four kinds of aberrations. As the aberration order increases, the effects of turbulence aberrations on the measurement probabilities of orbital angular momentum generally decrease, whereas the effect of turbulence defocus can be ignored. For tilt aberration, as the difference between the measured orbital angular momentum and the original orbital angular momentum increases, the orbital angular momentum measurement probability decreases.

  5. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    Science.gov (United States)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has previously been observed that leaf inclination angle may change gradually from more vertical in the upper canopy and in high-light habitats to more horizontal in the lower canopy and in low-light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies have dealt with possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved rather consistent across species, the vertical variation differed more between species. The leveled camera method was additionally tested for sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust, providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. Mc

  6. Fitting a distribution to censored contamination data using Markov Chain Monte Carlo methods and samples selected with unequal probabilities.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2014-11-18

    The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the
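A sketch of the inverse-probability-weighting idea on synthetic data follows; the selection model, all numbers, and the simple Hajek-style ratio estimator are assumptions for illustration, not the paper's full estimation framework:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy population of contamination values; a risk-based design makes heavily
# contaminated units proportionally more likely to be sampled.
population = rng.lognormal(0.0, 1.0, 50_000)
select_p = np.minimum(0.002 * population / population.mean(), 1.0)
chosen = rng.random(population.size) < select_p
sample, pi = population[chosen], select_p[chosen]

naive = sample.mean()                # ignores the design: size-biased upward
w = 1.0 / pi                         # inverse-probability (design) weights
weighted = np.sum(w * sample) / np.sum(w)

# Weighted bootstrap: resample the observed units with probability
# proportional to their weights, then treat each replicate as an iid sample.
reps = np.array([rng.choice(sample, size=sample.size, p=w / w.sum()).mean()
                 for _ in range(500)])
ci = np.quantile(reps, [0.025, 0.975])
```

The unweighted mean of a preferentially contaminated sample overshoots the population mean badly, while the weighted estimate and its bootstrap interval recover it; this is the failure mode of the iid assumption that the paper's framework addresses.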

  7. Effects of Schroth and Pilates exercises on the Cobb angle and weight distribution of patients with scoliosis.

    Science.gov (United States)

    Kim, Gichul; HwangBo, Pil-Neo

    2016-03-01

    [Purpose] The purpose of this study was to compare the effect of Schroth and Pilates exercises on the Cobb angle and body weight distribution of patients with idiopathic scoliosis. [Subjects] Twenty-four scoliosis patients with a Cobb angle of ≥20° were divided into the Schroth exercise group (SEG, n = 12) and the Pilates exercise group (PEG, n = 12). [Methods] The SEG and PEG performed Schroth and Pilates exercises, respectively, three times a week for 12 weeks. The Cobb angle was measured in the standing position with a radiography apparatus, and weight load was measured with Gait View Pro 1.0. [Results] In the intragroup comparison, both groups showed significant changes in the Cobb angle. For weight distribution, the SEG showed significant differences in the total weight between the concave and convex sides, but the PEG did not show significant differences. Furthermore, in the intragroup comparison, the SEG showed significant differences in the changes in the Cobb angle and weight distribution compared with the PEG. [Conclusion] Both Schroth and Pilates exercises were effective in changing the Cobb angle and weight distribution of scoliosis patients; however, the intergroup comparison showed that the Schroth exercise was more effective than the Pilates exercise. PMID:27134403

  8. Comparison of the diagnostic ability of Moorfields regression analysis and glaucoma probability score using Heidelberg retinal tomograph III in eyes with primary open angle glaucoma

    Directory of Open Access Journals (Sweden)

    Jindal Shveta

    2010-01-01

    Full Text Available Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfields regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted k) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119-0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific (borderline results included as test positives) criteria. The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a higher negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs.
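The likelihood ratios follow directly from sensitivity and specificity. The sketch below back-calculates plausible counts from the reported least-specific MRA percentages; the exact eye counts per criterion (49 gradable glaucoma eyes, 50 normal eyes) are an assumption chosen so the percentages reproduce:

```python
# Hypothetical 2x2 counts consistent with the least-specific MRA figures.
tp, fn = 28, 21          # sensitivity 28/49 = 57.14%
tn, fp = 49, 1           # specificity 49/50 = 98%

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lr_positive = sensitivity / (1.0 - specificity)   # odds multiplier for a positive test
lr_negative = (1.0 - sensitivity) / specificity   # odds multiplier for a negative test
```

With these counts the computation returns the LR+ of 28.57 and LR- of about 0.44 quoted for the MRA, showing why a highly specific test yields a large LR+ even at modest sensitivity.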

  9. A neural network approach for identifying particle pitch angle distributions in Van Allen Probes data

    Science.gov (United States)

    Souza, V. M.; Vieira, L. E. A.; Medeiros, C.; Da Silva, L. A.; Alves, L. R.; Koga, D.; Sibeck, D. G.; Walsh, B. M.; Kanekal, S. G.; Jauer, P. R.; Rockenbach, M.; Dal Lago, A.; Silveira, M. V. D.; Marchezi, J. P.; Mendes, O.; Gonzalez, W. D.; Baker, D. N.

    2016-04-01

    Analysis of particle pitch angle distributions (PADs) has been used as a means to comprehend a multitude of different physical mechanisms that lead to flux variations in the Van Allen belts and also to particle precipitation into the upper atmosphere. In this work we developed a neural network-based data clustering methodology that automatically identifies distinct PAD types in an unsupervised way using particle flux data. One can promptly identify and locate three well-known PAD types in both time and radial distance, namely, 90° peaked, butterfly, and flattop distributions. In order to illustrate the applicability of our methodology, we used relativistic electron flux data from the whole month of November 2014, acquired from the Relativistic Electron-Proton Telescope instrument on board the Van Allen Probes, but it is emphasized that our approach can also be used with multiplatform spacecraft data. Our PAD classification results are in reasonably good agreement with those obtained by standard statistical fitting algorithms. The proposed methodology has a potential use for Van Allen belt monitoring.
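A minimal self-organising map in the same spirit can be run on synthetic PAD shapes rather than REPT fluxes; the three shape templates, node count, and training schedule are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
pa = np.linspace(0.0, np.pi, 18)          # pitch-angle bins spanning 0-180 deg

def pad(kind):
    """Synthetic normalized pitch-angle distributions (illustrative shapes)."""
    base = {"peaked": np.sin(pa) ** 2,                        # 90-deg peaked
            "butterfly": np.sin(pa) ** 2 - 0.8 * np.sin(pa) ** 6,  # dip at 90 deg
            "flattop": np.clip(2.0 * np.sin(pa), 0.0, 1.0)}[kind]
    noisy = base + rng.normal(0.0, 0.03, pa.size)
    return noisy / np.linalg.norm(noisy)

data = np.array([pad(k) for k in ["peaked", "butterfly", "flattop"] * 100])

# Minimal self-organising feature map: a line of nodes pulled toward each
# winning sample, with learning rate and neighbourhood shrinking over time.
nodes = rng.random((6, pa.size))
for t in range(2000):
    x = data[rng.integers(len(data))]
    win = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
    lr = 0.5 * (1.0 - t / 2000)
    width = 2.0 * (1.0 - t / 2000) + 0.5
    h = np.exp(-((np.arange(6) - win) ** 2) / (2 * width ** 2))
    nodes += lr * h[:, None] * (x - nodes)

labels = np.array([np.argmin(((nodes - x) ** 2).sum(axis=1)) for x in data])
```

After training, samples of the same PAD type land on the same (or neighbouring) map nodes, which is the unsupervised grouping the paper then inspects and names.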

  10. Deduction of compound nucleus formation probability from the fragment angular distributions in heavy-ion reactions

    Science.gov (United States)

    Yadav, C.; Thomas, R. G.; Mohanty, A. K.; Kapoor, S. S.

    2015-07-01

    The presence of various fissionlike reactions in heavy-ion induced reactions is a major hurdle in the path to laboratory synthesis of heavy and super-heavy nuclei. It is known that the cross section for forming a heavy evaporation residue in fusion reactions depends on three factors: the capture cross section, the probability of compound nucleus formation PCN, and the survival probability of the compound nucleus against fission. Because PCN is difficult to estimate theoretically owing to its complex dependence on several parameters, attempts have been made in the past to deduce it from fission fragment anisotropy data. In the present work, the fragment anisotropy data for a number of heavy-ion reactions are analyzed, and it is found that deducing PCN from the anisotropy data also requires knowledge of the ratio of the relaxation time of the K degree of freedom to the pre-equilibrium fission time.

  11. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
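As a sketch of the classical setting the book starts from, a short Monte Carlo estimate of the ruin probability in the compound Poisson model with exponential claims, checked against the known closed form; all parameter values and the finite simulation horizon are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

# Cramér-Lundberg model: premiums flow in at rate c, claims arrive as a
# Poisson process with exponential sizes; ruin = reserve ever below zero.
u0, c, lam, mu = 5.0, 1.2, 1.0, 1.0   # initial reserve, premium rate, claim rate, mean claim
horizon, n_paths = 200.0, 4000        # long but finite horizon approximates infinity

ruined = 0
for _ in range(n_paths):
    t, claims = 0.0, 0.0
    while True:
        t += rng.exponential(1.0 / lam)        # time of the next claim
        if t > horizon:
            break                              # survived the horizon
        claims += rng.exponential(mu)
        if u0 + c * t < claims:                # reserve went negative at a claim
            ruined += 1
            break
psi_mc = ruined / n_paths

# With exponential claims the infinite-horizon ruin probability is exact:
theta = c / (lam * mu) - 1.0                   # relative safety loading
psi_exact = np.exp(-theta * u0 / (mu * (1.0 + theta))) / (1.0 + theta)
```

The simulated frequency matches the closed form closely; for heavy-tailed claims no such formula exists, which is where the approximations treated in the book come in.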

  12. Space distributions and decay probability for the excited state 7Li*(7.45 MeV) in the reaction 7Li(alpha, alpha 6Li)n

    International Nuclear Information System (INIS)

    Differential cross-sections for the excitation and decay of the 7Li*(7.45 MeV) resonance into the 6Li + n channel in the three-particle reaction 7Li(alpha, alpha 6Li)n at an alpha-particle energy of 27.2 MeV have been determined in kinematically complete and incomplete experiments. Use of a position-sensitive detector made it possible to obtain data on the spatial distributions of decay events over the full range of possible angles and to determine the total probability of this process, whose value differs essentially from the data for binary reactions. This result agrees with that obtained previously [1] and confirms the theoretical calculations [2] of the decay branching ratio for short-lived near-threshold resonances in three-particle reactions.

  13. Tendencies in the distribution of probabilities of rains and flows in Antioquia

    International Nuclear Information System (INIS)

    We perform different statistical tests to quantify trends in the quantiles of the probability density functions of daily records of rainfall and river discharges in Antioquia (north-western Colombia). We found positive trends in the upper quantiles for both variables and negative trends in the lower quantiles of the river discharge records. These results indicate a probability of more intense and frequent extreme events as a consequence of more frequent El Niño and La Niña events, but also as a possible consequence of global and local climate change. There is a need to explore the hydrological, environmental and socio-economic consequences of such trends.

  14. Approximating Probability Levels for Testing Null Hypotheses with Noncentral F Distributions.

    Science.gov (United States)

    Fowler, Robert L.

    1984-01-01

    This study compared two approximations for normalizing noncentral F distributions: one based on the square root of the chi-square distribution (SRA), the other derived from a cube root of the chi-square distribution (CRA). The CRA was superior, and generally provided an excellent approximation for noncentral F. (Author/BW)
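A minimal sketch of the cube-root idea in the central chi-square case, i.e. the Wilson-Hilferty-type transformation that such a CRA builds on (the noncentral extension compared in the paper is not reproduced here), checked against simulation:

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def chi2_cdf_cube_root(x, k):
    """Wilson-Hilferty cube-root normalization: (X/k)^(1/3) is approximately
    Normal(1 - 2/(9k), 2/(9k)) when X ~ chi-square with k df."""
    mu = 1.0 - 2.0 / (9.0 * k)
    var = 2.0 / (9.0 * k)
    return norm_cdf(((x / k) ** (1.0 / 3.0) - mu) / math.sqrt(var))

k, x = 7, 9.0
draws = rng.chisquare(k, size=200_000)
empirical = np.mean(draws <= x)           # simulated P(X <= 9)
approx = chi2_cdf_cube_root(x, k)         # cube-root approximation
```

Even at only 7 degrees of freedom the cube-root approximation agrees with the simulated CDF to a few parts in a thousand, which is the kind of accuracy advantage over square-root normalizations that the study quantifies.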

  15. Effects of Schroth and Pilates exercises on the Cobb angle and weight distribution of patients with scoliosis

    OpenAIRE

    Kim, Gichul; Hwangbo, Pil-Neo

    2016-01-01

    [Purpose] The purpose of this study was to compare the effect of Schroth and Pilates exercises on the Cobb angle and body weight distribution of patients with idiopathic scoliosis. [Subjects] Twenty-four scoliosis patients with a Cobb angle of ≥20° were divided into the Schroth exercise group (SEG, n = 12) and the Pilates exercise group (PEG, n = 12). [Methods] The SEG and PEG performed Schroth and Pilates exercises, respectively, three times a week for 12 weeks. The Cobb angle was measured i...

  16. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    Science.gov (United States)

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected, or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words, even when prior and contextual information biases one interpretation only. These results suggest a probability distribution whereby all outcomes and their associated probabilities of occurrence, even if very low, are represented and maintained. PMID:25523107

  17. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize chaotic maps such as the logistic map to generate pseudo-random numbers mapped to the design variables for global optimization. Many existing studies have indicated that a COA can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs, from the new perspective of both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performance of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents is compared, where BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of a COA. To achieve high efficiency, it is recommended to adopt an appropriate chaotic map that generates chaotic sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
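The two quantities compared across maps (the PDF of the generated sequence and its Lyapunov exponent) can be estimated for the logistic map at r = 4, whose orbit has a strongly non-uniform, arcsine-like density and Lyapunov exponent ln 2; the orbit length and seed are illustrative choices:

```python
import numpy as np

# Iterate the logistic map x -> r*x*(1 - x) at r = 4 and estimate the
# invariant density of the orbit and the Lyapunov exponent.
r, n = 4.0, 200_000
x = 0.123
orbit = np.empty(n)
for i in range(n):
    x = r * x * (1.0 - x)
    orbit[i] = x

# Orbit histogram: the invariant density piles up near 0 and 1 (non-uniform).
hist, _ = np.histogram(orbit, bins=10, range=(0.0, 1.0), density=True)

# Lyapunov exponent: average log of |f'(x)| = |r*(1 - 2x)| along the orbit.
lyapunov = np.mean(np.log(np.abs(r * (1.0 - 2.0 * orbit))))
```

The estimated exponent lands near ln 2 ≈ 0.693 while the density is far from flat, illustrating the paper's point that a map can have a large Lyapunov exponent (fast search) yet an unfavourable, non-uniform PDF over the search space.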

  18. Zenith angle distributions at Super-Kamiokande and SNO and the solution of the solar neutrino problem

    CERN Document Server

    González-García, M. Concepción; Smirnov, Yu A

    2001-01-01

    We have performed a detailed study of the zenith angle dependence of the regeneration factor and of the distributions of events at SNO and SK for different solutions of the solar neutrino problem. In particular, we discuss the oscillatory behaviour and the synchronization effect in the distribution for the LMA solution, the parametric peak for the LOW solution, etc. A physical interpretation of the effects is given. We suggest a new binning of events which emphasizes the distinctive features of the zenith angle distributions for the different solutions. We also find correlations between the integrated day-night asymmetry and the rates of events in different zenith angle bins. Study of these correlations strengthens the identification power of the analysis.

  19. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    2013-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, the binomial theorem, probability distributions, means, standard deviations, the probability function of the binomial distribution, and more. Includes 360 problems, with answers for half of them.

  20. Estimating the probability of stock market crashes for Bucharest Stock Exchange using stable distributions

    Directory of Open Access Journals (Sweden)

    Daniel Traian PELE

    2012-07-01

    Full Text Available In this study we analyse the evolution of the BET index of the Bucharest Stock Exchange through an AR-GARCH model and estimate the likelihood of extreme events using stable distributions. Using the time series of the BET index, we argue that stable distributions can significantly improve the prediction of an extreme event.
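The point of using stable distributions for crash probabilities can be illustrated numerically. A sketch only: the α-stable parameters below are assumed for illustration, not the paper's estimates for the BET index. With tail index α < 2, a stable law assigns far more probability to a crash-sized daily loss than a Gaussian with the same scale parameter.

```python
from scipy.stats import levy_stable, norm

# Assumed illustrative parameters: heavy-tailed, symmetric, 1% scale.
alpha, beta, scale = 1.7, 0.0, 0.01

loss = -0.05  # a 5% one-day drop
p_stable = levy_stable.cdf(loss, alpha, beta, loc=0.0, scale=scale)
p_normal = norm.cdf(loss, loc=0.0, scale=scale)

# p_stable is orders of magnitude larger than p_normal, which is why
# stable laws improve estimates of extreme-event likelihoods.
```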

  1. Using Hybrid Angle/Distance Information for Distributed Topology Control in Vehicular Sensor Networks

    Directory of Open Access Journals (Sweden)

    Chao-Chi Huang

    2014-10-01

    Full Text Available In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control method focuses on reducing the rate of group member change and avoiding unnecessary information exchange. Two major phases are sequentially applied to choose the group members of each vehicle using hybrid angle/distance information. The operation of Phase I is based on the concept of the cone-based method, which can select the desired vehicles quickly. Afterwards, the proposed time-slot method is further applied to stabilize the network topology. Given the network structure in Phase I, a routing scheme is presented in Phase II. The network behaviors are explored through simulation and analysis in a variety of scenarios. The results show that the proposed mechanism is a scalable and effective control framework for VSNs.

  2. Using hybrid angle/distance information for distributed topology control in vehicular sensor networks.

    Science.gov (United States)

    Huang, Chao-Chi; Chiu, Yang-Hung; Wen, Chih-Yu

    2014-01-01

    In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control method focuses on reducing the rate of group member change and avoiding unnecessary information exchange. Two major phases are sequentially applied to choose the group members of each vehicle using hybrid angle/distance information. The operation of Phase I is based on the concept of the cone-based method, which can select the desired vehicles quickly. Afterwards, the proposed time-slot method is further applied to stabilize the network topology. Given the network structure in Phase I, a routing scheme is presented in Phase II. The network behaviors are explored through simulation and analysis in a variety of scenarios. The results show that the proposed mechanism is a scalable and effective control framework for VSNs. PMID:25350506

  3. High-energy spectrum and zenith-angle distribution of atmospheric neutrinos

    CERN Document Server

    Sinegovsky, S I; Sinegovskaya, T S

    2011-01-01

    High-energy neutrinos, arising from decays of mesons produced in collisions of cosmic-ray particles with air nuclei, form the background in the astrophysical neutrino detection problem. Ambiguities in the high-energy behavior of the pion, and especially kaon, production cross sections for nucleon-nucleus collisions may essentially affect the calculated neutrino flux. We present results of a calculation of the energy spectrum and zenith-angle distribution of muon and electron atmospheric neutrinos in the energy range 10 GeV to 10 PeV. The calculation was performed using known hadronic models (QGSJET-II-03, SIBYLL 2.1, Kimel & Mokhov) for two parametrizations of the primary spectrum, by Gaisser & Honda and by Zatsepin & Sokolskaya. The comparison of the calculated muon neutrino spectrum with the IceCube40 experiment data makes it clear that even at energies above 100 TeV the prompt neutrino contribution is not so apparent because of tangled uncertainties of the strange (kaons) and charm...

  4. Counterion Distribution Around Protein-SNAs probed by Small-angle X-ray scattering

    Science.gov (United States)

    Krishnamoorthy, Kurinji; Bedzyk, Michael; Kewalramani, Sumit; Moreau, Liane; Mirkin, Chad

    Protein-DNA conjugates couple the advanced cell-transfection capabilities of a spherical DNA architecture with the biocompatible enzymatic activity of a protein core, potentially creating therapeutic agents with dual functionality. An understanding of their stabilizing ionic environment is crucial to better understand and predict their properties. Here, we use small-angle X-ray scattering techniques to decipher the structure of the counterion cloud surrounding these DNA-coated nanoparticles. Through the use of anomalous scattering techniques we have mapped the local concentrations of Rb+ ions in the region around the protein-DNA constructs. These results are further corroborated by simulations using a geometric model for the excess charge density as a function of radial distance from the protein core. Further, we investigate the influence of solution ionic strength on the structure of the DNA corona and demonstrate a reduction in the extension of the DNA corona with increasing concentration of NaCl in solution, for both single- and double-stranded DNA shells. Our work reveals the distribution of counterions in the vicinity of protein-DNA conjugates and decouples the effect of solution ionic strength on the thickness of the DNA layer.

  5. The bond angle distribution and local coordination for silica glass under densification

    International Nuclear Information System (INIS)

    We present a simulation of silica glass with density ranging from 2.53 to 3.49 g cm-3 using the molecular dynamics method. The simulation reveals that the density of the constructed samples can be expressed as a linear function of the fractions of SiOx units. As the density increases, the fractions of SiOx units and OSiy linkages vary significantly, but the partial bond angle distributions (BAD) for SiOx, x = 4, 5, 6, and OSiy, y = 2, 3, are identical for all the obtained samples. This allows us to establish a simple relation between the total BAD and the fractions of SiOx or OSiy. Good agreement between simulated and calculated results is found for both the Si-O-Si and O-Si-O BAD. Moreover, most Si atoms in the low-density sample belong to perfect tetrahedra (PT), whereas they are mainly present in distorted tetrahedra in the high-density sample. We also found large clusters of PTs that are linked to each other via bridging oxygen. The largest cluster comprises 90% of the Si in the low-density sample and 39% of the Si in the high-density one. (paper)

  6. Estimation of Genotype Distributions and Posterior Genotype Probabilities for β-Mannosidosis in Salers Cattle

    OpenAIRE

    Taylor, J F; Abbitt, B.; Walter, J P; Davis, S. K.; Jaques, J. T.; Ochoa, R. F.

    1993-01-01

    β-Mannosidosis is a lethal lysosomal storage disease inherited as an autosomal recessive in man, cattle and goats. Laboratory assay data of plasma β-mannosidase activity represent a mixture of homozygous normal and carrier genotype distributions in a proportion determined by genotype frequency. A maximum likelihood approach employing data transformations for each genotype distribution and assuming a diallelic model of inheritance is described. Estimates of the transformation and genotype dist...

  7. Size distribution and probable sources of trace elements in submicron atmospheric particulate material

    International Nuclear Information System (INIS)

    Size-segregated atmospheric particulate material was collected at a rural location in the Great Smoky Mountains National Park, Tennessee, and at an urban site in Pasadena, California. The elemental composition of this material was determined by Instrumental Neutron Activation Analysis. Elements identified as being of anthropogenic origin had mass median diameters below one micron, while elements of crustal origin generally had mass median diameters well over one micron. Some of the Pasadena samples, however, had elevated concentrations of the typical crustal elements Al, Fe, La, and Ce in the finer size fractions, probably due to motor vehicle emissions. (author)

  8. Non-Gaussian corrections to the probability distribution of the curvature perturbation from inflation

    OpenAIRE

    Seery, David; Hidalgo, J. Carlos

    2006-01-01

    We show how to obtain the probability density function for the amplitude of the curvature perturbation, R, produced during an epoch of slow-roll, single-field inflation, working directly from n-point correlation functions of R. These n-point functions are the usual output of quantum field theory calculations, and as a result we bypass approximate statistical arguments based on the central limit theorem. Our method can be extended to deal with arbitrary forms of non-Gaussianity, appearing at a...

  9. PRINCIPLES FOR DEVELOPMENT OF OPTIMIZATION ALGORITHM FOR OPERATIONAL RELIABILITY OF DISTRIBUTIVE SYSTEMS WITH DUE ACCOUNT OF PROBABILITY IMPACT FACTORS

    OpenAIRE

    O. E. Gudkova; I. M Savchenko; O. A. Grebenchikov; A. A. Sergeev

    2014-01-01

    Reliability of distributive systems for electric supply of consumers is considered as a multi-criteria function. For this reason, while developing an algorithm for determination of the optimum reliability level of distributive networks, it is necessary to take into account the probabilistic character of changes in the corresponding indices. A mathematical model and algorithm have been developed for determination of the optimum reliability level of electric supply systems with due account of probabili...

  10. A comparison of the probability distribution of observed substorm magnitude with that predicted by a minimal substorm model

    Directory of Open Access Journals (Sweden)

    S. K. Morley

    2007-11-01

    Full Text Available We compare the probability distributions of substorm magnetic bay magnitudes from observations and a minimal substorm model. The observed distribution was derived previously and independently using the IL index from the IMAGE magnetometer network. The model distribution is derived from a synthetic AL index time series created using real solar wind data and a minimal substorm model, which was previously shown to reproduce observed substorm waiting times. There are two free parameters in the model which scale the contributions to AL from the directly-driven DP2 electrojet and loading-unloading DP1 electrojet, respectively. In a limited region of the 2-D parameter space of the model, the probability distribution of modelled substorm bay magnitudes is not significantly different to the observed distribution. The ranges of the two parameters giving acceptable (95% confidence level agreement are consistent with expectations using results from other studies. The approximately linear relationship between the two free parameters over these ranges implies that the substorm magnitude simply scales linearly with the solar wind power input at the time of substorm onset.

  11. Void probability as a function of the void's shape and scale-invariant models. [in studies of spacial galactic distribution

    Science.gov (United States)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
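The negative binomial count-in-cells model makes a concrete prediction for the void probability (the chance that a cell contains no galaxies). A sketch under one common parameterization, assumed here for illustration and not necessarily the exact extension used by the authors: clustering raises the void probability above the Poisson value exp(−N̄).

```python
import math

def void_prob_poisson(nbar):
    """Unclustered (Poisson) galaxies: P0 = exp(-Nbar)."""
    return math.exp(-nbar)

def void_prob_negative_binomial(nbar, g):
    """Negative-binomial count-in-cells model (one common
    parameterization, assumed for illustration):
    P0 = (1 + g * Nbar) ** (-1 / g)."""
    return (1.0 + g * nbar) ** (-1.0 / g)

nbar = 4.0                                   # mean count per cell
p_poisson = void_prob_poisson(nbar)          # ~0.018
p_nb = void_prob_negative_binomial(nbar, 1.0)  # clustering boosts voids
```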

  12. Research on the behavior of fiber orientation probability distribution function in the planar flows

    Institute of Scientific and Technical Information of China (English)

    ZHOU Kun; LIN Jian-zhong

    2005-01-01

    The equation of the two-dimensional fiber direction vector was solved theoretically to give the fiber orientation distribution in simple shear flow, flow with two-direction shear, extensional flow and arbitrary planar incompressible flow. The Fokker-Planck equation was solved numerically to validate the theoretical solutions. The stable orientation and orientation period of the fibers were obtained. The results showed that the fiber orientation distribution depends on the relative, not the absolute, magnitude of the matrix rate-of-strain of the flow. The effect of fiber aspect ratio on the orientation distribution is insignificant in most conditions, except in the simple shear case. It was proved that the results for a planar flow can be generalized to the case of a 3-D fiber direction vector.

  13. Microwave field distribution in a magic angle spinning dynamic nuclear polarization NMR probe

    Science.gov (United States)

    Nanni, Emilio A.; Barnes, Alexander B.; Matsuki, Yoh; Woskov, Paul P.; Corzilius, Björn; Griffin, Robert G.; Temkin, Richard J.

    2011-05-01

    We present a calculation of the microwave field distribution in a magic angle spinning (MAS) probe utilized in dynamic nuclear polarization (DNP) experiments. The microwave magnetic field (B1S) profile was obtained from simulations performed with the High Frequency Structure Simulator (HFSS) software suite, using a model that includes the launching antenna, the outer Kel-F stator housing coated with Ag, the RF coil, and the 4 mm diameter sapphire rotor containing the sample. The predicted average B1S field is 13 μT/W^(1/2), where S denotes the electron spin. For a routinely achievable input power of 5 W the corresponding value is γS·B1S = 0.84 MHz. The calculations provide insights into the coupling of the microwave power to the sample, including reflections from the RF coil and diffraction of the power transmitted through the coil. The variation of enhancement with rotor wall thickness was also successfully simulated. A second, simplified calculation was performed using a single-pass model based on Gaussian beam propagation and Fresnel diffraction. This model provided additional physical insight and was in good agreement with the full HFSS simulation. These calculations indicate approaches to increasing the coupling of the microwave power to the sample, including the use of a converging lens and fine adjustment of the spacing of the windings of the RF coil. The present results should prove useful in optimizing the coupling of microwave power to the sample in future DNP experiments. Finally, the results of the simulation were used to predict the cross effect DNP enhancement (ε) vs. ω1S/(2π) for a sample of 13C-urea dissolved in a 60:40 glycerol/water mixture containing the polarizing agent TOTAPOL; very good agreement was obtained between theory and experiment.
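The quoted numbers can be cross-checked with the standard electron gyromagnetic ratio (γe/2π ≈ 28.025 GHz/T); only values from the abstract are used, and the small difference from the quoted 0.84 MHz presumably reflects rounding in the reported field value.

```python
import math

b1_per_sqrt_watt = 13e-6     # T / W^(1/2), the predicted average B1 field
power_w = 5.0                # routinely achievable input power, W
gamma_e_over_2pi = 28.025e9  # electron gyromagnetic ratio, Hz / T

b1 = b1_per_sqrt_watt * math.sqrt(power_w)  # ~29 microtesla at 5 W
nu_1_mhz = gamma_e_over_2pi * b1 / 1e6      # Rabi frequency, ~0.8 MHz
```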

  14. Surface winds on Venus: Probability distribution from in-situ measurements

    Science.gov (United States)

    Lorenz, Ralph D.

    2016-01-01

    A surface wind specification is needed for future landed missions to Venus. While sparse, there exist enough data from the limited surface and near-surface measurements to date to define a probability density function that guides expectations of winds for rational design of landing systems. Following a review of all available data (mostly from the Venera missions), a Weibull function, used previously for Mars and Titan and widely used in terrestrial engineering applications, is proposed. Best-estimate wind measurements are reasonably described by P(>V) = exp[-(V/c)^k], with c = 0.8 m/s and k = 1.9: this function yields a 95% chance of winds ... Mars, Earth or Titan), a prediction testable with radar interferometry on future orbital missions and/or from landed observations. More elaborate analyses should take site-specific factors such as slope or time of day into account, but cannot be meaningfully constrained by present data.
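The proposed Weibull exceedance specification is straightforward to evaluate; a sketch using the abstract's best-estimate parameters (c = 0.8 m/s, k = 1.9):

```python
import math

def p_exceed(v, c=0.8, k=1.9):
    """Weibull exceedance probability P(>V) = exp(-(V/c)**k),
    with the abstract's best-estimate parameters for Venus surface winds."""
    return math.exp(-(v / c) ** k)

p1 = p_exceed(1.0)  # chance of winds exceeding 1 m/s, roughly one in five
p2 = p_exceed(2.0)  # winds above 2 m/s are rare under this model
```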

  15. Analyses of rainfall using probability distribution and Markov chain models for crop planning in Daspalla region in Odisha, India

    Science.gov (United States)

    Mandal, K. G.; Padhi, J.; Kumar, A.; Ghosh, S.; Panda, D. K.; Mohanty, R. K.; Raychaudhuri, M.

    2015-08-01

    Rainfed agriculture plays and will continue to play a dominant role in providing food and livelihoods for an increasing world population. Rainfall analyses are helpful for proper crop planning under changing environment in any region. Therefore, in this paper, an attempt has been made to analyse 16 years of rainfall (1995-2010) at the Daspalla region in Odisha, eastern India for prediction using six probability distribution functions, forecasting the probable date of onset and withdrawal of monsoon, occurrence of dry spells by using Markov chain model and finally crop planning for the region. For prediction of monsoon and post-monsoon rainfall, log Pearson type III and Gumbel distribution were the best-fit probability distribution functions. The earliest and most delayed week of the onset of rainy season was the 20th standard meteorological week (SMW) (14th-20th May) and 25th SMW (18th-24th June), respectively. Similarly, the earliest and most delayed week of withdrawal of rainfall was the 39th SMW (24th-30th September) and 47th SMW (19th-25th November), respectively. The longest and shortest length of rainy season was 26 and 17 weeks, respectively. The chances of occurrence of dry spells are high from the 1st-22nd SMW and again the 42nd SMW to the end of the year. The probability of weeks (23rd-40th SMW) remaining wet varies between 62 and 100 % for the region. Results obtained through this analysis would be utilised for agricultural planning and mitigation of dry spells at the Daspalla region in Odisha, India.
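The dry-spell analysis rests on a two-state (wet/dry) Markov chain for weekly rainfall. A minimal sketch with illustrative transition probabilities (the paper's estimated values for Daspalla are not reproduced here):

```python
# Hypothetical weekly transition probabilities (illustrative only):
p_wd = 0.40  # P(next week wet | this week dry)
p_ww = 0.75  # P(next week wet | this week wet)

# Stationary probability of a wet week solves
# pi_w = pi_w * p_ww + (1 - pi_w) * p_wd:
pi_wet = p_wd / (1.0 + p_wd - p_ww)

def p_dry_spell_at_least(n_weeks, p_wd=p_wd):
    """Probability that a dry spell, once started, persists for at least
    n_weeks consecutive dry weeks: (1 - p_wd) ** (n_weeks - 1)."""
    return (1.0 - p_wd) ** (n_weeks - 1)
```

Fitted to observed wet/dry week sequences, such a chain gives the week-by-week dry-spell risks used for crop planning.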

  16. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate...

  17. The probability distributions of the first hitting times of Bessel processes

    OpenAIRE

    Hamana, Yuji; Matsumoto, Hiroyuki

    2011-01-01

    We consider the first hitting times of the Bessel processes. We give explicit expressions for the distribution functions and for the densities by means of the zeros of the Bessel functions. The results extend the classical ones and cover all the cases.

  18. Is extrapair mating random? On the probability distribution of extrapair young in avian broods

    NARCIS (Netherlands)

    Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan

    2007-01-01

    A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review

  19. Probability density of spatially distributed soil moisture inferred from crosshole georadar traveltime measurements

    Science.gov (United States)

    Linde, N.; Vrugt, J. A.

    2009-04-01

    Geophysical models are increasingly used in hydrological simulations and inversions, where they are typically treated as an artificial data source with known uncorrelated "data errors". The model appraisal problem in classical deterministic linear and non-linear inversion approaches based on linearization is often addressed by calculating model resolution and model covariance matrices. These measures offer only a limited potential to assign a more appropriate "data covariance matrix" for future hydrological applications, simply because the regularization operators used to construct a stable inverse solution bear a strong imprint on such estimates and because the non-linearity of the geophysical inverse problem is not explored. We present a parallelized Markov Chain Monte Carlo (MCMC) scheme to efficiently derive the posterior spatially distributed radar slowness and water content between boreholes given first-arrival traveltimes. This method is called DiffeRential Evolution Adaptive Metropolis (DREAM_ZS) with snooker updater and sampling from past states. Our inverse scheme does not impose any smoothness on the final solution, and uses uniform prior ranges of the parameters. The posterior distribution of radar slowness is converted into spatially distributed soil moisture values using a petrophysical relationship. To benchmark the performance of DREAM_ZS, we first apply our inverse method to a synthetic two-dimensional infiltration experiment using 9421 traveltimes contaminated with Gaussian errors and 80 different model parameters, corresponding to a model discretization of 0.3 m × 0.3 m. After this, the method is applied to field data acquired in the vadose zone during snowmelt. This work demonstrates that fully non-linear stochastic inversion can be applied with few limiting assumptions to a range of common two-dimensional tomographic geophysical problems. The main advantage of DREAM_ZS is that it provides a full view of the posterior distribution of spatially...
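DREAM_ZS itself is an adaptive multi-chain sampler, but the underlying idea, drawing samples from a posterior without imposing smoothness, can be sketched with a plain random-walk Metropolis sampler on a toy one-parameter "slowness" posterior. Everything below is a simplified stand-in for illustration, not the authors' algorithm:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=1):
    """Plain random-walk Metropolis: propose a Gaussian step and accept
    with probability min(1, exp(lp_cand - lp))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy Gaussian posterior over a single slowness-like parameter.
log_post = lambda x: -0.5 * ((x - 2.0) / 0.3) ** 2
chain = metropolis(log_post, x0=0.0, n_steps=20000)
kept = chain[5000:]                      # discard burn-in
post_mean = sum(kept) / len(kept)        # recovers the target mean, ~2.0
```

The chain's histogram approximates the full posterior, which is the "full view" the abstract refers to; DREAM_ZS adds differential-evolution proposals and sampling from past states for efficiency in high dimensions.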

  20. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    Science.gov (United States)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.

  1. Improving probability distributions for resource levels from master surgery tactical plans for emergency and elective patients

    OpenAIRE

    Tord Gomis, Marta De

    2011-01-01

    Efficiency and patient satisfaction are two of the most important factors for a hospital; in order to be competitive, these two factors have to be improved. Tactical admission plans focus on increasing efficiency, but in this paper we also try to associate patient satisfaction with the tactical plan. In this respect, we present a procedure to calculate exact waiting-time distributions and another procedure to compute the exact level of resource usage. Then we explore two different metho...

  2. Application of the Baron-Myerson monopolist regulation mechanism: issues on selecting the cost probability distributions

    OpenAIRE

    Dezerega, Alejandro Bartolome

    1994-01-01

    Approved for public release; distribution unlimited. The end of Cold War levels of defense expenditures has prompted a reduction in the number of defense-related companies, creating potentially monopolistic economic scenarios for defense procurement. This thesis studies one methodology to deal with these scenarios, based on the Baron-Myerson monopolist regulation mechanism. The Baron-Myerson mechanism provides a tool to regulate monopolists ...

  3. Dynamic Inventory Management with Learning About the Demand Distribution and Substitution Probability

    OpenAIRE

    Li Chen; Erica L. Plambeck

    2008-01-01

    A well-known result in the Bayesian inventory management literature is: if lost sales are not observed, the Bayesian optimal inventory level is larger than the myopic inventory level (one should "stock more" to learn about the demand distribution). This result has been proven in other studies under the assumption that inventory is perishable, so the myopic inventory level is equal to the Bayesian optimal inventory level with observed lost sales. We break that equivalence by considering nonperi...

  4. Luminosity distance in Swiss cheese cosmology with randomized voids. II. Magnification probability distributions

    CERN Document Server

    Flanagan, Éanna É; Wasserman, Ira; Vanderveld, R Ali

    2011-01-01

    We study the fluctuations in luminosity distances due to gravitational lensing by large scale (> 35 Mpc) structures, specifically voids and sheets. We use a simplified "Swiss cheese" model consisting of a ΛCDM Friedman-Robertson-Walker background in which a number of randomly distributed non-overlapping spherical regions are replaced by mass-compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz & Wald (1998), which includes the effect of lensing shear. The standard deviation of this distribution is ~ 0.027 magnitudes and the mean is ~ 0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s=1.0, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thic...

  5. Fast computation of the neutron flux distribution in x-y geometry using the QP0 first collision probability method

    International Nuclear Information System (INIS)

    For the space-dependent condensation of cross-section libraries with a large number of free energy groups, a fast method is developed to calculate the neutron spectrum in rectangular multicells. Material zones in any given arrangement are divided into uniform meshes using an overall mesh grid. Inside each mesh the neutron flux is assumed to be spatially constant. The boundary currents are approximated using a quadrupole cosine distribution. In contrast to the 'surface currents' method, the authors use QP0 collision probabilities, which are represented as product sums of a set of first-collision probabilities for single meshes. The theoretical and programming work had already been finished in 1976 and the results described in an internal EIR report. (Auth.)

  6. Dealing with varying detection probability, unequal sample sizes and clumped distributions in count data.

    Directory of Open Access Journals (Sweden)

    D Johan Kotze

    Full Text Available Temporal variation in the detectability of a species can bias estimates of relative abundance if not handled correctly. For example, when effort varies in space and/or time it becomes necessary to take variation in detectability into account when data are analyzed. We demonstrate the importance of incorporating seasonality into the analysis of data with unequal sample sizes due to lost traps at a particular density of a species. A case study of count data was simulated using a spring-active carabid beetle. Traps were 'lost' randomly during high beetle activity in high-abundance sites and during low beetle activity in low-abundance sites. Five different models were fitted to datasets with different levels of loss. If sample sizes were unequal and a seasonality variable was not included in models that assumed the number of individuals was log-normally distributed, the models severely under- or overestimated the true effect size. Results did not improve when seasonality and number of trapping days were included in these models as offset terms, but only performed well when the response variable was specified as following a negative binomial distribution. Finally, if seasonal variation of a species is unknown, which is often the case, seasonality can be added as a free factor, resulting in well-performing negative binomial models. Based on these results we recommend: (a) add sampling effort (number of trapping days in our example) to the models as an offset term; (b) if precise information is available on seasonal variation in detectability of a study object, add seasonality to the models as an offset term; (c) if information on seasonal variation in detectability is inadequate, add seasonality as a free factor; and (d) specify the response variable of count data as following a negative binomial or over-dispersed Poisson distribution.
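The recommendation to model effort as an offset and the counts as negative binomial can be illustrated with a small simulation. All numbers are hypothetical, and a real analysis would use a GLM library rather than this hand-rolled sketch:

```python
import math
import random

random.seed(7)

def negbin_count(mean, shape):
    """Gamma-mixed Poisson draw, i.e. a negative binomial count."""
    lam = random.gammavariate(shape, mean / shape)
    # Knuth's Poisson inversion (adequate for moderate lam)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Hypothetical setup: constant per-trap-day catch rate, unequal sampling
# effort across sites (as with lost traps), and clumped counts.
rate = 0.8                               # beetles per trap-day
counts, efforts = [], []
for _ in range(2000):
    effort = random.choice([5, 10, 20])  # trap-days per observation
    counts.append(negbin_count(rate * effort, shape=2.0))
    efforts.append(effort)

# Dividing total count by total effort is what a log(effort) offset does
# inside a GLM: it recovers the per-trap-day rate despite unequal effort.
est_rate = sum(counts) / sum(efforts)

mean_c = sum(counts) / len(counts)
var_c = sum((c - mean_c) ** 2 for c in counts) / len(counts)
# var_c exceeds mean_c: the overdispersion that motivates a negative
# binomial (rather than Poisson) response distribution.
```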

  7. Luminosity distance in ``Swiss cheese'' cosmology with randomized voids. II. Magnification probability distributions

    Science.gov (United States)

    Flanagan, Éanna É.; Kumar, Naresh; Wasserman, Ira; Vanderveld, R. Ali

    2012-01-01

    We study the fluctuations in luminosity distances due to gravitational lensing by large scale (≳35Mpc) structures, specifically voids and sheets. We use a simplified “Swiss cheese” model consisting of a ΛCDM Friedman-Robertson-Walker background in which a number of randomly distributed nonoverlapping spherical regions are replaced by mass-compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz and Wald (1998), which includes the effect of lensing shear. The standard deviation of this distribution is ˜0.027 magnitudes and the mean is ˜0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s=1.0, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thickness of ˜1 Mpc, the standard deviation is reduced to ˜0.013 magnitudes. This standard deviation due to voids is a factor ˜3 smaller than that due to galaxy-scale structures. We summarize our results in terms of a fitting formula that is accurate to ˜20%, and also build a simplified analytic model that reproduces our results to within ˜30%. Our model also allows us to explore the domain of validity of weak-lensing theory for voids. We find that for 35 Mpc voids, corrections to the dispersion due to lens-lens coupling are of order ˜4%, and corrections due to shear are ˜3%. Finally, we estimate the bias due to source-lens clustering in our model to be negligible.

  8. Confidence limits with multiple channels and arbitrary probability distributions for sensitivity and expected background

    CERN Document Server

    Perrotta, A

    2002-01-01

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and expected background content are not Gaussian distributed or not small enough to apply usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branching, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron- positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example. (6 refs).

  9. 精密播种粒距的概率分布%Probability Distribution of Seed Spacing of Precision Drilling

    Institute of Scientific and Technical Information of China (English)

    王玉顺; 司慧萍; 郑德聪; 吴海平

    2001-01-01

    The placement formula was deduced from the seed placement coordinate, which was expressed as the sum of an expectation value and a normal stochastic error, based on observation of the seed spacing forming process in precision drilling. On this basis, a mathematical model was set up, from which the probability density function, probability distribution function, and first as well as second order moments of the seed spacing were obtained. The results show that the seed spacing is a stochastic variable that follows a distribution derived from several normal populations with the same variance but different means, and it can be regarded as the tail-folding and superposition of normal distributions with different distribution parameters. The results of goodness-of-fit tests proved that the deduced seed spacing distribution fitted various practical seed spacing samples well. The graph of the spacing probability density, whose shape depends on the distribution parameters, typically takes a “multi-peak, asymmetric” form.

  10. Probability distribution, conditional dissipation, and transport of passive temperature fluctuations in grid-generated turbulence

    International Nuclear Information System (INIS)

    The evolution of the scalar probability density function (pdf), the conditional scalar dissipation rate, and other statistics including transport properties are studied for passive temperature fluctuations in decaying grid-generated turbulence. The effect of filtering and differentiating the time series is also investigated. For a nonzero mean temperature gradient it is shown that the pdf of the temperature fluctuations has pronounced exponential tails for turbulence Reynolds number (Re_l) greater than 70 but below this value the pdf is close to Gaussian. The scalar dissipation rate, conditioned on the fluctuations, shows that there is a high expectation of dissipation in the presence of the large, rare fluctuations that produce the exponential tails. Significant positive correlation between the mean square scalar fluctuations and the instantaneous scalar dissipation rate is found when exponential tails occur. The case of temperature fluctuations in the absence of a mean gradient is also studied. Here, the results are less definite because the generation of the fluctuations (by means of fine heated wires) causes an asymmetry in the pdf. The results show, however, that the pdf is close to Gaussian and that the correlation between the mean square temperature fluctuations and the instantaneous scalar dissipation rate is very weak. For the linear profile case, measurements over the range 60 ≤ Re_l ≤ 1100 show that the dimensionless heat flux Nu is proportional to Re_l^0.88 and that the transition from a Gaussian pdf to one with exponential tails occurs at Nu ∼ 31, a value close to transitions observed in other recent mixing experiments conducted in entirely different turbulent flows

  11. Codon information value and codon transition-probability distributions in short-term evolution

    Science.gov (United States)

    Jiménez-Montaño, M. A.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Ramos-Fernández, A.

    2016-07-01

    To understand the way the Genetic Code and the physical-chemical properties of coded amino acids affect accepted amino acid substitutions in short-term protein evolution, taking into account only overall amino acid conservation, we consider an underlying codon-level model. This model employs codon pair-substitution frequencies from an empirical matrix in the literature, modified for single-base mutations only. Ordering the degenerated codons according to their codon information value (Volkenstein, 1979), we found that three-fold and most of four-fold degenerated codons, which have low codon values, were best fitted to rank-frequency distributions with constant failure rate (exponentials). In contrast, almost all two-fold degenerated codons, which have high codon values, were best fitted to rank-frequency distributions with variable failure rate (inverse power-laws). Six-fold degenerated codons are considered to be doubly assigned. The exceptional behavior of some codons, including non-degenerate codons, is discussed.

  12. Probability density functions for description of diameter distribution in thinned stands of Tectona grandis

    Directory of Open Access Journals (Sweden)

    Julianne de Castro Oliveira

    2012-06-01

    Full Text Available The objective of this study was to evaluate the effectiveness of the Fatigue Life, Fréchet, Gamma, Generalized Gamma, Generalized Logistic, Log-logistic, Nakagami, Beta, Burr, Dagum, Weibull and Hyperbolic distributions in describing diameter distribution in teak stands subjected to thinning at different ages. Data used in this study originated from 238 rectangular permanent plots 490 m² in size, installed in stands of Tectona grandis L. f. in Mato Grosso state, Brazil. The plots were measured at ages 34, 43, 55, 68, 81, 82, 92, 104, 105, 120, 134 and 145 months on average. Thinning was done on two occasions: the first was systematic at age 81 months, with a basal area intensity of 36%, while the second was selective at age 104 months on average and removed poorer trees, reducing basal area by 30%. Fittings were assessed by the Kolmogorov-Smirnov goodness-of-fit test. The Log-logistic (3P), Burr (3P), Hyperbolic (3P), Burr (4P), Weibull (3P), Hyperbolic (2P), Fatigue Life (3P) and Nakagami functions provided more satisfactory values for the K-S test than the more commonly used Weibull function.
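    The fit-and-test loop described above — estimate a candidate distribution's parameters from the diameter sample, then assess it with the Kolmogorov-Smirnov statistic — can be sketched with `scipy` for one candidate. The Weibull parameters and sample below are assumed, not the study's data, and note that estimating parameters from the same sample inflates the K-S p-value (the Lilliefors problem), so such p-values are best read as a relative ranking of candidates:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    # Hypothetical stand diameters (cm): 3-parameter Weibull with a ~5 cm threshold
    diam = stats.weibull_min.rvs(c=2.5, loc=5.0, scale=8.0, size=300,
                                 random_state=rng)

    # Maximum-likelihood fit returns (shape, loc, scale)
    params = stats.weibull_min.fit(diam)

    # Kolmogorov-Smirnov goodness-of-fit of the fitted model to the sample
    ks = stats.kstest(diam, "weibull_min", args=params)
    # Small ks.statistic / large ks.pvalue -> candidate describes the sample well
    ```

    Repeating this for each candidate family (Burr, Log-logistic, ...) and ranking by the K-S statistic reproduces the comparison the abstract describes.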

  13. Uranium concentration and distribution in six peridotite inclusions of probable mantle origin

    Science.gov (United States)

    Haines, E. L.; Zartman, R. E.

    1973-01-01

    Fission-track activation was used to investigate uranium concentration and distribution in peridotite inclusions in alkali basalt from six localities. Whole-rock uranium concentrations range from 24 to 82 ng/g. Most of the uranium is uniformly distributed in the major silicate phases - olivine, orthopyroxene, and clinopyroxene. Chromian spinels may be classified into two groups on the basis of their uranium content - those which have less than 10 ng/g and those which have 100 to 150 ng/g U. In one sample accessory hydrous phases, phlogopite and hornblende, contain 130 and 300 ng/g U, respectively. The contact between the inclusion and the host basalt is usually quite sharp. Glassy or microcrystalline veinlets found in some samples contain more than 1 microgram/g. Very little uranium is associated with microcrystals of apatite. These results agree with some earlier investigators, who have concluded that suboceanic peridotites contain too little uranium to account for normal oceanic heat flow by conduction alone.

  14. Multivariate probability distribution for sewer system vulnerability assessment under data-limited conditions.

    Science.gov (United States)

    Del Giudice, G; Padulano, R; Siciliano, D

    2016-01-01

    The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novelty issues involve, among others, considering sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements. PMID:26901717
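    The transform-then-model step above can be sketched as follows: Box-Cox each skewed sewer attribute toward normality, fit a joint multivariate normal, and use low joint density to flag atypical pipes. The attributes, their distributions, and the flagging rule are all assumptions for illustration; the paper's actual variable set and vulnerability mapping are richer:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical sewer attributes (both right-skewed, strictly positive):
    age = rng.gamma(2.0, 15.0, size=200) + 1.0        # pipe age, years
    diameter = rng.lognormal(5.5, 0.4, size=200)      # pipe diameter, mm

    # Box-Cox each variable toward normality, then fit a joint normal
    age_t, lam_age = stats.boxcox(age)
    dia_t, lam_dia = stats.boxcox(diameter)
    X = np.column_stack([age_t, dia_t])
    mvn = stats.multivariate_normal(X.mean(axis=0), np.cov(X, rowvar=False))

    # Low joint density marks atypical pipes: candidates for early inspection
    density = mvn.pdf(X)
    priority = np.argsort(density)       # most unusual pipes first
    ```

    Using the joint (rather than marginal) density is what lets the correlations between attributes influence the prioritization, as the abstract emphasizes.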

  15. Confidence Limits with Multiple Channels and Arbitrary Probability Distributions for Sensitivity and Expected Background

    Science.gov (United States)

    Perrotta, Andrea

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branchings, or luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate the systematics into the calculation of the cross-section upper limits. One of these searches will be described as an example.

  16. Measurement of Diurnal Body Tilt Angle Distributions of Threeline Grunt Parapristipoma trilineatum Using Micro-Acceleration Data Loggers

    Directory of Open Access Journals (Sweden)

    Hideaki Tanoue

    2013-07-01

    Full Text Available The body tilt angle of a fish has a large effect on the acoustic target strength. For an accurate estimation of fish abundance using acoustic methods, it is necessary to measure body tilt angles in free-ranging fish. We measured diurnal body tilt angle distributions of threeline grunt (Parapristipoma trilineatum while swimming in schools in a fish cage. Micro-acceleration data loggers were used to record (for 3 days) swaying and surging accelerations (at 16 Hz intervals) of 10 individuals among 20 forming a school in a fish cage. Time series analysis of 1-h mean body tilt angles revealed significant differences in body tilt angles between day (−7.9 ± 3.28°) and night (0.8 ± 5.89°), which must be taken into account when conducting acoustic surveys. These results will be useful for calculating the average dorsal aspect target strength (TS of threeline grunt for accurate estimations of fish abundance.

  17. Families of Fokker-Planck equations and the associated entropic form for a distinct steady-state probability distribution with a known external force field.

    Science.gov (United States)

    Asgarani, Somayeh

    2015-02-01

    A method of finding entropic form for a given stationary probability distribution and specified potential field is discussed, using the steady-state Fokker-Planck equation. As examples, starting with the Boltzmann and Tsallis distribution and knowing the force field, we obtain the Boltzmann-Gibbs and Tsallis entropies. Also, the associated entropy for the gamma probability distribution is found, which seems to be in the form of the gamma function. Moreover, the related Fokker-Planck equations are given for the Boltzmann, Tsallis, and gamma probability distributions. PMID:25768455

  18. Structured Coupling of Probability Loss Distributions: Assessing Joint Flood Risk in Multiple River Basins.

    Science.gov (United States)

    Timonina, Anna; Hochrainer-Stigler, Stefan; Pflug, Georg; Jongman, Brenden; Rojas, Rodrigo

    2015-11-01

    Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest to estimate the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards. PMID:26010101
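    The effect the abstract warns about — that ignoring tail dependence between basins underestimates extreme joint losses — can be sketched with one upper-tail-dependent copula. Everything here (a survival-rotated Clayton copula sampled via the Marshall-Olkin algorithm, lognormal marginals, the parameter values) is an illustrative assumption, not the paper's specific coupling scheme:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n, theta = 20000, 2.0           # theta: Clayton dependence strength (assumed)

    # Sample a Clayton copula (Marshall-Olkin algorithm), then rotate it so the
    # dependence sits in the UPPER tail, i.e. joint extreme losses co-occur.
    v = rng.gamma(1.0 / theta, 1.0, size=n)
    e = rng.exponential(size=(n, 2))
    u = (1.0 + e / v[:, None]) ** (-1.0 / theta)   # Clayton: lower-tail dependent
    w = 1.0 - u                                    # survival rotation: upper tail

    # Map uniforms through each basin's marginal loss distribution (assumed lognormal)
    loss = stats.lognorm.ppf(w, s=1.0, scale=np.exp([3.0, 3.5]))

    coupled_total = loss.sum(axis=1)
    # Shuffle one margin to break the dependence while keeping the marginals
    independent_total = loss[:, 0] + rng.permutation(loss[:, 1])

    q = 0.995   # compare extreme quantiles of total loss
    q_dep = np.quantile(coupled_total, q)
    q_ind = np.quantile(independent_total, q)   # independence: smaller tail loss
    ```

    The dependent 99.5% quantile exceeds the independent one, which is precisely the underestimation an independence assumption would produce.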

  19. Understanding star formation in molecular clouds III. Probability distribution functions of molecular lines in Cygnus X

    CERN Document Server

    Schneider, N; Motte, F; Ossenkopf, V; Klessen, R S; Simon, R; Fechtenbaum, S; Herpin, F; Tremblin, P; Csengeri, T; Myers, P C; Hill, T; Cunningham, M; Federrath, C

    2015-01-01

    Column density (N) PDFs serve as a powerful tool to characterize the physical processes that influence the structure of molecular clouds. Star-forming clouds can best be characterized by lognormal PDFs for the lower N range and a power-law tail for higher N, commonly attributed to turbulence and self-gravity and/or pressure, respectively. We report here on PDFs obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region and compare to a PDF derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av~1-30, but is cut for higher Av due to optical depth effects. The PDFs of C18O and 13CO are mostly lognormal for Av~1-15, followed by excess up to Av~40. Above that value, all CO PDFs drop, most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av~15 and 400, respectively. The PDF from dust is lognormal for Av~2-15 and has a power-law tail up to Av~500. Absolute values for the molecular lin...

  20. Wigner quasi-probability distribution for the infinite square well: energy eigenstates and time-dependent wave packets

    CERN Document Server

    Belloni, M; Robinett, R W

    2003-01-01

    We calculate the Wigner quasi-probability distribution for position and momentum, P_W^(n)(x,p), for the energy eigenstates of the standard infinite well potential, using both x- and p-space stationary-state solutions, as well as visualizing the results. We then evaluate the time-dependent Wigner distribution, P_W(x,p;t), for Gaussian wave packet solutions of this system, illustrating both the short-term semi-classical time dependence, as well as longer-term revival and fractional revival behavior and the structure during the collapsed state. This tool provides an excellent way of demonstrating the patterns of highly correlated Schrödinger-cat-like `mini-packets' which appear at fractional multiples of the exact revival time.

  1. A laser speckle sensor to measure the distribution of static torsion angles of twisted targets

    DEFF Research Database (Denmark)

    Rose, B.; Imam, H.; Hanson, Steen Grüner;

    1998-01-01

    cylindrical lens serves to image the closely spaced lateral positions of the target along the twist axis onto corresponding lines of the two dimensional image sensor. Thus, every single line of the image sensor measures the torsion angle of the corresponding surface position along the twist axis of the target...

  2. A review of wind speed probability distributions used in wind energy analysis. Case studies in the Canary Islands

    Energy Technology Data Exchange (ETDEWEB)

    Carta, J.A. [Department of Mechanical Engineering, University of Las Palmas de Gran Canaria, Campus de Tafira s/n, 35017 Gran Canaria, Canary Islands (Spain); Ramirez, P. [Department of Renewable Energies, Technological Institute of the Canary Islands, Pozo Izquierdo Beach s/n, 35119 Santa Lucia, Gran Canaria, Canary Islands (Spain); Velazquez, S. [Department of Electronics and Automatics Engineering, University of Las Palmas de Gran Canaria, Campus de Tafira s/n, 35017 Gran Canaria, Canary Islands (Spain)

    2009-06-15

    The probability density function (PDF) of wind speed is important in numerous wind energy applications. A large number of studies have been published in scientific literature related to renewable energies that propose the use of a variety of PDFs to describe wind speed frequency distributions. In this paper a review of these PDFs is carried out. The flexibility and usefulness of the PDFs in the description of different wind regimes (high frequencies of null winds, unimodal, bimodal, bitangential regimes, etc.) is analysed for a wide collection of models. Likewise, the methods that have been used to estimate the parameters on which these models depend are reviewed and the degree of complexity of the estimation is analysed as a function of the model selected: these are the method of moments (MM), the maximum likelihood method (MLM) and the least squares method (LSM). In addition, a review is conducted of the statistical tests employed to see whether a sample of wind data comes from a population with a particular probability distribution. With the purpose of cataloguing the various PDFs, a comparison is made between them and the two parameter Weibull distribution (W.pdf), which has been the most widely used and accepted distribution in the specialised literature on wind energy and other renewable energy sources. This comparison is based on: (a) an analysis of the degree of fit of the continuous cumulative distribution functions (CDFs) for wind speed to the cumulative relative frequency histograms of hourly mean wind speeds recorded at weather stations located in the Canarian Archipelago; (b) an analysis of the degree of fit of the CDFs for wind power density to the cumulative relative frequency histograms of the cube of hourly mean wind speeds recorded at the aforementioned weather stations. The suitability of the distributions is judged from the coefficient of determination R².
Amongst the various conclusions obtained, it can be stated that the W.pdf presents a
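    The comparison pipeline the review describes — fit a two-parameter Weibull to hourly wind speeds (here by maximum likelihood) and judge the fit of its CDF against the empirical CDF via a coefficient of determination — can be sketched as follows. The wind-speed sample and the true Weibull parameters are assumed for illustration:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Hypothetical hourly mean wind speeds (m/s) from a 2-parameter Weibull
    v = stats.weibull_min.rvs(c=2.0, scale=7.0, size=1000, random_state=rng)

    # MLE fit of the W.pdf with location fixed at 0 (two-parameter form):
    # k is the shape parameter, A the scale parameter
    k, _, A = stats.weibull_min.fit(v, floc=0)

    # Degree of fit of the model CDF to the empirical CDF, judged by R^2
    v_sorted = np.sort(v)
    ecdf = np.arange(1, v.size + 1) / (v.size + 1)     # plotting positions
    model = stats.weibull_min.cdf(v_sorted, k, 0, A)
    r2 = 1.0 - np.sum((ecdf - model) ** 2) / np.sum((ecdf - ecdf.mean()) ** 2)
    ```

    Swapping `weibull_min` for any other candidate family and repeating the R² computation gives the kind of catalogue-wide comparison the paper performs.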

  3. Signatures of the various regions of the outer magnetosphere in the pitch angle distributions of energetic particles

    Energy Technology Data Exchange (ETDEWEB)

    West, H.I. Jr.

    1978-12-11

    An account is given of the observations of the pitch angle distributions of energetic particles in the near equatorial regions of the Earth's magnetosphere. The emphasis is on relating the observed distributions to the field configuration responsible for the observed effects. The observed effects relate to drift-shell splitting, to the breakdown of adiabatic guiding center motion in regions of sharp field curvature relative to particle gyroradii, to wave-particle interactions, and to moving field configurations. 39 references.

  4. Small-angle X-ray scattering determination of the distribution of particle diameters in photochromic glasses

    International Nuclear Information System (INIS)

    The existing methods for determining particle size distributions from small angle X-ray scattering data are reviewed. The improved transform technique was used for calculating diameter distributions N(D) of light-sensitive silver halide crystallites in photochromic glasses. From the evolution of N(D) during a certain heat treatment it can be concluded that two generations of crystallites of different size are precipitated. In glass I, the mean diameter D increases in proportion to the time t of the treatment (reaction-limited growth), and in glass II D³ ∝ t (diffusion-limited ripening) is obtained. (author)

  5. Azimuth angle distribution of thermal-infrared temperature over rice canopy with row orientation

    International Nuclear Information System (INIS)

    Using ground-based and airborne observation, as well as numerical simulation, we confirmed that the thermal-infrared temperature (TIT) of a rice canopy surface with row orientation changes with azimuth viewing angle. The TIT of the direction parallel to row orientation is 1-4 °C higher than that of the other directions. The TIT differences occur during the daytime, and for a leaf area index (LAI) around 0.5-3, because the field of view of an infrared thermometer viewing a direction parallel to the rows contains much more of the water surface under the rice canopy than the plant surface of the canopy. The temperature of the water surface between rows is much higher than that of the plant surface, because the intense incoming solar radiation near noon is not absorbed by the canopy and so warms the water efficiently. Matsushima and Kondo (1997) developed a radiation transfer model for TIT of a rice canopy surface, and confirmed a nadir viewing-angle dependence of TIT according to leaf area index. Based on the above model, a model of a rice canopy with row orientation was developed to investigate the TIT variation with azimuth viewing angle. The model design employs the ratio of the apparent areas of the plant surface and the underground water surface, which change with the azimuth and nadir viewing angles, and reproduces the observation well. These results indicate that the main cause of the TIT difference is the ratio of the apparent areas of the plant surface and the water surface when the temperature of the water surface is much higher than that of the plant surface. The TIT in a westerly direction exceeds that of the other directions shortly after sunrise because the solar elevation is low and the azimuth of the sun is around east. This is because the plant surface temperature exceeds that of the water surface, which is the opposite of the near-noon cases. 
On the scale of a satellite grid, a simple numerical experiment demonstrated that the TIT difference of azimuth

  6. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-01

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ . PMID:23163785
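    The binomial backbone of such scoring can be sketched in a few lines: if each of n theoretical fragment peaks matches an experimental peak by chance with probability p, the significance of observing k or more matches is a binomial tail probability. The numbers below are illustrative assumptions, and this is only the generic binomial idea, not ProVerB's full intensity-weighted scoring function:

    ```python
    import math
    from scipy import stats

    # Hypothetical spectrum match: n theoretical fragment peaks, k observed
    # matches, p = chance of a single random match (all values illustrative)
    n, k, p = 20, 12, 0.1

    p_random = stats.binom.sf(k - 1, n, p)      # P(X >= k) under random matching
    score = -10.0 * math.log10(p_random)        # higher score = harder to explain by chance
    ```

    With these values the random-match probability is far below 10⁻⁶, so the candidate peptide would score very highly relative to chance.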

  7. Visualization of the operational space of edge-localized modes through low-dimensional embedding of probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Noterdaeme, J. M. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Max-Planck-Institut für Plasmaphysik, Garching D-85748 (Germany); Verdoolaege, G. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Laboratoire de Physique des Plasmas de l' ERM, Laboratorium voor Plasmafysica van de KMS (LPP-ERM/KMS), Ecole Royale Militaire, Koninklijke Militaire School, B-1000 Brussels (Belgium); Kardaun, O. J. W. F. [Max-Planck-Institut für Plasmaphysik, Garching D-85748 (Germany); Collaboration: JET-EFDA Team

    2014-11-15

    Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.

  8. Visualization of the operational space of edge-localized modes through low-dimensional embedding of probability distributions

    International Nuclear Information System (INIS)

    Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well

  9. Wave function and the probability current distribution for a bound electron moving in a uniform magnetic field

    Science.gov (United States)

    Rodionov, V. N.; Kravtsova, G. A.; Mandel', A. M.

    2010-07-01

    We study the effects of electromagnetic fields on nonrelativistic charged spinning particles bound by a short-range potential. We analyze the exact solution of the Pauli equation for an electron moving in the potential field determined by the three-dimensional δ-well in the presence of a strong magnetic field. We obtain asymptotic expressions for this solution for different values of the problem parameters. In addition, we consider electron probability currents and their dependence on the magnetic field. We show that including the spin in the framework of the nonrelativistic approach allows correctly taking the effect of the magnetic field on the electric current into account. The obtained dependences of the current distribution, which is an experimentally observable quantity, can be manifested directly in scattering processes, for example.

  10. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
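    The cumulative-probability granulation step can be sketched simply: cut the universe of discourse at equal-cumulative-probability points so each linguistic interval covers roughly the same share of observations. The price series, bin count, and quantile-based cut rule below are illustrative assumptions rather than the paper's exact procedure:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    # Hypothetical daily closing prices (the universe of discourse)
    prices = 100 + np.cumsum(rng.normal(0, 1.5, size=500))

    # Cumulative-probability partitioning: cut points at equal-probability
    # quantiles, so each linguistic interval ("very low" ... "very high")
    # receives about the same number of observations.
    n_bins = 7
    edges = np.quantile(prices, np.linspace(0.0, 1.0, n_bins + 1))
    labels = np.digitize(prices, edges[1:-1])    # linguistic label 0..n_bins-1
    counts = np.bincount(labels, minlength=n_bins)
    ```

    Equal-mass intervals avoid the near-empty linguistic terms that equal-width partitions produce on skewed price data, which is the motivation for this granulation approach.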

  11. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which are functions that describe the relative likelihood for the emission factors and energy efficiencies as random variables to take on a given value by the integral of their own probability distributions, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life

  12. Constructing probability distributions of uncertain variables in models of the performance of the Waste Isolation Pilot Plant: The 1990 performance simulations

    International Nuclear Information System (INIS)

    A five-step procedure was used in the 1990 performance simulations to construct probability distributions of the uncertain variables appearing in the mathematical models used to simulate the Waste Isolation Pilot Plant's (WIPP's) performance. This procedure provides a consistent approach to the construction of probability distributions in cases where empirical data concerning a variable are sparse or absent and minimizes the amount of spurious information that is often introduced into a distribution by assumptions of nonspecialists. The procedure gives first priority to the professional judgment of subject-matter experts and emphasizes the use of site-specific empirical data for the construction of the probability distributions when such data are available. In the absence of sufficient empirical data, the procedure employs the Maximum Entropy Formalism and the subject-matter experts' subjective estimates of the parameters of the distribution to construct a distribution that can be used in a performance simulation. (author)
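    The Maximum Entropy Formalism step can be illustrated for the common case where an expert supplies only a range and a mean: on a bounded interval, the maximum-entropy density matching a mean constraint is a truncated exponential, whose rate can be found by bisection. The interval [0, 1] and the target mean below are illustrative, not WIPP parameters:

```python
import math

def trunc_exp_mean(lam):
    """Mean of the maximum-entropy density p(x) ~ exp(-lam*x) on [0, 1]."""
    if abs(lam) < 1e-9:          # lam -> 0 limit is the uniform density
        return 0.5
    return 1.0 / lam - math.exp(-lam) / (1.0 - math.exp(-lam))

def solve_rate(target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Bisect for the rate whose truncated-exponential mean matches.

    trunc_exp_mean is strictly decreasing in lam, so bisection applies.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if trunc_exp_mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# expert elicitation (hypothetical): support [0, 1], mean 0.3
lam = solve_rate(0.3)
```

    A mean below the interval midpoint yields a positive rate (density decaying toward the upper bound), matching the intuition that probability mass concentrates near the more likely values.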

  13. Probability distribution function values in mobile phones;Valores de funciones de distribución de probabilidad en el teléfono móvil

    OpenAIRE

    Luis Vicente Chamorro Marcillo; Oscar Revelo Sánchez

    2013-01-01

    Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Managing those tables poses physical problems (cumbersome transport and consultation) as well as operational ones (incomplete lists and limited accuracy). The study, “Probability distribution function values in mobil...

  14. Fast chemical reaction in a two-dimensional Navier-Stokes flow: Probability distribution in the initial regime

    CERN Document Server

    Ait-Chaalal, Farid; Bartello, Peter

    2011-01-01

    We study an instantaneous bimolecular chemical reaction in a two-dimensional chaotic, incompressible and closed Navier-Stokes flow. Areas of well mixed reactants are initially separated by infinite gradients. We focus on the initial regime, characterized by a well-defined one-dimensional contact line between the reactants. The amount of reactant consumed is given by the diffusive flux along this line, and hence relates directly to its length and to the gradients along it. We show both theoretically and numerically that the probability distribution of the modulus of the gradient of the reactants along this contact line multiplied by {\\kappa} does not depend on the diffusion {\\kappa} and can be inferred, after a few turnover times, from the joint distribution of the finite time Lyapunov exponent {\\lambda} and the frequency 1/{\\tau} . The equivalent time {\\tau} measures the stretching time scale of a Lagrangian parcel in the recent past, while {\\lambda} measures it on the whole chaotic orbit. At smaller times, w...

  15. IGM Constraints from the SDSS-III/BOSS DR9 Ly-alpha Forest Flux Probability Distribution Function

    CERN Document Server

    Lee, Khee-Gan; Spergel, David N; Weinberg, David H; Hogg, David W; Viel, Matteo; Bolton, James S; Bailey, Stephen; Pieri, Matthew M; Carithers, William; Schlegel, David J; Lundgren, Britt; Palanque-Delabrouille, Nathalie; Suzuki, Nao; Schneider, Donald P; Yeche, Christophe

    2014-01-01

    The Ly$\\alpha$ forest flux probability distribution function (PDF) is an established probe of intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the flux PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from SDSS Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS flux PDFs, measured at $\\langle z \\rangle = [2.3,2.6,3.0]$, are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, $\\gamma$, and temperature at mean-density, $T_0$, where $T(\\Delta) = T_0 \\Delta^{\\gamma-1}$. We find that a significant population of partial Lyman-limit systems with a column-density distribution slope of $\\beta_\\mathrm{pLLS} \\sim -2$ are required to explain the data at the low-flux end of the flux PDF, while uncertainties in the mean Ly$\\alpha$ forest transmission affect the...

  16. Determination of the spatial distribution of multiple fluid phases in porous media by ultra-small-angle neutron scattering

    International Nuclear Information System (INIS)

    In the present work contrast-matching USANS (ultra-small-angle neutron scattering) was employed in order to determine the spatial distribution of immiscible fluids confined within a macroporous α-Al2O3 membrane. Water-air as well as water-hydrocarbon and hydrocarbon-air systems were examined, and the analysis of the results, also on the basis of a complementary numerical study, provided significant information on the behaviour of the multiphase ensemble: it was demonstrated that the individual fluids occupy certain positions in the pore space, regardless of the actual values of the respective interfacial properties.

  17. Efficiency enhancement and angle-dependent color change in see-through organic photovoltaics using distributed Bragg reflectors

    Science.gov (United States)

    Dong, Wan Jae; Lo, Nhat-Truong; Jung, Gwan Ho; Ham, Juyoung; Lee, Jong-Lam

    2016-03-01

    A distributed Bragg reflector (DBR) is employed as a bottom reflector in see-through organic photovoltaics (OPVs) with an active layer of poly(3-hexylthiophene) and phenyl-C61-butyric acid methyl ester (P3HT:PCBM). The DBR consists of alternating layers of the high- and low-refractive-index materials Ta2O5 (n = 2.16) and SiO2 (n = 1.46). The DBR selectively reflects light within the specific wavelength region (490 nm-630 nm) where the absorbance of P3HT:PCBM is maximum. The see-through OPVs fabricated on the DBR exhibit a 31% efficiency enhancement compared to the device without a DBR. Additionally, the angle-dependent transmittance of the DBR is analysed using optical simulation and verified by experimental results. As the incident angle of light increases, the reflectance peak shifts to shorter wavelengths and the bandwidth narrows. These unique angle-dependent optical properties of the DBR allow a facile color change in see-through OPVs.

  18. Generating Within-Plant Spatial Distributions of an Insect Herbivore Based on Aggregation Patterns and Per-Node Infestation Probabilities.

    Science.gov (United States)

    Rincon, Diego F; Hoy, Casey W; Cañas, Luis A

    2015-04-01

    Most predator-prey models extrapolate functional responses from small-scale experiments assuming spatially uniform within-plant predator-prey interactions. However, some predators focus their search in certain plant regions, and herbivores tend to select leaves to balance their nutrient uptake and exposure to plant defenses. Individual-based models that account for heterogeneous within-plant predator-prey interactions can be used to scale up functional responses, but they would require the generation of explicit prey spatial distributions within plant architecture models. The silverleaf whitefly, Bemisia tabaci biotype B (Gennadius) (Hemiptera: Aleyrodidae), is a significant pest of tomato crops worldwide that exhibits highly aggregated populations at several spatial scales, including within the plant. As part of an analytical framework to understand predator-silverleaf whitefly interactions, the objective of this research was to develop an algorithm to generate explicit spatial counts of silverleaf whitefly nymphs within tomato plants. The algorithm requires the plant size and the number of silverleaf whitefly individuals to distribute as inputs, and includes models that describe infestation probabilities per leaf nodal position and the aggregation pattern of the silverleaf whitefly within tomato plants and leaves. The output is a simulated number of silverleaf whitefly individuals for each leaf and leaflet on one or more plants. Parameter estimation was performed using nymph counts per leaflet censused from 30 artificially infested tomato plants. Validation revealed a substantial agreement between algorithm outputs and independent data that included the distribution of counts of both eggs and nymphs. This algorithm can be used in simulation models that explore the effect of local heterogeneity on whitefly-predator dynamics. PMID:26313173
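    A minimal sketch of such a generation algorithm, with hypothetical per-node weights and a Dirichlet-multinomial draw standing in for the fitted infestation-probability and aggregation models (the real parameters were estimated from the tomato census data):

```python
import math
import random

def node_weights(n_leaves, peak=0.3):
    """Hypothetical infestation weight per leaf nodal position:
    highest near a preferred relative height, decaying away from it."""
    return [math.exp(-abs(i / (n_leaves - 1) - peak) * 4.0)
            for i in range(n_leaves)]

def allocate(n_insects, n_leaves, clumping=2.0, rng=random):
    """Distribute n_insects over leaves with a Dirichlet-multinomial draw;
    the concentration (clumping * weight) controls aggregation strength."""
    gammas = [rng.gammavariate(clumping * w, 1.0) for w in node_weights(n_leaves)]
    total = sum(gammas)
    probs = [g / total for g in gammas]
    counts = [0] * n_leaves
    for _ in range(n_insects):           # multinomial draw, one insect at a time
        u, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if u <= acc:
                counts[i] += 1
                break
        else:                            # guard against float rounding
            counts[-1] += 1
    return counts

random.seed(1)
counts = allocate(200, 12)               # 200 nymphs over a 12-leaf plant
```

    Lower `clumping` values produce more aggregated (overdispersed) leaf counts, which is the qualitative behavior the record's aggregation model captures.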

  19. Angle-resolved energy distributions of laser ablated silver ions in vacuum

    DEFF Research Database (Denmark)

    Hansen, T.N.; Schou, Jørgen; Lunney, J.G.

    1998-01-01

    The energy distributions of ions ablated from silver in vacuum have been measured in situ for pulsed laser irradiation at 355 nm. We have determined the energy spectra for directions ranging from 5 degrees to 75 degrees with respect to the normal in the intensity range from 100 to 400 MW/cm(2). At...

  20. Remote Sensing of Spatial Distributions of Greenhouse Gases in the Los Angeles Basin

    Science.gov (United States)

    Fu, Dejian; Pongetti, Thomas J.; Sander, Stanley P.; Cheung, Ross; Stutz, Jochen; Park, Chang Hyoun; Li, Qinbin

    2011-01-01

    The Los Angeles air basin is a significant anthropogenic source of greenhouse gases and pollutants including CO2, CH4, N2O, and CO, contributing significantly to regional and global climate change. Recent legislation in California, the California Global Warming Solutions Act (AB32), established a statewide cap for greenhouse gas emissions for 2020 based on 1990 emissions. Verifying the effectiveness of regional greenhouse gas emissions controls requires high-precision, regional-scale measurement methods combined with models that capture the principal anthropogenic and biogenic sources and sinks. We present a novel approach for monitoring the spatial distributions of greenhouse gases in the Los Angeles basin using high resolution remote sensing spectroscopy. We participated in the CalNex 2010 campaign to provide greenhouse gas distributions for comparison between top-down and bottom-up emission estimates.

  1. Using Hybrid Angle/Distance Information for Distributed Topology Control in Vehicular Sensor Networks

    OpenAIRE

    Chao-Chi Huang; Yang-Hung Chiu; Chih-Yu Wen

    2014-01-01

    In a vehicular sensor network (VSN), the key design issue is how to organize vehicles effectively, such that the local network topology can be stabilized quickly. In this work, each vehicle with on-board sensors can be considered as a local controller associated with a group of communication members. In order to balance the load among the nodes and govern the local topology change, a group formation scheme using localized criteria is implemented. The proposed distributed topology control meth...

  2. IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Khee-Gan; Hennawi, Joseph F. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Spergel, David N. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Weinberg, David H. [Department of Astronomy and Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, Meyer Hall of Physics, New York, NY 10003 (United States); Viel, Matteo [INAF, Osservatorio Astronomico di Trieste, Via G. B. Tiepolo 11, I-34131 Trieste (Italy); Bolton, James S. [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Bailey, Stephen; Carithers, William; Schlegel, David J. [E.O. Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Pieri, Matthew M. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth PO1 3FX (United Kingdom); Lundgren, Britt [Department of Astronomy, University of Wisconsin, Madison, WI 53706 (United States); Palanque-Delabrouille, Nathalie; Yèche, Christophe [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); Suzuki, Nao [Kavli Institute for the Physics and Mathematics of the Universe (IPMU), The University of Tokyo, Kashiwano-ha 5-1-5, Kashiwa-shi, Chiba (Japan); Schneider, Donald P., E-mail: lee@mpia.de [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2015-02-01

    The Lyα forest transmission probability distribution function (PDF) is an established probe of intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at (z) = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T {sub 0}, where T(Δ) = T {sub 0}Δ{sup γ–1}. We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β{sub pLLS} ∼ –2 are required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T {sub 0} are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.

  3. IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION

    International Nuclear Information System (INIS)

    The Lyα forest transmission probability distribution function (PDF) is an established probe of intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at (z) = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T0, where T(Δ) = T0Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of βpLLS ∼ −2 are required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T0 are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model
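    The quoted temperature-density relation is a simple power law; with illustrative values (the T0 and γ below are placeholders, not the paper's best-fit numbers) it evaluates directly:

```python
def igm_temperature(delta, T0=1.2e4, gamma=1.6):
    """Power-law temperature-density relation T(delta) = T0 * delta**(gamma - 1)."""
    return T0 * delta ** (gamma - 1.0)

# at mean density (delta = 1) the relation recovers T0;
# for gamma > 1, mild overdensities are hotter than the mean
print(igm_temperature(1.0))   # 12000.0
print(igm_temperature(2.0))
```

    An isothermal relation corresponds to γ = 1 (T independent of Δ) and an inverted one to γ < 1, which is what the record reports as disfavored.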

  4. The distance and luminosity probability distributions derived from parallax and flux with their measurement errors with application to the millisecond pulsar PSR J0218+4232

    CERN Document Server

    Igoshev, A P; Cator, E

    2016-01-01

    We use a Bayesian approach to derive the distance probability distribution for one object from its parallax with measurement uncertainty for two spatial distribution priors, viz. a homogeneous spherical distribution and a galactocentric distribution - applicable for radio pulsars - observed from Earth. We investigate the dependence on measurement uncertainty, and show that a parallax measurement can underestimate or overestimate the actual distance, depending on the spatial distribution prior. We derive the probability distributions for distance and luminosity combined, and for each separately, when a flux with measurement error for the object is also available, and demonstrate the necessity of and dependence on the luminosity function prior. We apply this to estimate the distance and the radio and gamma-ray luminosities of PSR J0218+4232. The use of realistic priors improves the quality of the estimates for distance and luminosity, compared to those based on measurement only. Use of a wrong prior, for exampl...
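    The Bayesian step for the homogeneous spherical prior can be sketched on a grid: the posterior is the Gaussian parallax likelihood times a p(d) ∝ d² prior out to a cutoff. The parallax value, its uncertainty, and the cutoff below are illustrative, not the PSR J0218+4232 measurements:

```python
import math

def distance_posterior(parallax, sigma, d_max=2.0, n=2000):
    """Posterior over distance d (units of 1/parallax) on a grid:
    Gaussian parallax likelihood times a homogeneous-density prior d**2."""
    dd = d_max / n
    ds = [(i + 0.5) * dd for i in range(n)]
    post = [d ** 2 * math.exp(-0.5 * ((parallax - 1.0 / d) / sigma) ** 2)
            for d in ds]
    norm = sum(post) * dd
    return ds, [p / norm for p in post]

# e.g. parallax 2.0 mas with 0.3 mas error: naive distance 1/2.0 = 0.5 kpc
ds, post = distance_posterior(parallax=2.0, sigma=0.3)
mode = ds[max(range(len(post)), key=post.__getitem__)]
```

    The d² prior pushes the posterior mode slightly beyond the naive 1/parallax distance, illustrating the record's point that the inferred distance depends on the spatial-distribution prior, not on the parallax alone.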

  5. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." Calculation of the WIPP system's overall probability distribution is required for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features

  6. Energy distributions of plume ions from silver at different angles ablated in vacuum

    DEFF Research Database (Denmark)

    Christensen, Bo Toftmann; Schou, Jørgen; Canulescu, Stela

    A typical pulsed laser deposition (PLD) is carried out for a fluence between 0.5 and 2.5 J/cm2. The ablated particles are largely neutrals at the lowest fluence, but the fraction of ions increases strongly with fluence and accounts for more than 0.5 of the particles at 2.5 J/cm2 [1,2]. Since it may be comparatively difficult to measure the energy and angular distribution of neutrals, measurements of the ionic fraction will be valuable for any modeling of PLD. We have irradiated silver in a vacuum chamber (~ 10-7 mbar) with a Nd:YAG laser at a wavelength of 355 nm and made detailed measurements of the time...

  7. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  8. Control of the Diameter and Chiral Angle Distributions during Production of Single-Wall Carbon Nanotubes

    Science.gov (United States)

    Nikolaev, Pavel

    2009-01-01

    Many applications of single wall carbon nanotubes (SWCNT), especially in microelectronics, will benefit from use of certain (n,m) nanotube types (metallic, small gap semiconductor, etc.). Especially fascinating is the possibility of quantum conductors that require metallic armchair nanotubes. However, as-produced SWCNT samples are polydisperse, with many (n,m) types present and a typical approx. 1:2 metal/semiconductor ratio. Nanotube nucleation models predict that armchair nuclei are energetically preferential due to formation of partial triple bonds along the armchair edge. However, nuclei cannot reach any meaningful thermal equilibrium in a rapidly expanding and cooling plume of carbon clusters, leading to polydispersity. In the present work, SWCNTs were produced by a pulsed laser vaporization (PLV) technique. The carbon vapor plume cooling rate was either increased by change in the oven temperature (expansion into colder gas), or decreased via "warm-up" with a laser pulse at the moment of nucleation. The effect of oven temperature and "warm-up" on nanotube type population was studied via photoluminescence, UV-Vis-NIR absorption and Raman spectroscopy. It was found that reduced temperatures lead to smaller average diameters, progressively narrower diameter distributions, and some preference toward armchair structures. "Warm-up" shifts the nanotube population towards armchair structures as well, but the effect is small. Possible improvement of the "warm-up" approach to produce armchair SWCNTs will be discussed. These results demonstrate that the PLV production technique can provide at least partial control over the nanotube (n,m) population. In addition, these results have implications for understanding the nanotube nucleation mechanism in the laser oven.

  9. Establishment of uncertainty ranges and probability distributions of actinide solubilities for performance assessment in the Waste Isolation Pilot Plant (WIPP)

    International Nuclear Information System (INIS)

    The Fracture-Matrix Transport (FMT) code developed at Sandia National Laboratories solves chemical equilibrium problems using the Pitzer activity coefficient model with a database containing actinide species. The code is capable of predicting actinide solubilities at 25 °C in various ionic-strength solutions from dilute groundwaters to high-ionic-strength brines. The code uses oxidation state analogies, i.e., Am(III) is used to predict solubilities of actinides in the +III oxidation state; Th(IV) is used to predict solubilities of actinides in the +IV state; Np(V) is utilized to predict solubilities of actinides in the +V state. This code has been qualified for predicting actinide solubilities for the Waste Isolation Pilot Plant (WIPP) Compliance Certification Application in 1996, and Compliance Re-Certification Applications in 2004 and 2009. We have established revised actinide-solubility uncertainty ranges and probability distributions for Performance Assessment (PA) by comparing actinide solubilities predicted by the FMT code with solubility data in various solutions from the open literature. The literature data used in this study include solubilities in simple solutions (NaCl, NaHCO3, Na2CO3, NaClO4, KCl, K2CO3, etc.), binary mixing solutions (NaCl+NaHCO3, NaCl+Na2CO3, KCl+K2CO3, etc.), ternary mixing solutions (NaCl+Na2CO3+KCl, NaHCO3+Na2CO3+NaClO4, etc.), and multi-component synthetic brines relevant to the WIPP.

  10. Asymmetric distribution of cone-shaped lipids in a highly curved bilayer revealed by a small angle neutron scattering technique

    International Nuclear Information System (INIS)

    We have investigated the lipid sorting in a binary small unilamellar vesicle (SUV) composed of cone-shaped (1,2-dihexanoyl-sn-glycero-3-phosphocholine: DHPC) and cylinder-shaped (1,2-dipalmitoyl-sn-glycero-3-phosphocholine: DPPC) lipids. In order to reveal the lipid sorting we adopted a contrast matching technique of small angle neutron scattering (SANS), which extracts the distribution of deuterated lipids in the bilayer quantitatively without steric modification of lipids as in fluorescence probe techniques. First the SANS profile of protonated SUVs at a film contrast condition showed that SUVs have a spherical shape with an inner radius of 190 Å and a bilayer thickness of 40 Å. The SANS profile of deuterated SUVs at a contrast matching condition showed a characteristic scattering profile, indicating an asymmetric distribution of cone-shaped lipids in the bilayer. The characteristic profile was described well by a spherical bilayer model. The fitting revealed that most DHPC molecules are localized in the outer leaflet. Thus the shape of the lipid is strongly coupled with the membrane curvature. We compared the obtained asymmetric distribution of the cone-shaped lipids in the bilayer with the theoretical prediction based on the curvature energy model.

  11. Distribution angle control of a light-emitting diode downlight lens with high color uniformity using a scattering polymer

    Science.gov (United States)

    Mochizuki, Keiichi; Oosumi, Kazumasa; Koizumi, Fumiaki; Shinohara, Yoshinori; Tagaya, Akihiro; Koike, Yasuhiro

    2015-06-01

    We have proposed a light-emitting diode (LED) downlight lens that is made of a highly scattered optical transmission (HSOT) polymer. The HSOT polymer contains optimized heterogeneous structures that produce homogeneously scattered light with forward directivity. The full width at half maximum of the illuminance distribution angle can be increased from 16.7° to 37.9° as the concentration of the scattering particles in the HSOT polymer LED downlight lenses of identical shape is increased from 0.015 to 0.100 wt%. The color variation across the illuminated area is too small to be discerned by the human eye, and the output efficiency remains high, greater than 85%.

  12. Lexicographic probability, conditional probability, and nonstandard probability

    OpenAIRE

    Halpern, Joseph Y.

    2003-01-01

    The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are ...

  13. Numerical Renormalization Group Study of Probability Distributions for Local Fluctuations in the Anderson-Holstein and Holstein-Hubbard Models

    OpenAIRE

    Hewson, Alex C.; Bauer, Johannes

    2010-01-01

    We show that information on the probability density of local fluctuations can be obtained from a numerical renormalisation group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density $\\rho(x)$ for the displacement $x$ of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation...

  14. An investigation on effect of geometrical parameters on spray cone angle and droplet size distribution of a two-fluid atomizer

    International Nuclear Information System (INIS)

    A visual study is conducted to determine the effect of geometrical parameters of a two-fluid atomizer on its spray cone angle. The liquid (water) jets exit from six peripheral inclined orifices and are introduced to a high speed gas (air) stream in the gravitational direction. Using a high speed imaging system, the spray cone angle has been determined in constant operational conditions, i.e., Reynolds and Weber numbers for different nozzle geometries. Also, the droplet sizes (Sauter mean diameter) and their distributions have been determined using a Malvern Mastersizer X. The investigated geometrical parameters are the liquid jet diameter, liquid port angle and the length of the gas-liquid mixing chamber. The results show that among these parameters, the liquid jet diameter has a significant effect on spray cone angle. In addition, an empirical correlation has been obtained to predict the spray cone angle of the present two-fluid atomizer in terms of nozzle geometries

  15. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    Science.gov (United States)

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. PMID:26688561
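    The lognormal MLE used in the record above reduces to a normal fit of the log incubation periods, with the distribution mean then given by exp(μ + σ²/2). A sketch with synthetic data (the parameters below are chosen to resemble, not reproduce, the Tokyo estimates):

```python
import math
import random
import statistics

def fit_lognormal(days):
    """MLE for a lognormal: mu, sigma are the mean and (population)
    standard deviation of the log-transformed observations."""
    logs = [math.log(d) for d in days]
    return statistics.fmean(logs), statistics.pstdev(logs)

def lognormal_mean(mu, sigma):
    """Mean of the fitted incubation-period distribution."""
    return math.exp(mu + sigma ** 2 / 2)

random.seed(42)
# synthetic incubation periods (days) for 98 hypothetical cases
true_mu, true_sigma = 3.08, 0.65
data = [random.lognormvariate(true_mu, true_sigma) for _ in range(98)]
mu, sigma = fit_lognormal(data)
```

    With a sample of this size the fitted parameters land close to the generating values, and `lognormal_mean(mu, sigma)` returns a mean incubation period of a few weeks, the same order as the record's 27.30 days.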

  16. Determining the Probability Distribution of Hillslope Peak Discharge Using an Analytical Solution of Kinematic Wave Time of Concentration

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2016-04-01

    extended to the case of pervious hillslopes, accounting for infiltration. In particular, an analytical solution for the time of concentration for overland flow on a rectangular plane surface was derived using the kinematic wave equation under the Green-Ampt infiltration (Baiamonte and Singh, 2015). The objective of this work is to apply the latter solution to determine the probability distribution of hillslope peak discharge by combining it with the familiar rainfall duration-intensity-frequency approach. References Agnese, C., Baiamonte, G., and Corrao, C. (2001). "A simple model of hillslope response for overland flow generation". Hydrol. Process., 15, 3225-3238, ISSN: 0885-6087, doi: 10.1002/hyp.182. Baiamonte, G., and Agnese, C. (2010). "An analytical solution of kinematic wave equations for overland flow under Green-Ampt infiltration". J. Agr. Eng., vol. 1, p. 41-49, ISSN: 1974-7071. Baiamonte, G., and Singh, V.P., (2015). "Analytical solution of kinematic wave time of concentration for overland flow under Green-Ampt Infiltration." J Hydrol E - ASCE, DOI: 10.1061/(ASCE)HE.1943-5584.0001266. Robinson, J.S., and Sivapalan, M. (1996). "Instantaneous response functions of overland flow and subsurface stormflow for catchment models". Hydrol. Process., 10, 845-862. Singh, V.P. (1976). "Derivation of time of concentration". J. of Hydrol., 30, 147-165. Singh, V.P., (1996). Kinematic-Wave Modeling in Water Resources: Surface-Water Hydrology. John Wiley & Sons, Inc., New York, 1399 pp.

  17. Estimating extreme flood probabilities

    International Nuclear Information System (INIS)

    Estimates of the exceedance probabilities of extreme floods are needed for the assessment of flood hazard at Department of Energy facilities. A new approach using a joint probability distribution of extreme rainfalls and antecedent soil moisture conditions, along with a rainfall runoff model, provides estimates of probabilities for floods approaching the probable maximum flood. This approach is illustrated for a 570 km2 catchment in Wisconsin and a 260 km2 catchment in Tennessee.
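The joint-probability idea can be illustrated with a toy Monte Carlo sketch: sample extreme rainfall and antecedent soil moisture jointly, pass each pair through a simple rainfall-runoff relation, and count exceedances. The distributions and the SCS-style runoff formula below are illustrative assumptions, not the report's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

rain = rng.gumbel(loc=80.0, scale=25.0, size=n)   # annual-max storm depth, mm (assumed)
rain = np.clip(rain, 0.0, None)
soil = rng.beta(2.0, 2.0, size=n)                 # antecedent wetness in [0, 1] (assumed)

# SCS-like retention: wetter antecedent soil -> smaller available storage S
s_max, s_min = 150.0, 20.0                        # mm, assumed bounds
S = s_max - (s_max - s_min) * soil
runoff = np.where(rain > 0.2 * S,
                  (rain - 0.2 * S) ** 2 / (rain + 0.8 * S),
                  0.0)                            # direct runoff depth, mm

threshold = 120.0                                 # runoff depth treated as "extreme flood"
p_exceed = np.mean(runoff > threshold)
print(f"estimated annual exceedance probability: {p_exceed:.2e}")
```

Because rainfall and soil moisture are sampled jointly, the estimate automatically integrates over antecedent conditions rather than conditioning on a single design storm.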

  18. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  19. Evaluation of size distribution of starch granules in selected wheat varieties by the Low Angle Laser Light Scattering method

    International Nuclear Information System (INIS)

    The size distribution of wheat starch granules was determined by the Low Angle Laser Light Scattering (LALLS) method, followed by an evaluation of the effect of variety, experimental site and intensity of cultivation on the vol. % of starch A (starch granules > 10 μm). The total starch content and crude protein content in the dry matter of T530 flour were determined for a selected collection of five winter wheat varieties. The vol. % of starch A in the evaluated collection of wheat varieties varied between 65.31 and 72.34%. The effect of variety on the vol. % of starch A appeared more marked than the effects of site and intensity of cultivation. The highest vol. % of starch A was reached by the varieties from quality group C, i.e. varieties unsuitable for baking use (except the variety Contra, which had a high total starch content in the dry matter of T530 flour but a relatively low vol. % of starch A). A low vol. % of starch A was also found in the variety Hana (a very good baking variety). Certain varietal differences emerged from the distribution of the granule-size fractions that form starch A. In the varieties Hana, Contra and Siria, fractions up to 30 μm were more strongly represented, whereas in the varieties Estica and Versailles starch A was formed to a greater degree by granule-size fractions over 30 μm; the fraction > 50 μm was largest in these two varieties among all evaluated samples. With increasing total starch content in the dry matter of T530 flour the crude protein content decreased; the vol. % of starch A did not always increase proportionally with total starch content. (author)

  20. Peptide-induced Asymmetric Distribution of Charged Lipids in a Vesicle Bilayer Revealed by Small-Angle Neutron Scattering

    Science.gov (United States)

    Heller, William; Qian, Shuo

    2012-02-01

    Cellular membranes are complex mixtures of lipids, proteins and other small molecules that provide functional, dynamic barriers between the cell and its environment, as well as between environments within the cell. The lipid composition of the membrane is highly specific and controlled in terms of both content and lipid localization. Here, small-angle neutron scattering and selective deuterium labeling were used to probe the impact of the membrane-active peptides melittin and alamethicin on the structure of lipid bilayers composed of a mixture of the lipids dimyristoyl phosphatidylglycerol (DMPG) and chain-perdeuterated dimyristoyl phosphatidylcholine (DMPC). We found that both peptides enriched the outer leaflet of the bilayer with the negatively charged DMPG, creating an asymmetric distribution of lipids. The level of enrichment is peptide concentration-dependent and is stronger for melittin than alamethicin. The enrichment between the inner and outer bilayer leaflets occurs at very low peptide concentrations, and increases with peptide concentration, including when the peptide adopts a membrane-spanning, pore-forming state.

  1. The distribution of Sr2+ counterions around polyacrylate chains analyzed by anomalous small-angle X-ray scattering

    Science.gov (United States)

    Goerigk, G.; Schweins, R.; Huber, K.; Ballauff, M.

    2004-05-01

    The distribution of Sr counterions around negatively charged sodium polyacrylate chains (NaPA) in aqueous solution was studied by anomalous small-angle X-ray scattering. Different ratios of the concentrations of SrCl2/[NaPA] reveal dramatic changes in the scattering curves. At the lower ratio the scattering curves indicate a coil-like behavior, while at the higher ratio the scattering curves are contracted to smaller q-values, caused by the collapse of the NaPA coil. The form factor of the scattering contribution of the counterions was separated and analyzed. For the scattering curves of the collapsed chains, this analysis agrees with the model of a pearl necklace, consisting of collapsed sphere-like subdomains which are connected by stretched chain segments. An averaged radius of the pearls of 19 nm and a distance between neighbouring pearls close to 60 nm could be established for the collapsed state of the NaPA chains.

  2. Impact of pitch angle setup error and setup error correction on dose distribution in volumetric modulated arc therapy for prostate cancer.

    Science.gov (United States)

    Takemura, Akihiro; Togawa, Kumiko; Yokoi, Tomohiro; Ueda, Shinichi; Noto, Kimiya; Kojima, Hironori; Isomura, Naoki; Kumano, Tomoyasu

    2016-07-01

    In volumetric modulated arc therapy (VMAT) for prostate cancer, a positional and rotational error correction is performed according to the position and angle of the prostate. The correction often involves body leaning, and there is concern regarding variation in the dose distribution. Our purpose in this study was to evaluate the impact of body pitch rotation on the dose distribution in VMAT. Treatment plans were obtained retrospectively from eight patients with prostate cancer. The body in the computed tomography images for the original VMAT plan was shifted to create VMAT plans with virtual pitch angle errors of ±1.5° and ±3°. Dose distributions for the tilted plans were recalculated using the same beam arrangement as that used for the original VMAT plan. The mean value of the maximum dose differences between the original VMAT plan and the tilted plans was 2.98 ± 0.96%. The homogeneity index for the planning target volume (PTV) showed an increasing trend with the pitch angle error, and the values of D95 for the PTV and D2ml, V50, V60, and V70 for the rectum showed decreasing trends. The pitch angle error caused by body leaning had little effect on the dose distribution; in contrast, the pitch angle correction reduced the effects of organ displacement and improved these indexes. Thus, the pitch angle setup error in VMAT for prostate cancer should be corrected. PMID:26873139

  3. Probability distribution function values in mobile phones

    Directory of Open Access Journals (Sweden)

    Luis Vicente Chamorro Marcillo

    2013-06-01

    Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Managing those tables poses physical problems (cumbersome transport and consultation) and operational ones (incomplete listings and limited accuracy). The study "Probability distribution function values in mobile phones" determined, through a needs survey applied to students taking statistics courses at Universidad de Nariño, that the best-known and most-used values correspond to the Chi-Square, Binomial, Student's t, and Standard Normal distributions. It also showed users' interest in having these values available in an alternative medium that corrects, at least in part, the problems presented by "the famous tables". As a contribution to the solution, we built software that delivers the values of the most commonly used probability distribution functions immediately and dynamically on mobile phones.
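The "famous tables" the article replaces can also be replaced by direct computation. A minimal sketch with SciPy of the four distributions the survey found most used (chi-square, binomial, Student's t, standard normal):

```python
from scipy import stats

print(stats.chi2.ppf(0.95, df=10))      # chi-square critical value for 10 df
print(stats.binom.cdf(3, n=10, p=0.5))  # P(X <= 3) for X ~ Binomial(10, 0.5)
print(stats.t.ppf(0.975, df=12))        # two-sided 5% Student's t critical value
print(stats.norm.cdf(1.96))             # standard normal CDF at 1.96
```

Each call evaluates the exact function a printed table only samples, with no interpolation and full machine precision.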

  4. In memory of Alois Apfelbeck: An Interconnection between Cayley-Eisenstein-Pólya and Landau Probability Distributions

    Directory of Open Access Journals (Sweden)

    Vladimír Vojta

    2013-01-01

    The interconnection between the Cayley-Eisenstein-Pólya distribution and the Landau distribution is studied, and possibly new transform pairs for the Laplace and Mellin transforms, as well as integral expressions for the Lambert W function, have been found.

  5. A Finding Method of Business Risk Factors Using Characteristics of Probability Distributions of Effect Ratios on Qualitative and Quantitative Hybrid Simulation

    Science.gov (United States)

    Samejima, Masaki; Negoro, Keisuke; Mitsukuni, Koshichiro; Akiyoshi, Masanori

    We propose a method for finding business risk factors in time series on a qualitative and quantitative hybrid simulation. Because the effect ratios of qualitative arcs in the hybrid simulation vary the output values of the simulation, we define effect ratios that cause risk as business risk factors. Searching for business risk factors over the entire range of effect ratios is time-consuming. Since the probability distributions of effect ratios in the present time step are similar to those in the previous time step, the distributions in the present time step can be estimated, and our method searches for business risk factors only within the estimated ranges. Experimental results show a precision rate and a recall rate of 86%, with search time reduced by at least 20%.

  6. Distribution of Prompt Neutron Emission Probability for Fission Fragments in Spontaneous Fission of 252Cf and 244,248Cm

    Science.gov (United States)

    Vorobyev, A. S.; Dushin, V. N.; Hambsch, F.-J.; Jakovlev, V. A.; Kalinin, V. A.; Laptev, A. B.; Petrov, B. F.; Shcherbakov, O. A.

    2005-05-01

    Neutrons emitted in fission events were measured separately for each complementary fragment in correlation with fission fragment energies. Two high-efficiency Gd-loaded liquid scintillator tanks were used for neutron registration. Fission fragment energies were measured using a twin Frisch gridded ionization chamber with a pinhole collimator. The neutron multiplicity distributions were obtained for each value of the fission fragment mass and energy and corrected for neutron registration efficiency, background, and pile-up. The dependency of these distributions on fragment mass and energy for different energy and mass bins as well as mass and energy distribution of fission fragments are presented and discussed.

  7. Study of local environment and cation distribution in Al(III) oxides by 27Al-NMR with sample rotation at a ''magic'' angle

    International Nuclear Information System (INIS)

    The possible use of the 27Al-NMR method with sample rotation at a "magic" angle to study the local environment and cation distribution of Al(III) ions in the oxide lattice is exemplified by γ-, η-, χ-, α-Al2O3 and commercial A-1 Al(III) oxide. (author)

  8. Approximation of the probability distribution of the customer waiting time under an (r,s,q) inventory policy in discrete time

    OpenAIRE

    Tempelmeier, Horst; Fischer, Lars

    2009-01-01

    We study a single-item periodic review (r,s,q) inventory policy. Customer demands arrive on a discrete (e.g. daily) time axis. The replenishment lead times are discrete random variables. This is the time model underlying the majority of the Advanced Planning Software systems used for supply chain management in industrial practice. We present an approximation of the probability distribution of the customer waiting time, which is a customer-oriented performance criterion...

  9. The difference between the joint probability distributions of apparent wave heights and periods and individual wave heights and periods

    Institute of Scientific and Technical Information of China (English)

    ZHENG Guizhen; JIANG Xiulan; HAN Shuzong

    2004-01-01

    The joint distribution of wave heights and periods of individual waves is usually approximated by the joint distribution of apparent wave heights and periods. However, there is a difference between them. This difference is addressed, and the theoretical joint distributions of apparent wave heights and periods due to Longuet-Higgins and Sun are modified to give more reasonable representations of the joint distribution of wave heights and periods of individual waves. The modification overcomes an inherent drawback of these joint PDFs, namely that the mean wave period is infinite. A comparison is made between the modified formulae and the field data of Goda, which shows that the new formulae agree with the measurements better than their original counterparts.

  10. Binomial distribution of Poisson statistics and tracks overlapping probability to estimate total tracks count with low uncertainty

    International Nuclear Information System (INIS)

    In solid state nuclear track detectors of the chemically etched type, nuclear tracks whose centers lie closer together than two track radii emerge as overlapping tracks. Track overlapping in this type of detector causes track count losses, which become rather severe at high track densities. Therefore, track counting under these conditions should include a correction factor for the count losses of the different track-overlapping orders, since a number of overlapping tracks may be counted as one track. Another aspect of the problem arises in cases where imaging the whole area of the detector and counting all tracks is not possible. In these conditions a statistical generalization method is desired that is applicable to counting a segmented area of the detector, with results that can be generalized to the whole surface of the detector. There is also a challenge in counting densely overlapped tracks, because insufficient geometrical or contextual information is available. In this paper we present a statistical counting method which gives the user a relation between the track-overlapping probabilities on a segmented area of the detector surface and the total number of tracks. With the proposed method, one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions by approximating the average track area, the whole detector surface area and some orders of track-overlapping probabilities. It is shown that this method is applicable to high and ultra-high density track images and that the count loss error can be mitigated using a statistical generalization approach. - Highlights: • A correction factor for count losses of different track-overlapping orders. • Applicable when imaging the whole area of the detector is not possible. • A statistical generalization method for segmented areas. • A relation between the track-overlapping probabilities and the total number of tracks
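The core idea can be sketched with a simulation. If track centres follow a 2-D Poisson process with density lam, a track overlaps a neighbour when the centre-to-centre distance is below 2r, so the probability that a track is isolated is exp(-lam·π·(2r)²); inverting this relation recovers the true density from the observed fraction of isolated tracks. This is a generic Poisson-overlap sketch with arbitrary numbers, not the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(2)
side, r = 100.0, 0.5                  # detector segment size and track radius (assumed)
lam_true = 0.05                       # true tracks per unit area (assumed)
n = rng.poisson(lam_true * side * side)
pts = rng.uniform(0, side, size=(n, 2))

# fraction of tracks with no neighbour closer than 2r (naive O(n^2) check)
d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
np.fill_diagonal(d2, np.inf)
isolated = np.mean(d2.min(axis=1) > (2 * r) ** 2)

# invert P(isolated) = exp(-lam * pi * (2r)^2) to recover the density
lam_est = -np.log(isolated) / (np.pi * (2 * r) ** 2)
print(f"true density {lam_true:.3f}, estimated {lam_est:.3f}")
```

Edge effects (tracks near the segment boundary have fewer potential neighbours) bias the estimate slightly low; a guard band or periodic boundary would remove this in a more careful treatment.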

  11. Empirical probability distribution of journal impact factor and over-the-samples stability in its estimated parameters

    OpenAIRE

    Mishra, SK

    2010-01-01

    The data on JIFs provided by Thomson Scientific can only be considered as a sample, since they do not cover the entire universe of documents that cite an intellectual output (paper, article, etc.) or are cited by others. Then questions arise: does the empirical distribution (the best fit to the JIF data for any particular year) really represent the true or universal distribution, are its estimated parameters stable over the samples, and do they have some scientific interpretation? It may be not...

  12. Modeling the uncertainties in the parameter values of a sparse data set using the beta probability distribution

    International Nuclear Information System (INIS)

    The geological formations in the unsaturated zone at Yucca Mountain, Nevada, are being investigated as the proposed site of a repository for the disposal of high-level radioactive waste. The numerical and conceptual tools that will be used to assess the degree to which the site complies with specified regulatory criteria are currently under development. This paper reports the status of a probability model that has been implemented to address uncertainties in quantitative predictions of parameter values that are used as input to numerical simulation models of flow. 12 refs., 2 figs., 2 tabs
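The beta-distribution approach named above can be illustrated as follows: when only a plausible range and a best estimate of a flow parameter are available, a beta distribution scaled to that range can encode the uncertainty. The method-of-moments matching and all numbers below are illustrative assumptions, not the report's recipe.

```python
from scipy import stats

lo, hi = 0.05, 0.35        # hypothetical bounds on, e.g., a porosity parameter
mean, sd = 0.15, 0.05      # sparse-data estimate of mean and spread (assumed)

m = (mean - lo) / (hi - lo)            # mean rescaled to [0, 1]
v = (sd / (hi - lo)) ** 2              # variance rescaled
common = m * (1 - m) / v - 1           # method-of-moments factor
a, b = m * common, (1 - m) * common    # beta shape parameters

dist = stats.beta(a, b, loc=lo, scale=hi - lo)
print(f"alpha={a:.2f}, beta={b:.2f}")
print("5th-95th percentile band:", dist.ppf([0.05, 0.95]))
```

The fitted object can then be sampled directly to propagate the parameter uncertainty through a flow simulation.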

  13. Distribution of poles in a series expansion of the asymmetric directed-bond percolation probability on the square lattice

    Science.gov (United States)

    Inui, Norio

    1998-12-01

    We investigate numerically the percolation probability of the asymmetric directed-bond percolation on the square lattice with two parameters p and q, based on Guttmann and Enting's procedure (1996 Phys. Rev. Lett. 76 344). A series expansion of the percolation probability is derived using the finite transfer-matrix method. Its denominator is calculated directly from the determinant of the transfer matrix, which leads to a proof that the poles all lie on the unit circle in the complex q plane. The solvability of the directed-bond percolation is also discussed.

  14. Diagnostics of Rovibrational Distribution of H2 in Low Temperature Plasmas by Fulcher-α band Spectroscopy - on the Reaction Rates and Transition Probabilities

    Institute of Scientific and Technical Information of China (English)

    Xiao Bingjia; Shinichiro Kado; Shin Kajita; Daisuge Yamasaki; Satoru Tanaka

    2005-01-01

    A novel fitting procedure is proposed for a better determination of the H2 rovibrational distribution from Fulcher-α band spectroscopy. We have recalculated the transition probabilities, and the results show that they deviate from the Franck-Condon approximation, especially for the non-diagonal transitions. We have also calculated complete sets of vibrationally resolved cross-sections for the electron-impact d3Πu-X3Σg transition based on the semi-classical Gryzinski theory. An example of an experimental study confirms that the current approach provides a tool for better diagnostics of the H2 rovibrational distribution in the electronic ground state.

  15. Nonlinear systems with fast and slow motions. Changes in the probability distribution for fast motions under the influence of slower ones

    International Nuclear Information System (INIS)

    The influence of slow processes on the probability distribution of fast random processes is investigated. By reviewing four examples we show that such influence is apparently of a universal character and that, in some cases, this universality is of multifractal form. As our examples we consider theoretically stochastic resonance, turbulent jets with acoustic forcing, and two problems studied experimentally by Shnoll on the influence of the Earth’s slow rotation on the probability distribution for the velocities of model Brownian particles and on alpha decay. In the case of stochastic resonance, the slow process is a low frequency, harmonic, external force. In the case of turbulent jets, the slow process is acoustic forcing. In the models based on Shnoll’s experiments, the slow processes are inertial forces arising from the rotation of the Earth, both about its own axis and about the Sun. It is shown that all of these slow processes cause changes in the probability distributions for the velocities of fast processes interacting with them, and that these changes are similar in form

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications...
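The gradient property stated in the abstract can be checked numerically for the simplest ARUM, multinomial logit: its CPGF is the log-sum-exp of the systematic utilities, and its gradient is exactly the vector of logit choice probabilities. The utility values below are arbitrary assumptions.

```python
import numpy as np

def cpgf(v):
    """Log-sum-exp: the choice-probability generating function of MNL."""
    m = v.max()
    return m + np.log(np.sum(np.exp(v - m)))

def choice_probs(v):
    """Softmax: the analytic gradient of the CPGF above."""
    e = np.exp(v - v.max())
    return e / e.sum()

v = np.array([1.0, 0.5, -0.2])          # systematic utilities (assumed values)
eps = 1e-6                              # step for central finite differences
grad = np.array([(cpgf(v + eps * np.eye(3)[j]) - cpgf(v - eps * np.eye(3)[j]))
                 / (2 * eps) for j in range(3)])

print("analytic probabilities:", choice_probs(v))
print("numeric  gradient     :", grad)
```

The two printed vectors agree to finite-difference accuracy, illustrating the CPGF characterization for this special case.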

  17. Measurement of the weak mixing angle and the spin of the gluon from angular distributions in the reaction pp→ Z/γ*+X→μ+μ-+X with ATLAS

    International Nuclear Information System (INIS)

    The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp→Z/γ*+X→μ+μ-+X. In total 4.7 fb-1 of proton-proton collisions at √(s)=7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system and are compared to several simulations as well as to recent results obtained in proton-antiproton collisions. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.

  18. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  19. On the Efficient Simulation of the Distribution of the Sum of Gamma-Gamma Variates with Application to the Outage Probability Evaluation Over Fading Channels

    KAUST Repository

    Chaouki Ben Issaid

    2016-06-01

    The Gamma-Gamma distribution has recently emerged in a number of applications, ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a well-desired property characterizing the robustness of importance sampling schemes, for both independent identically and non-identically distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
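For background, a Gamma-Gamma variate can be generated as the product of two independent Gamma variates, and the outage probability of an L-branch MRC receiver is the left-tail probability of their sum. The sketch below is the naive Monte Carlo baseline that the paper's mean-shift importance sampling improves upon; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
L, alpha, beta = 4, 2.0, 1.5            # branches and turbulence parameters (assumed)
n = 200_000
gamma_th = 1.0                          # outage threshold on the combined SNR (assumed)

# Unit-mean Gamma-Gamma: product of Gamma(alpha, 1/alpha) and Gamma(beta, 1/beta)
x = (rng.gamma(alpha, 1.0 / alpha, size=(n, L))
     * rng.gamma(beta, 1.0 / beta, size=(n, L)))
s = x.sum(axis=1)                       # combined SNR across the L branches

p_out = np.mean(s < gamma_th)
print(f"naive MC outage probability estimate: {p_out:.3e}")
```

For the very small outage probabilities met in practice, the hit count of this estimator collapses and its relative error blows up, which is exactly the failure mode that motivates the importance sampling scheme of the paper.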

  20. A bivariate gamma probability distribution with application to gust modeling. [for the ascent flight of the space shuttle

    Science.gov (United States)

    Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.

    1982-01-01

    A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluation of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.

  1. Predicting probability of occurrence and factors affecting distribution and abundance of three Ozark endemic crayfish species at multiple spatial scales

    Science.gov (United States)

    Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.

    2014-01-01

    Crayfishes and other freshwater aquatic fauna are particularly at risk globally due to anthropogenic demand, manipulation and exploitation of freshwater resources and yet are often understudied. The Ozark faunal region of Missouri and Arkansas harbours a high level of aquatic biological diversity, especially in regard to endemic crayfishes. Three such endemics, Orconectes eupunctus, Orconectes marchandi and Cambarus hubbsi, are threatened by limited natural distribution and the invasions of Orconectes neglectus.

  2. The Derivation of the Probability Density Function of the t Distribution

    Institute of Scientific and Technical Information of China (English)

    彭定忠; 张映辉; 刘朝才

    2012-01-01

    The t distribution is one of the three important distributions most widely applied in mathematical statistics. Most textbooks either omit the derivation of its probability density function or derive it only by the direct method. In this paper, the transform method, which involves simpler operations and less difficult computation, is presented for the derivation.
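The transform method the abstract refers to can be sketched compactly (a standard derivation, not taken from the paper itself): with Z ~ N(0,1) and W ~ χ²_n independent, set T = Z/√(W/n), change variables (z, w) → (t, w) with Jacobian √(w/n), and integrate out w:

```latex
f_T(t) = \int_0^\infty f_Z\!\left(t\sqrt{w/n}\right) f_W(w)\,\sqrt{\frac{w}{n}}\,dw
       = \int_0^\infty \frac{e^{-t^2 w/(2n)}}{\sqrt{2\pi}}
         \cdot \frac{w^{n/2-1}e^{-w/2}}{2^{n/2}\,\Gamma(n/2)}
         \,\sqrt{\frac{w}{n}}\,dw
       = \frac{\Gamma\!\left(\frac{n+1}{2}\right)}
              {\sqrt{n\pi}\,\Gamma\!\left(\frac{n}{2}\right)}
         \left(1+\frac{t^2}{n}\right)^{-\frac{n+1}{2}}
```

The last step uses the Gamma integral ∫₀^∞ w^{(n+1)/2−1} e^{−w(1+t²/n)/2} dw = Γ((n+1)/2)·(2/(1+t²/n))^{(n+1)/2}, which collapses the expression to the familiar Student's t density.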

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Comparative analysis of methods for modelling the short-term probability distribution of extreme wind turbine loads

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    2016-01-01

    We have tested the performance of statistical extrapolation methods in predicting the extreme response of a multi-megawatt wind turbine generator. We have applied the peaks-over-threshold, block maxima and average conditional exceedance rates (ACER) methods for peaks extraction, combined with four ... levels, based on the assumption that the response tail is asymptotically Gumbel distributed. Example analyses were carried out, aimed at comparing the different methods, analysing the statistical uncertainties and identifying the factors which are critical to the accuracy and reliability of the
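One of the extrapolation recipes named above can be sketched as follows: extract block maxima from a response time series, fit a Gumbel distribution (the assumed asymptotic tail), and read off a long-return-period level. The synthetic "response" signal, block size and return period are assumptions, not the thesis setup.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
response = rng.gumbel(loc=10.0, scale=2.0, size=600 * 100)  # fake load record (assumed)

blocks = response.reshape(100, 600)          # 100 blocks of 600 samples each
maxima = blocks.max(axis=1)                  # one extreme per block

loc, scale = stats.gumbel_r.fit(maxima)      # Gumbel fit to the block maxima
p_target = 1.0 / 50.0                        # 50-block return period
extreme = stats.gumbel_r.ppf(1.0 - p_target, loc, scale)
print(f"Gumbel fit: loc={loc:.2f}, scale={scale:.2f}, 50-block level={extreme:.2f}")
```

Peaks-over-threshold and ACER differ in how the extremes are extracted, but the extrapolation step (fit a tail model, invert it at the target exceedance probability) has the same shape.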

  5. Influence of the fitted probability distribution type on the annual mean power generated by wind turbines: A case study at the Canary Islands

    International Nuclear Information System (INIS)

    This paper aims to quantify the influence that the probability distribution selected to fit wind speed data has on the estimation of the annual mean energy production of wind turbines. To perform this task, a comparative analysis between the well-known two-parameter Weibull wind speed distribution and alternative finite mixture distribution models (less simple, but providing better fits in many locations) is applied, in order to contrast simplicity versus accuracy. Data fitted from a set of weather stations located at the Canary Islands and a representative sample of commercial wind turbines are taken into account to carry out this analysis. The calculations provide a wide variety of numerical results but, as a general conclusion, the analysis shows that any improvement in wind data fits given by the use of a finite mixture distribution, instead of the standard Weibull distribution, is partially or even totally lost when the annual mean energy production is worked out, practically regardless of the weather station, the wind speed distribution model, the turbine size or the turbine concept.
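The Weibull side of the comparison can be sketched in a few lines: fit a Weibull distribution to wind-speed data, then integrate a turbine power curve against the fitted density to get the annual mean energy production (AEP). The synthetic wind record and the idealised power curve below are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
wind = rng.weibull(2.0, size=20_000) * 8.0          # hypothetical wind speeds, m/s

shape, _, scale = stats.weibull_min.fit(wind, floc=0)

def power_kw(v, rated=2000.0, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Idealised power curve: cubic ramp between cut-in and rated speed."""
    v = np.asarray(v, dtype=float)
    p = np.where((v >= v_in) & (v < v_rated),
                 rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3), 0.0)
    return np.where((v >= v_rated) & (v <= v_out), rated, p)

# mean power = integral of power curve times fitted Weibull density
v_grid = np.linspace(0.0, 30.0, 3001)
dv = v_grid[1] - v_grid[0]
pdf = stats.weibull_min.pdf(v_grid, shape, 0, scale)
mean_power = float(np.sum(power_kw(v_grid) * pdf) * dv)    # kW
aep_mwh = mean_power * 8760 / 1000.0
print(f"Weibull k={shape:.2f}, c={scale:.2f} m/s, AEP ~ {aep_mwh:.0f} MWh")
```

Swapping the fitted Weibull for a mixture density changes only the `pdf` line, which is what makes the paper's like-for-like AEP comparison straightforward.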

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to...

  7. Determination of the bond-angle distribution in vitreous B2O3 by 11B double rotation (DOR) NMR spectroscopy

    International Nuclear Information System (INIS)

    The B-O-B bond angle distributions for both ring and non-ring boron sites in vitreous B2O3 have been determined by 11B double rotation (DOR) NMR and multiple-quantum (MQ) DOR NMR. The [B3O6] boroxol rings are observed to have a mean internal B-O-B angle of 120.0±0.7 deg. with a small standard deviation, σR=3.2±0.4 deg., indicating that the rings are near-perfect planar, hexagonal structures. The rings are linked predominantly by non-ring [BO3] units, which share oxygens with the boroxol ring, with a mean Bring-O-Bnon-ring angle of 135.1±0.6 deg. and σNR=6.7±0.4 deg. In addition, the fraction of boron atoms, f, which reside in the boroxol rings has been measured for this sample as f=0.73±0.01. - Graphical abstract: Connectivities and B-O-B bond angle distributions of ring and non-ring boron atoms in v-B2O3, determined by 11B double rotation (DOR) NMR, multiple-quantum (MQ) DOR NMR and spin-diffusion DOR, show that near-perfect planar, hexagonal [B3O6] boroxol rings are present.

  8. An investigation of the dose distribution effect related with collimator angle in volumetric arc therapy of prostate cancer

    Directory of Open Access Journals (Sweden)

    Bora Tas

    2016-01-01

    To investigate the dose-volume variations of the planning target volume (PTV) and organs at risk (OARs) in eleven prostate cancer patients planned with single and double arc volumetric modulated arc therapy (VMAT) when varying the collimator angle. Single and double arc VMAT treatment plans were created using Monaco5.0® with the collimator angle set to 0°. All plans were normalized to deliver 7600 cGy to 95% of the clinical target volume (CTV). The single arc VMAT plans were reoptimized with different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, and 90°), and the double arc VMAT plans with collimator angle pairs (0-0°, 15-345°, 30-330°, 45-315°, 60-300°, 75-285°, 90-270°), using the same optimization parameters. For the comparison, the heterogeneity index (HI), the dose-volume histogram and the minimum dose to 95% of the PTV volume (D95 PTV) were calculated and analyzed. The best plans were verified using the two-dimensional ion chamber array IBA Matrixx® and the three-dimensional IBA Compass® program. The comparison between calculation and measurement was made by γ-index (3%/3 mm) analysis. A higher D95 (PTV) was found for single arc VMAT with a 15° collimator angle and, for double arc VMAT, with the 60-300° and 75-285° collimator angles. However, lower rectum doses were obtained for the 75-285° collimator angles. There was no significant dose difference for the other OARs, the bladder and femoral heads. When we compared the D95 (PTV) of single and double arc VMAT, we found 2.44% higher coverage and a lower HI with double arc VMAT. All plans passed the γ-index (3%/3 mm) analysis with more than 97% of the points, and with double arc VMAT we had an average γ-index of 0.36 for the CTV and 0.32 for the PTV. These results were statistically significant by the Wilcoxon signed rank test. The results show that the dose coverage of the target and the OAR doses also depend significantly on the collimator angles, due to the geometry of the target and OARs. Based on the results we have decided to plan prostate

  9. An investigation of the dose distribution effect related with collimator angle in volumetric arc therapy of prostate cancer.

    Science.gov (United States)

    Tas, Bora; Bilge, Hatice; Ozturk, Sibel Tokdemir

    2016-01-01

To investigate the dose-volume variations of planning target volume (PTV) and organs at risk (OARs) in eleven prostate cancer patients planned with single and double arc volumetric modulated arc therapy (VMAT) when varying collimator angle. Single and double arc VMAT treatment plans were created using Monaco 5.0® with the collimator angle set to 0°. All plans were normalized to deliver 7600 cGy to 95% of the clinical target volume (CTV). The single arc VMAT plans were reoptimized with different collimator angles (0°, 15°, 30°, 45°, 60°, 75°, and 90°), and the double arc VMAT plans with paired angles (0°-0°, 15°-345°, 30°-330°, 45°-315°, 60°-300°, 75°-285°, 90°-270°), using the same optimization parameters. For the comparison, the heterogeneity index (HI), dose-volume histograms and the minimum dose to 95% of the PTV volume (D95 PTV) were calculated and analyzed. The best plans were verified using the two-dimensional ion chamber array IBA Matrixx® and the three-dimensional IBA Compass® program. The comparison between calculation and measurement was made by the γ-index (3%/3 mm) analysis. A higher D95 (PTV) was found for single arc VMAT with a 15° collimator angle and, for double arc VMAT, with 60°-300° and 75°-285° collimator angles. However, lower rectum doses were obtained for the 75°-285° collimator angles. There was no significant dose difference for the other OARs, the bladder and femoral heads. When we compared single and double arc VMAT D95 (PTV), we determined 2.44% higher coverage and a lower HI with double arc VMAT. All plans passed the γ-index (3%/3 mm) analysis with more than 97% of the points, and with double arc VMAT we had an average γ-index of 0.36 for CTV and 0.32 for PTV. These results were statistically significant by the Wilcoxon signed rank test. The results show that dose coverage of the target and the OAR doses depend significantly on the collimator angles due to the geometry of the target and OARs. Based on the results we have decided to plan prostate cancer patients in our
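The plan-quality metrics this record compares (D95 and HI) can be computed directly from a list of voxel doses. A minimal sketch, assuming the common convention Dx = dose received by at least x% of the volume and an ICRU-83-style HI = (D2 − D98)/D50; the conventions and numbers here are illustrative, not the authors' data.

```python
def dose_percentile(doses, volume_fraction):
    """Dx metric: the dose received by at least `volume_fraction` of the
    voxels (e.g. volume_fraction=0.95 gives D95)."""
    s = sorted(doses, reverse=True)  # hottest voxels first
    idx = max(0, min(len(s) - 1, round(volume_fraction * len(s)) - 1))
    return s[idx]

def heterogeneity_index(doses):
    """HI = (D2 - D98) / D50, one common ICRU-83-style convention."""
    return (dose_percentile(doses, 0.02) - dose_percentile(doses, 0.98)) \
        / dose_percentile(doses, 0.50)

# Illustrative voxel doses in cGy (100 voxels), not patient data:
doses = list(range(7400, 7800, 4))
d95 = dose_percentile(doses, 0.95)   # cGy
hi = heterogeneity_index(doses)      # dimensionless; smaller = more homogeneous
```

A lower HI, as reported for the double arc plans, means the hot (D2) and cold (D98) tails of the dose-volume histogram sit closer to the median dose.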

  10. Minimum Probability Flow Learning

    CERN Document Server

    Sohl-Dickstein, Jascha; DeWeese, Michael R

    2009-01-01

    Learning in probabilistic models is often severely hampered by the general intractability of the normalization factor and its derivatives. Here we propose a new learning technique that obviates the need to compute an intractable normalization factor or sample from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then setting as the objective the minimization of the initial flow of probability away from the data distribution. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate the application of minimum probability flow learning to parameter estimation in Ising models, deep belief networks, multivariate Gaussian distributions and a continuous model with a highly general energy function defined as a power series. In the Ising model case, minimum probability flow learning outperforms cur...
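The objective described in this abstract — the initial flow of probability away from the data distribution — can be made concrete on a toy binary model. A minimal sketch, assuming single-bit-flip connectivity, a hand-picked two-state dataset, and finite-difference gradients for clarity; couplings and learning rate are illustrative, not from the paper.

```python
import math

# Toy model: x in {-1,+1}^3 with energy E(x; J) = -sum_{(i,j)} J[i,j] * x_i * x_j.
def energy(x, J):
    return -sum(Jij * x[i] * x[j] for (i, j), Jij in J.items())

def mpf_objective(data, J):
    """K = (1/|D|) * sum_{x in D} sum_{x' one bit-flip from x, x' not in D}
    exp((E(x) - E(x')) / 2): the initial probability flow out of the data."""
    dset = set(data)
    K = 0.0
    for x in data:
        for k in range(len(x)):
            xp = tuple(-v if i == k else v for i, v in enumerate(x))
            if xp not in dset:
                K += math.exp((energy(x, J) - energy(xp, J)) / 2.0)
    return K / len(data)

def fd_grad(data, J, eps=1e-5):
    """Finite-difference gradient of K w.r.t. each coupling (for clarity only;
    the analytic gradient is simple and would be used in practice)."""
    g = {}
    for key in J:
        Jp, Jm = dict(J), dict(J)
        Jp[key] += eps
        Jm[key] -= eps
        g[key] = (mpf_objective(data, Jp) - mpf_objective(data, Jm)) / (2 * eps)
    return g

# Data concentrated on the two fully aligned states: learning should make the
# couplings more positive, lowering the outward probability flow.
data = [(1, 1, 1)] * 10 + [(-1, -1, -1)] * 10
J = {(0, 1): 0.0, (1, 2): 0.0, (0, 2): 0.0}
k_init = mpf_objective(data, J)
for _ in range(200):
    g = fd_grad(data, J)
    for key in J:
        J[key] -= 0.05 * g[key]
k_final = mpf_objective(data, J)
```

No partition function is ever computed: the objective only compares energies of data states with their non-data neighbours, which is the point of the technique.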

  11. Brain-derived neurotrophic factor enhances GABA release probability and nonuniform distribution of N- and P/Q-type channels on release sites of hippocampal inhibitory synapses.

    Science.gov (United States)

    Baldelli, Pietro; Hernandez-Guijo, Jesus-Miguel; Carabelli, Valentina; Carbone, Emilio

    2005-03-30

    Long-lasting exposures to brain-derived neurotrophic factor (BDNF) accelerate the functional maturation of GABAergic transmission in embryonic hippocampal neurons, but the molecular bases of this phenomenon are still debated. Evidence in favor of a postsynaptic site of action has been accumulated, but most of the data support a presynaptic site effect. A crucial issue is whether the enhancement of evoked IPSCs (eIPSCs) induced by BDNF is attributable to an increase in any of the elementary parameters controlling neurosecretion, namely the probability of release, the number of release sites, the readily releasable pool (RRP), and the quantal size. Here, using peak-scaled variance analysis of miniature IPSCs, multiple probability fluctuation analysis, and cumulative amplitude analysis of action potential-evoked postsynaptic currents, we show that BDNF increases release probability and vesicle replenishment with little or no effect on the quantal size, the number of release sites, the RRP, and the Ca2+ dependence of eIPSCs. BDNF treatment changes markedly the distribution of Ca2+ channels controlling neurotransmitter release. It enhances markedly the contribution of N- and P/Q-type channels, which summed to >100% ("supra-additivity"), and deletes the contribution of R-type channels. BDNF accelerates the switch of presynaptic Ca2+ channel distribution from "segregated" to "nonuniform" distribution. This maturation effect was accompanied by an uncovered increased control of N-type channels on paired-pulse depression, otherwise dominated by P/Q-type channels in untreated neurons. Nevertheless, BDNF preserved the fast recovery from depression associated with N-type channels. These novel presynaptic BDNF actions derive mostly from an enhanced overlapping and better colocalization of N- and P/Q-type channels to vesicle release sites. PMID:15800191

  12. SSNTD study of the probable influence of alpha activity on the mass distribution of 252Cf fission fragments

    International Nuclear Information System (INIS)

The SSNTD has come a long way in its application to the study of nuclear phenomena. Spontaneous fission of transuranic elements is one such phenomenon, wherein the use of SSNTDs offers easy registration of the signature of the fission fragments. The object of the present study is to explore whether any one of the track parameters, such as the diameter, can be used to estimate the atomic mass ratios of the spontaneous fission fragments. The spontaneous fission data from 252Cf, recorded almost at the end of one and of four half-life periods for alpha decay, are analysed by plotting the number of tracks versus the track diameter. From these plots it is seen that initially, when significant alpha activity of 252Cf persists, the fission fragments appear to cluster into two predominant groups, as indicated by two peaks. The ratio of the diameters at these peak positions appears to be related to the ratio of the average mass numbers of the light and heavy groups of fission fragments. However, the absence of two peaks in similar plots at the end of about four half-life periods for alpha decay suggests that the presence of alphas presumably influences the mass distribution of the fission fragments

  13. Towards a Global Water Scarcity Risk Assessment Framework: Incorporation of Probability Distributions and Hydro-Climatic Variability

    Science.gov (United States)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-01-01

Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to greater than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.

  14. Towards a global water scarcity risk assessment framework: incorporation of probability distributions and hydro-climatic variability

    Science.gov (United States)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-02-01

Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR’s definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.
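The core of such a risk calculation — the probability that water availability falls below a demand threshold under an assumed Gamma distribution, times the exposed population — can be sketched with the standard library. All numbers below are illustrative, not values from the study.

```python
import math
import random

random.seed(42)

# Illustrative basin: availability ~ Gamma(shape=2, scale=50); demand threshold
# and population are made-up numbers for the sketch.
shape, scale = 2.0, 50.0
demand = 40.0
population = 1_000_000

n = 100_000
samples = [random.gammavariate(shape, scale) for _ in range(n)]
p_scarcity = sum(s < demand for s in samples) / n
expected_exposed = p_scarcity * population   # expected annual exposed population

# For integer shape k the Gamma CDF has a closed (Erlang) form, usable as a check:
# P(X < x) = 1 - exp(-x/scale) * sum_{i<k} (x/scale)**i / i!
x = demand / scale
p_closed = 1.0 - math.exp(-x) * (1.0 + x)    # k = 2 case
```

Fitting the Gamma parameters per grid cell and repeating this per scenario is what turns a fixed-threshold scarcity map into a probabilistic risk estimate.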

  15. Angular and lateral distributions from small angle multiple scattering including elastic and inelastic energy loss effects based on the Valdes and Arista model

    Energy Technology Data Exchange (ETDEWEB)

    Ikegami, Seiji, E-mail: double1892@gmail.com

    2013-12-01

The aims of this work are to compare and to include two energy loss effects, caused by elastic and inelastic collisions, in the angular and lateral distributions of small-angle multiple scattering, based on the Valdes and Arista (VA) theory. VA were the first to develop a small-angle multiple scattering theory including energy loss effects, based on the Sigmund and Winterbon model. However, the energy loss effects on lateral distributions have not yet been estimated. In the VA model, target thickness and energy loss are treated independently. In this study, those effects are successfully introduced on the basis of the VA model. We considered the lateral spread and the angular distribution separately and included the nuclear and electronic energy loss effects as a function of target thickness. Our results indicate that discrepancies occur between the two distributions including nuclear and electronic stopping for several target thicknesses. Moreover, we constructed a multiple scattering model that includes both elastic and inelastic energy losses.

  16. Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution

    Institute of Scientific and Technical Information of China (English)

    SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan

    2011-01-01

For the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. Firstly, the probability distribution of the Von Mises stress of a thin-walled structure under random loadings was studied; the analysis suggested that the probability density function of the Von Mises stress process accords approximately with a two-parameter Weibull distribution. Formulas for calculating the Weibull parameters were given. Based on the Miner linear damage theory, a method for predicting the random sonic fatigue life based on the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated using the constructed model. Considering the influence of the wide frequency band, the calculated results were modified. Comparative analysis shows that the sonic fatigue estimates for the combustor liner structure obtained using the Weibull distribution of the Von Mises stress are somewhat more conservative than those using the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
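The recipe described here — a Weibull model for the stress peaks combined with Miner's linear damage rule and a power-law S-N curve — reduces to a closed form through the Weibull moment E[s^m] = λ^m Γ(1 + m/k). A minimal sketch with purely illustrative parameter values (λ, k, A, m and ν0 are assumptions, not values from the paper):

```python
import math

# Illustrative inputs (NOT from the paper):
lam, k = 40.0, 1.8      # Weibull scale [MPa] and shape of the stress peaks
A, m = 1.0e12, 3.0      # S-N curve N(s) = A * s**(-m)
nu0 = 100.0             # mean rate of stress peaks [peaks/s]

# Miner's rule damage rate D = nu0 * E[s**m] / A, hence life T = A / (nu0 * E[s**m]),
# using the Weibull moment E[s**m] = lam**m * Gamma(1 + m/k).
moment = lam ** m * math.gamma(1.0 + m / k)
life_s = A / (nu0 * moment)

# Numerical cross-check of the Weibull moment by midpoint integration:
ds, s_max = 0.05, 600.0
num, s = 0.0, 0.025
while s < s_max:
    pdf = (k / lam) * (s / lam) ** (k - 1.0) * math.exp(-((s / lam) ** k))
    num += s ** m * pdf * ds
    s += ds
```

The closed form makes the sensitivity explicit: life scales as λ^(−m), so a modest increase in the Weibull stress scale shortens the predicted life sharply.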

  17. Characterization of the energy distribution of neutrons generated by 5 MeV protons on a thick beryllium target at different emission angles

    International Nuclear Information System (INIS)

Neutron energy spectra at different emission angles between 0° and 120°, from the Be(p,xn) reaction generated by a thick beryllium target bombarded with 5 MeV protons, have been measured at the Legnaro Laboratories (LNL) of the Italian National Institute for Nuclear Physics (INFN). A new and quite compact recoil-proton spectrometer, based on a monolithic silicon telescope coupled to a polyethylene converter, was used in place of the traditional Time-of-Flight (TOF) technique. The measured recoil-proton distributions were processed through an iterative unfolding algorithm in order to determine the neutron energy spectra at all the angles considered. The neutron energy spectrum measured at 0° is in good agreement with the only one so far available at this energy, measured years ago with the TOF technique. Moreover, the results obtained at the other emission angles are consistent with detailed past TOF measurements performed at the same angles with 4 MeV protons.

  18. Characterization of the energy distribution of neutrons generated by 5 MeV protons on a thick beryllium target at different emission angles

    Energy Technology Data Exchange (ETDEWEB)

    Agosteo, S. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy); Colautti, P., E-mail: paolo.colautti@lnl.infn.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Esposito, J., E-mail: juan.esposito@tin.it [INFN, Laboratori Nazionali di Legnaro (LNL), Via dell' Universita, 2, I-35020 Legnaro (PD) (Italy); Fazzi, A.; Introini, M.V.; Pola, A. [Politecnico di Milano, Dipartimento di Energia, Piazza Leonardo da Vinci 32, 20133 Milano (Italy)] [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, via Celoria 16, 20133 Milano (Italy)

    2011-12-15

Neutron energy spectra at different emission angles between 0° and 120°, from the Be(p,xn) reaction generated by a thick beryllium target bombarded with 5 MeV protons, have been measured at the Legnaro Laboratories (LNL) of the Italian National Institute for Nuclear Physics (INFN). A new and quite compact recoil-proton spectrometer, based on a monolithic silicon telescope coupled to a polyethylene converter, was used in place of the traditional Time-of-Flight (TOF) technique. The measured recoil-proton distributions were processed through an iterative unfolding algorithm in order to determine the neutron energy spectra at all the angles considered. The neutron energy spectrum measured at 0° is in good agreement with the only one so far available at this energy, measured years ago with the TOF technique. Moreover, the results obtained at the other emission angles are consistent with detailed past TOF measurements performed at the same angles with 4 MeV protons.
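The iterative unfolding step mentioned in these records can be illustrated with a Richardson-Lucy-type multiplicative iteration (an assumption for illustration — the specific algorithm used by the authors is not named here), applied to a toy 3-bin response matrix:

```python
# Toy detector response: R[i][j] = probability that a neutron in true-energy
# bin j is recorded in measured channel i (illustrative numbers).
R = [[0.8, 0.2, 0.0],
     [0.2, 0.6, 0.2],
     [0.0, 0.2, 0.8]]
true_spec = [10.0, 5.0, 2.0]
measured = [sum(R[i][j] * true_spec[j] for j in range(3)) for i in range(3)]

def unfold(R, m, iters=500):
    """Richardson-Lucy-style update: each true-bin estimate is rescaled by the
    response-weighted ratio of measured to refolded counts."""
    n_bins = len(R[0])
    n = [sum(m) / n_bins] * n_bins                    # flat initial guess
    for _ in range(iters):
        folded = [sum(R[i][j] * n[j] for j in range(n_bins))
                  for i in range(len(m))]
        for j in range(n_bins):
            num = sum(R[i][j] * m[i] / folded[i] for i in range(len(m)))
            den = sum(R[i][j] for i in range(len(m)))
            n[j] *= num / den
    return n

estimate = unfold(R, measured)
```

With noiseless data the iteration's fixed point is the true spectrum; with real counting noise the iteration number acts as a regularization knob and is stopped early.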

  19. Fractal dimension and unscreened angles measured for radial viscous fingering.

    Science.gov (United States)

    Praud, Olivier; Swinney, Harry L

    2005-07-01

We have examined fractal patterns formed by the injection of air into oil in a thin (0.127 mm) layer contained between two cylindrical glass plates of 288 mm diameter (a Hele-Shaw cell), for pressure differences in the range 0.25–… The patterns are similar to diffusion-limited aggregation (DLA) clusters. We have also measured the probability distribution of unscreened angles. At late times, the distribution approaches a universal (i.e., forcing- and size-independent) asymptotic form with mean 145° and standard deviation 36°. These results indicate that the distribution function for the unscreened angle is an invariant property of the growth process. PMID:16089960

  20. Intracranial cerebrospinal fluid spaces imaging using a pulse-triggered three-dimensional turbo spin echo MR sequence with variable flip-angle distribution

    International Nuclear Information System (INIS)

To assess the three-dimensional turbo spin echo with variable flip-angle distribution magnetic resonance sequence (SPACE: Sampling Perfection with Application optimised Contrast using different flip-angle Evolution) for the imaging of intracranial cerebrospinal fluid (CSF) spaces. We prospectively investigated 18 healthy volunteers and 25 patients, 20 with communicating hydrocephalus (CH) and five with non-communicating hydrocephalus (NCH), using the SPACE sequence at 1.5 T. Volume rendering views of both intracranial and ventricular CSF were obtained for all patients and volunteers. The subarachnoid CSF distribution was qualitatively evaluated on volume rendering views using a four-point scale. The CSF volumes within the total, ventricular and subarachnoid spaces were calculated, as well as the ratio between ventricular and subarachnoid CSF volumes. Three different patterns of subarachnoid CSF distribution were observed. In healthy volunteers we found narrowed CSF spaces within the occipital area. A diffuse narrowing of the subarachnoid CSF spaces was observed in patients with NCH, whereas patients with CH exhibited narrowed CSF spaces within the high midline convexity. The ratios between ventricular and subarachnoid CSF volumes were significantly different among the volunteers, patients with CH and patients with NCH. The assessment of CSF space volume and distribution may help to characterise hydrocephalus. (orig.)

  1. Variability of the pitch angle distribution of radiation belt ultrarelativistic electrons during and following intense geomagnetic storms: Van Allen Probes observations

    Science.gov (United States)

    Zou, Z.; Ni, B.; Gu, X.; Zhao, Z.; Zhou, C.

    2015-12-01

Fifteen months of pitch angle resolved Van Allen Probes Relativistic Electron-Proton Telescope (REPT) measurements of differential electron flux are analyzed to investigate the characteristics of the pitch angle distribution of radiation belt ultrarelativistic (>2 MeV) electrons during storm conditions and during the long post-storm decay. By modeling the ultrarelativistic electron pitch angle distribution as sin^n(α), where α is the equatorial pitch angle, we examine the spatiotemporal variations of the n value. The results show that in general the n value increases with the level of geomagnetic activity. In principle the ultrarelativistic electrons respond to geomagnetic storms by becoming peaked at 90° pitch angle, with n values of 2-3, a supportive signature of chorus acceleration outside the plasmasphere. High n values also exist inside the plasmasphere, localized adjacent to the plasmapause and energy dependent, which suggests a significant contribution from electromagnetic ion cyclotron (EMIC) wave scattering. During quiet periods, n values generally evolve to become small, i.e., 0-1. The slow and long-term decays of the ultrarelativistic electrons after geomagnetic storms, while prominent, produce energy- and L-shell-dependent decay timescales in association with the solar and geomagnetic activity and wave-particle interaction processes. At lower L shells inside the plasmasphere, the decay timescales for electrons at REPT energies are generally larger, varying from tens of days to hundreds of days, which can be mainly attributed to the combined effect of hiss-induced pitch angle scattering and inward radial diffusion. As L shell increases to L ~ 3.5, a narrow region exists (with a width of ~0.5 L) where the observed ultrarelativistic electrons decay fastest, possibly resulting from efficient EMIC wave scattering. As L shell continues to increase, the decay timescale generally becomes larger again, indicating an overall slower loss process by waves at high L shells.
Our investigation based
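The n-value analysis amounts to fitting flux versus equatorial pitch angle with the commonly used sin^n α anisotropy model; n can be estimated by linear regression on log-transformed data. A minimal sketch on synthetic fluxes (j90, n, the angle grid and the noise level are all illustrative):

```python
import math
import random

random.seed(1)

# Synthetic pitch-angle scan: j(a) = j90 * sin(a)**n_true, with ~5% lognormal noise.
j90, n_true = 1000.0, 2.5
angles = [math.radians(a) for a in range(20, 91, 5)]   # 20..90 degrees
flux = [j90 * math.sin(a) ** n_true * math.exp(random.gauss(0.0, 0.05))
        for a in angles]

# Least-squares slope of log(flux) vs log(sin(angle)) estimates n.
xs = [math.log(math.sin(a)) for a in angles]
ys = [math.log(f) for f in flux]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
n_fit = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
```

A large fitted n corresponds to a flux distribution sharply peaked at 90°, the storm-time signature discussed in the record.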

  2. Characterization of a particle size distribution in a Ni-C granular thin film by grazing incidence small-angle X-ray scattering

    International Nuclear Information System (INIS)

A grazing incidence small-angle X-ray scattering (GISAXS) technique has been applied to characterize the particle size distribution of nickel nano-particles in a nickel-carbon granular (Ni-C granular) film fabricated by a cosputtering method on a silicon substrate. The particles were modelled as spheres in order to calculate the scattering intensity, and a Γ-distribution was employed to determine the size distribution. In addition, grazing incidence X-ray diffraction (GIXRD) was also measured in order to determine the crystallite size of the particles. The crystallite size was analyzed with the Scherrer equation. The average particle size and the crystallite size are 5.7 and 5.2 nm, respectively. These results suggest that most of the nickel particles are single crystals
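The crystallite-size step uses the Scherrer equation, τ = Kλ/(β cos θ). A minimal sketch; the peak position, FWHM and shape factor below are illustrative assumptions, not the measured Ni-C values.

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm, K=0.9):
    """Scherrer equation: crystallite size tau = K * lambda / (beta * cos(theta)),
    with beta the peak FWHM in radians and theta the Bragg angle (half of 2-theta)."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# Illustrative: Cu K-alpha radiation and a Ni (111)-like reflection near 44.5 deg.
tau_nm = scherrer_size(fwhm_deg=1.6, two_theta_deg=44.5, wavelength_nm=0.15406)
```

Since τ is inversely proportional to the peak width, halving the FWHM doubles the inferred crystallite size; broadening from strain or instrument response would need to be subtracted first in a real analysis.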

  3. Fire Radiative Energy and Biomass Burned Estimation Under Sparse Satellite Sampling Conditions: Using Power Law Probability Distribution Properties of MODIS Fire Radiative Power Retrievals

    Science.gov (United States)

    Sathyachandran, S.; Roy, D. P.; Boschetti, L.

    2010-12-01

Spatially and temporally explicit mapping of the amount of biomass burned by fire is needed to estimate atmospheric emissions of greenhouse gases and aerosols. The instantaneous Fire Radiative Power (FRP) [units: W] is retrieved at active fire detections from mid-infrared wavelength remotely sensed data and can be used to estimate the rate of biomass consumed. Temporal integration of FRP measurements over the duration of the fire provides the Fire Radiative Energy (FRE) [units: J], which has been shown to be linearly related to the total biomass burned [units: g]. However, FRE, and thus biomass burned retrieval, is sensitive to the satellite spatial and temporal sampling of FRP, which can be sparse under cloudy conditions and with polar orbiting sensors such as MODIS. In this paper the FRE is derived in a new way as the product of the fire duration and the first moment of the FRP power law probability distribution. MODIS FRP data retrieved over savanna fires in Australia and deforestation fires in Brazil are shown to have power law distributions with different scaling parameters that are related to the fire energy in these two contrasting systems. The FRE-derived burned biomass estimates computed using this new method are compared to estimates using the conventional temporal FRP integration method and with literature values. The results of the comparison suggest that the new method may provide more reliable burned biomass estimates under sparse satellite sampling conditions if the fire duration and the power law distribution parameters are characterized a priori.
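The estimator described above multiplies fire duration by the first moment of the fitted FRP power law. For a power law truncated to [xmin, xmax] the first moment has a closed form; a sketch with illustrative parameter values (the exponent, bounds and duration are assumptions, not MODIS retrievals):

```python
def powerlaw_mean(alpha, xmin, xmax):
    """First moment of the truncated power-law pdf p(x) = c * x**(-alpha)
    on [xmin, xmax]; valid for alpha != 1 and alpha != 2."""
    c = (1.0 - alpha) / (xmax ** (1.0 - alpha) - xmin ** (1.0 - alpha))
    return c * (xmax ** (2.0 - alpha) - xmin ** (2.0 - alpha)) / (2.0 - alpha)

alpha, xmin, xmax = 2.3, 10.0, 1000.0   # illustrative exponent and FRP bounds [MW]
duration_s = 3 * 3600.0                 # assumed fire duration [s]
mean_frp = powerlaw_mean(alpha, xmin, xmax)   # mean FRP [MW]
fre_mj = duration_s * mean_frp                # FRE [MJ], since MW * s = MJ
```

Because the mean depends only on the fitted exponent and bounds, sparse FRP samples that still constrain the power-law shape can yield an FRE estimate without dense temporal integration — the paper's central idea.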

  4. Ionization compression impact on dense gas distribution and star formation: probability density functions around H ii regions as seen by Herschel

    CERN Document Server

    Tremblin, P; Minier, V; Didelon, P; Hill, T; Anderson, L D; Motte, F; Zavagno, A; André, Ph; Arzoumanian, D; Audit, E; Benedettini, M; Bontemps, S; Csengeri, T; Di Francesco, J; Giannini, T; Hennemann, M; Luong, Q Nguyen; Marston, A P; Peretto, N; Rivera-Ingraham, A; Russeil, D; Rygl, K L J; Spinoglio, L; White, G J

    2014-01-01

    Ionization feedback should impact the probability distribution function (PDF) of the column density around the ionized gas. We aim to quantify this effect and discuss its potential link to the Core and Initial Mass Function (CMF/IMF). We used in a systematic way Herschel column density maps of several regions observed within the HOBYS key program: M16, the Rosette and Vela C molecular cloud, and the RCW 120 H ii region. We fitted the column density PDFs of all clouds with two lognormal distributions, since they present a double-peak or enlarged shape in the PDF. Our interpretation is that the lowest part of the column density distribution describes the turbulent molecular gas while the second peak corresponds to a compression zone induced by the expansion of the ionized gas into the turbulent molecular cloud. The condensations at the edge of the ionized gas have a steep compressed radial profile, sometimes recognizable in the flattening of the power-law tail. This could lead to an unambiguous criterion able t...

  5. Probability Distribution Functions OF 12CO(J = 1-0) Brightness and Integrated Intensity in M51: The PAWS View

    CERN Document Server

    Hughes, Annie; Schinnerer, Eva; Colombo, Dario; Pety, Jerome; Leroy, Adam K; Dobbs, Clare L; Garcia-Burillo, Santiago; Thompson, Todd A; Dumas, Gaelle; Schuster, Karl F; Kramer, Carsten

    2013-01-01

    We analyse the distribution of CO brightness temperature and integrated intensity in M51 at ~40 pc resolution using new CO data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field, which covers the inner 11 x 7 kpc of M51. We find variations in the shape of CO PDFs within different M51 environments, and between M51 and M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover 1 to 2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities. However, the CO PDFs for different dynamical environments within the PAWS field depart from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO...

  6. Probability distributions of landslide volumes

    OpenAIRE

    M. T. Brunetti; Guzzetti, F.; M. Rossi

    2009-01-01

We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes VL ranging from 10−4 m3 …

  7. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy number set was given. Going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), as well as the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem with the closure property of fuzzy probability operations was given and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP are closed under fuzzy probability operations; as a result, the foundation for perfecting the fuzzy probability operation method is laid.

  8. MERA: a webserver for evaluating backbone torsion angle distributions in dynamic and disordered proteins from NMR data

    Energy Technology Data Exchange (ETDEWEB)

    Mantsyzov, Alexey B. [M.V. Lomonosov Moscow State University, Faculty of Fundamental Medicine (Russian Federation); Shen, Yang; Lee, Jung Ho [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States); Hummer, Gerhard [Max Planck Institute of Biophysics (Germany); Bax, Ad, E-mail: bax@nih.gov [National Institutes of Health, Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases (United States)

    2015-09-15

MERA (Maximum Entropy Ramachandran map Analysis from NMR data) is a new webserver that generates residue-by-residue Ramachandran map distributions for disordered proteins or disordered regions in proteins on the basis of experimental NMR parameters. As input data, the program currently utilizes up to 12 different parameters. These include three different types of short-range NOEs, three types of backbone chemical shifts (15N, 13Cα, and 13C′), six types of J couplings (3JHNHα, 3JC′C′, 3JC′Hα, 1JHαCα, 2JCαN and 1JCαN), as well as the 15N-relaxation derived J(0) spectral density. The Ramachandran map distributions are reported in terms of populations of their 15° × 15° voxels, and an adjustable maximum entropy weight factor is available to ensure that the obtained distributions will not deviate more from a newly derived coil library distribution than required to account for the experimental data. MERA output includes the agreement between each input parameter and its distribution-derived value. As an application, we demonstrate performance of the program for several residues in the intrinsically disordered protein α-synuclein, as well as for several static and dynamic residues in the folded protein GB3.
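The 15° × 15° voxel bookkeeping that MERA reports can be illustrated with a simple binning helper; the indexing convention below is an assumption for illustration, not MERA's documented layout.

```python
def voxel_index(phi, psi, width=15):
    """Map backbone torsion angles (degrees, each in [-180, 180)) to a
    (row, col) index of a (360/width) x (360/width) Ramachandran grid."""
    bins = 360 // width
    row = int((phi + 180.0) // width) % bins
    col = int((psi + 180.0) // width) % bins
    return row, col

def normalize(counts):
    """Turn raw voxel counts into populations that sum to 1."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Toy (phi, psi) samples; accumulate per-voxel populations.
samples = [(-60.0, -45.0), (-63.0, -41.0), (-120.0, 130.0)]
counts = {}
for phi, psi in samples:
    key = voxel_index(phi, psi)
    counts[key] = counts.get(key, 0) + 1
pops = normalize(counts)
```

Per-residue populations over this grid are exactly the kind of object the maximum-entropy weight factor regularizes toward a coil-library reference distribution.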

  9. MERA: a webserver for evaluating backbone torsion angle distributions in dynamic and disordered proteins from NMR data

    International Nuclear Information System (INIS)

    MERA (Maximum Entropy Ramachandran map Analysis from NMR data) is a new webserver that generates residue-by-residue Ramachandran map distributions for disordered proteins or disordered regions in proteins on the basis of experimental NMR parameters. As input data, the program currently utilizes up to 12 different parameters. These include three different types of short-range NOEs, three types of backbone chemical shifts (15N, 13Cα, and 13C′), six types of J couplings (3JHNHα, 3JC′C′, 3JC′Hα, 1JHαCα, 2JCαN and 1JCαN), as well as the 15N-relaxation derived J(0) spectral density. The Ramachandran map distributions are reported in terms of populations of their 15° × 15° voxels, and an adjustable maximum entropy weight factor is available to ensure that the obtained distributions will not deviate more from a newly derived coil library distribution than required to account for the experimental data. MERA output includes the agreement between each input parameter and its distribution-derived value. As an application, we demonstrate performance of the program for several residues in the intrinsically disordered protein α-synuclein, as well as for several static and dynamic residues in the folded protein GB3

  10. Challenging measurement of the 16O+27Al elastic and inelastic angular distributions up to large angles

    Energy Technology Data Exchange (ETDEWEB)

    Cavallaro, M., E-mail: manuela.cavallaro@lns.infn.it [INFN, Laboratori Nazionali del Sud, Via S. Sofia 62, I-95125 Catania (Italy); Cappuzzello, F.; Carbone, D.; Cunsolo, A. [INFN, Laboratori Nazionali del Sud, Via S. Sofia 62, I-95125 Catania (Italy); Dipartimento di Fisica e Astronomia, Universita di Catania, Via S. Sofia 64, I-95125 Catania (Italy); Foti, A. [Dipartimento di Fisica e Astronomia, Universita di Catania, Via S. Sofia 64, I-95125 Catania (Italy); INFN, Sezione di Catania, Via S. Sofia 64, I-95125 Catania (Italy); Linares, R. [Instituto de Fisica, Universidade Federal Fluminense, Litoranea s/n, Gragoata, Niteroi, Rio de Janeiro 24210-340 (Brazil); Pereira, D.; Oliveira, J.R.B.; Gomes, P.R.S.; Lubian, J. [Universidade de Sao Paulo, Departamento de Fisica Nuclear, Instituto de Fisica da Universidade de Sao Paulo, Caixa Postal 66318, 05315-970 Sao Paulo, SP (Brazil); Chen, R. [Institute of Modern Physics, CAS, Lanzhou (China)

    2011-08-21

    The 16O+27Al elastic and inelastic angular distributions have been measured in a broad angular range (13° < θ_lab < 52°) at about 100 MeV incident energy. The use of the MAGNEX large acceptance magnetic spectrometer and of the ray-reconstruction analysis technique has been crucial in order to provide, in the same experiment, high-resolution energy spectra and cross-section measurements distributed over more than seven orders of magnitude, down to hundreds of nb/sr.

  11. Scale-by-scale analysis of probability distributions for global MODIS-AQUA cloud properties: how the large scale signature of turbulence may impact statistical analyses of clouds

    Directory of Open Access Journals (Sweden)

    M. de la Torre Juárez

    2011-03-01

    Full Text Available Means, standard deviations, homogeneity parameters used in models based on their ratio, and the probability distribution functions (PDFs) of cloud properties from the MODerate Resolution Imaging Spectroradiometer (MODIS) are estimated globally as a function of averaging scale varying from 5 to 500 km. The properties – cloud fraction, droplet effective radius, and liquid water path – all matter for cloud-climate uncertainty quantification and reduction efforts. Global means and standard deviations are confirmed to change with scale. For the range of scales considered, global means vary only within 3% for cloud fraction, 7% for liquid water path, and 0.2% for cloud particle effective radius. These scale dependences contribute to the uncertainties in their global budgets. Scale dependences of the standard deviations and generalized flatness are compared to predictions for turbulent systems. Analytical expressions are identified that fit best to each observed PDF. While the best analytical PDF fit to each variable differs, all PDFs are well described by lognormal functions when the mean is normalized by the standard deviation inside each averaging domain. Importantly, lognormal distributions yield significantly better fits to the observations than Gaussians at all scales. This suggests a possible approach for both sub-grid and unified stochastic modeling of these variables at all scales. The results also highlight the need to establish an adequate spatial resolution for two-stream radiative studies of cloud-climate interactions.
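
    The lognormal-versus-Gaussian comparison can be illustrated by maximum-likelihood fitting both families to a synthetic positively skewed sample and comparing log-likelihoods (invented parameters, not MODIS data):

    ```python
    import math, random

    def normal_loglik(xs, mu, sigma):
        """Log-likelihood of data under a Gaussian with the given parameters."""
        return sum(-0.5 * math.log(2 * math.pi * sigma**2)
                   - (x - mu)**2 / (2 * sigma**2) for x in xs)

    def lognormal_loglik(xs, mu, sigma):
        """Log-likelihood under a lognormal: density has an extra 1/x factor
        and operates on ln(x)."""
        return sum(-math.log(x) - 0.5 * math.log(2 * math.pi * sigma**2)
                   - (math.log(x) - mu)**2 / (2 * sigma**2) for x in xs)

    random.seed(42)
    # synthetic positively skewed "cloud property" sample (e.g. liquid water path)
    xs = [random.lognormvariate(3.0, 0.8) for _ in range(5000)]

    # maximum-likelihood fits of each family
    n = len(xs)
    mu_n = sum(xs) / n
    sd_n = math.sqrt(sum((x - mu_n)**2 for x in xs) / n)
    logs = [math.log(x) for x in xs]
    mu_l = sum(logs) / n
    sd_l = math.sqrt(sum((l - mu_l)**2 for l in logs) / n)

    ll_gauss = normal_loglik(xs, mu_n, sd_n)
    ll_logn = lognormal_loglik(xs, mu_l, sd_l)
    ```

    For skewed data of this kind the lognormal fit attains the higher log-likelihood, mirroring the paper's finding that lognormals beat Gaussians at all scales.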

  12. Characteristics of the spatiotemporal distribution of daily extreme temperature events in China: Minimum temperature records in different climate states against the background of the most probable temperature

    Institute of Scientific and Technical Information of China (English)

    Qian Zhong-Hua; Hu Jing-Guo; Feng Guo-Lin; Cao Yong-Zhong

    2012-01-01

    Based on the skewed function, the most probable temperature is defined and the spatiotemporal distributions of the frequencies and strengths of extreme temperature events in different climate states over China are investigated, where the climate states are referred to as State Ⅰ, State Ⅱ and State Ⅲ, i.e., the daily minimum temperature records of 1961-1990, 1971-2000, and 1981-2009. The results show that in space the frequency of high temperature events in summer decreases clearly in the lower and middle reaches of the Yellow River in State Ⅰ and that low temperature events decrease in northern China in State Ⅱ. In the present state, the frequency of high temperature events increases significantly in most areas over China except the northeast, while the frequency of low temperature events decreases mainly in north China and the regions between the Yangtze River and the Yellow River. The distributions of frequencies and strengths of extreme temperature events are consistent in space. The analysis of the time evolution of extreme events shows that the occurrence of high temperature events becomes higher with the change in state, while that of low temperature events decreases. High temperature events are also becoming stronger and deserve special attention.

  13. Position-probability-sampled Monte Carlo calculation of VMAT, 3DCRT, step-shoot IMRT, and helical tomotherapy dose distributions using BEAMnrc/DOSXYZnrc

    International Nuclear Information System (INIS)

    Purpose: The commercial release of volumetric modulated arc therapy techniques using a conventional linear accelerator and the growing number of helical tomotherapy users have triggered renewed interest in dose verification methods, and also in tools for exploring the impact of machine tolerance and patient motion on dose distributions without the need to approximate time-varying parameters such as gantry position, MLC leaf motion, or patient motion. To this end, we have developed a Monte Carlo-based calculation method capable of simulating a wide variety of treatment techniques without the need to resort to discretization approximations. Methods: The ability to perform complete position-probability-sampled Monte Carlo dose calculations was implemented in the BEAMnrc/DOSXYZnrc user codes of EGSnrc. The method includes full accelerator head simulations of our tomotherapy and Elekta linacs, and a realistic representation of continuous motion via the sampling of a time variable. The functionality of this algorithm was tested via comparisons with both measurements and treatment planning dose distributions for four types of treatment techniques: 3D conformal, step-shoot intensity modulated radiation therapy, helical tomotherapy, and volumetric modulated arc therapy. Results: For static fields, the absolute dose agreement between the EGSnrc Monte Carlo calculations and measurements is within 2%/1 mm. Absolute dose agreement between Monte Carlo calculations and the treatment planning system for the four different treatment techniques is within 3%/3 mm. Discrepancies with the tomotherapy TPS on the order of 10%/5 mm were observed for the extreme example of a small target located 15 cm off-axis and planned with a low modulation factor. The increase in simulation time associated with using position-probability sampling, as opposed to the discretization approach, was less than 2% in most cases. Conclusions: A single Monte Carlo simulation method can be used to calculate patient
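
    The position-probability-sampling idea, choosing each source particle's "time" at random and interpolating machine parameters continuously instead of discretizing the arc into static beams, can be sketched for the gantry angle alone (the control points below are invented and stand in for an actual arc plan):

    ```python
    import bisect
    import random

    # control points of an arc: (cumulative MU fraction, gantry angle in degrees)
    control_points = [(0.0, 180.0), (0.25, 90.0), (0.6, 0.0), (1.0, -90.0)]

    def sample_gantry_angle(rng):
        """Sample one source particle's 'time' uniformly over delivered MU and
        linearly interpolate the gantry angle between control points, so the
        simulated motion is continuous rather than a set of static beams."""
        t = rng.random()
        fracs = [cp[0] for cp in control_points]
        i = bisect.bisect_right(fracs, t) - 1
        i = min(i, len(control_points) - 2)
        (t0, a0), (t1, a1) = control_points[i], control_points[i + 1]
        return a0 + (a1 - a0) * (t - t0) / (t1 - t0)

    rng = random.Random(7)
    angles = [sample_gantry_angle(rng) for _ in range(20000)]
    ```

    Every history sees a different gantry position drawn from the delivery's own probability density, which is why no time-step discretization error appears.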

  14. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  15. The transformation of frequency distributions of winter precipitation to spring streamflow probabilities in cold regions; case studies from the Canadian Prairies

    Science.gov (United States)

    Shook, Kevin; Pomeroy, John; van der Kamp, Garth

    2015-02-01

    Hydrological processes alter the states and/or locations of water, and so they can be regarded as being transformations of the properties of the time series of input variables to those of output variables, such as the transformation of precipitation to streamflow. Semi-arid cold regions such as the Canadian Prairies have extremely low annual streamflow efficiencies because of high infiltration rates, large surface water storage capacities, high evaporation rates and strong climate seasonality. As a result snowfall produces the majority of streamflow. It is demonstrated that the probability distributions of Prairie spring streamflows are controlled by three frequency transformations. The first is the transformation of snowfall by wind redistribution and ablation over the winter to form the spring snowpack. The second transformation is the melt of the spring snowpack to produce runoff over frozen agricultural soils. The third is the transformation of runoff to streamflow by the filling and spilling of depressional storage by connecting fields, ponds, wetlands and lakes. Each transformation of the PDF of the input variable to that of the output variable is demonstrated at a number of locations in the Canadian Prairies and is explained in terms of the hydrological processes causing the transformation. The resulting distributions are highly modified from that of precipitation, and the modification depends on which processes dominate streamflow formation in each basin. The results demonstrate the need to consider the effect of the interplay among hydrological processes, climate and basin characteristics in transforming precipitation frequency distributions into those of streamflow for the design of infrastructure and for water management.
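
    The three-stage transformation of the snowfall PDF can be mimicked with a toy Monte Carlo push-forward; the retention, infiltration, and storage coefficients below are invented for illustration and are not the paper's calibrated values.

    ```python
    import random
    import statistics

    def transform_chain(snowfall_mm, rng):
        """Toy push-forward of one winter snowfall value through the three
        transformations described above (illustrative coefficients only)."""
        # 1) wind redistribution and ablation: multiplicative, variable retention
        swe = snowfall_mm * rng.uniform(0.5, 0.9)
        # 2) snowmelt runoff over frozen soil: fraction above an infiltration loss
        runoff = max(swe - 25.0, 0.0) * 0.8
        # 3) fill-and-spill: depressional storage must fill before streamflow occurs
        storage = rng.uniform(10.0, 40.0)
        return max(runoff - storage, 0.0)

    rng = random.Random(1)
    snow = [rng.gauss(120.0, 30.0) for _ in range(10000)]
    flow = [transform_chain(s, rng) for s in snow]
    zero_flow_fraction = sum(1 for q in flow if q == 0.0) / len(flow)
    ```

    The output distribution has a point mass at zero and a shape quite unlike the Gaussian input, illustrating how each process reshapes the frequency distribution rather than merely scaling it.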

  16. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control

    International Nuclear Information System (INIS)

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σd; whilst the quantities d and σd depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular, this error decreases with increasing biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10⁸ from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity.
The possibility of using the error on the tcp to
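
    The underestimation mechanism can be reproduced with a minimal Poisson tcp sketch (the α value, clonogen count, and noise level are illustrative, not the paper's phantom setup): because the surviving fraction exp(-αd) is convex in dose, zero-mean statistical noise inflates the expected number of surviving clonogens and therefore lowers the computed tcp.

    ```python
    import math
    import random

    ALPHA = 0.35                # 1/Gy, illustrative radiosensitivity
    CLONOGENS_PER_VOXEL = 1e4   # illustrative clonogen density

    def tcp_poisson(doses):
        """Poisson tcp: probability that no clonogen survives,
        tcp = exp(-sum_i N_i * exp(-alpha * d_i))."""
        surviving = sum(CLONOGENS_PER_VOXEL * math.exp(-ALPHA * d) for d in doses)
        return math.exp(-surviving)

    rng = random.Random(0)
    true_dose = [60.0] * 1000   # uniform 60 Gy over 1000 tumour voxels
    tcp_true = tcp_poisson(true_dose)

    # add zero-mean noise (sigma = 2% of dose), mimicking a finite-history MC grid
    noisy = [d + rng.gauss(0.0, 0.02 * d) for d in true_dose]
    tcp_noisy = tcp_poisson(noisy)
    ```

    Averaging the convex surviving fraction over noisy doses always biases the clonogen count upward, so tcp_noisy sits below tcp_true even though the noise has zero mean.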

  17. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such complex evaluation processes. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status; the membership of water quality in the various output fuzzy sets or classes is reported with percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. PMID:23266912
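
    A minimal sketch of the kernel-smoothing plus Monte Carlo step, with invented dissolved-oxygen values standing in for the Cauca River monitoring data:

    ```python
    import math
    import random

    def gaussian_kde_pdf(data, h):
        """Return a Gaussian kernel-smoothed density estimate built from
        monitoring data, with bandwidth h."""
        n = len(data)
        def pdf(x):
            return sum(math.exp(-0.5 * ((x - d) / h)**2)
                       for d in data) / (n * h * math.sqrt(2 * math.pi))
        return pdf

    def sample_from_kde(data, h, rng):
        """Sampling from a Gaussian KDE: pick a data point, then jitter it
        by the kernel; these draws feed the Monte Carlo runs of the model."""
        return rng.choice(data) + rng.gauss(0.0, h)

    # annual dissolved-oxygen monitoring values (mg/L, illustrative numbers only)
    do_obs = [5.1, 5.8, 6.0, 4.2, 6.5, 5.5, 4.9, 6.1, 5.7, 5.3]
    h = 0.4
    pdf = gaussian_kde_pdf(do_obs, h)

    rng = random.Random(3)
    draws = [sample_from_kde(do_obs, h, rng) for _ in range(5000)]
    mean_draw = sum(draws) / len(draws)
    ```

    Each Monte Carlo iteration would draw one such value per variable and push the vector through the fuzzy inference system, yielding a distribution of index values rather than a single number.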

  18. Scoliosis angle

    International Nuclear Information System (INIS)

    The most commonly used methods of assessing the scoliotic deviation measure angles that are not clearly defined in relation to the anatomy of the patient. In order to give an anatomic basis for such measurements it is proposed to define the scoliotic deviation as the deviation the vertebral column makes with the sagittal plane. Both the Cobb and the Ferguson angles may be based on this definition. The present methods of measurement are then attempts to measure these angles. If the plane of these angles is parallel to the film, the measurement will be correct. Errors in the measurements may be incurred by the projection. A hypothetical projection, called a 'rectified orthogonal projection', is presented, which correctly represents all scoliotic angles in accordance with these principles. It can be constructed in practice with the aid of a computer and by performing measurements on two projections of the vertebral column; a scoliotic curve can be represented independent of the kyphosis and lordosis. (Auth.)
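
    The projection error discussed above can be illustrated by comparing the angle between two 3D direction vectors with the angle between their orthogonal projections onto a film plane (the vectors are invented; they stand in for vertebral endplate directions tilted out of the film plane):

    ```python
    import math

    def angle_deg(u, v):
        """Angle between two 3D vectors, in degrees."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(a * a for a in v))
        return math.degrees(math.acos(dot / (nu * nv)))

    def project_to_film(v):
        """Orthogonal projection onto the x-z film plane (drop the y component),
        mimicking a plain frontal radiograph."""
        return (v[0], 0.0, v[2])

    # two direction vectors tilted out of the film plane by equal y components
    upper = (0.5, 0.3, 1.0)
    lower = (-0.4, 0.3, 1.0)

    true_angle = angle_deg(upper, lower)
    film_angle = angle_deg(project_to_film(upper), project_to_film(lower))
    ```

    The projected angle differs from the true spatial angle by more than a degree in this example; only when the plane of the angle is parallel to the film do the two coincide, which is the error source the proposed "rectified orthogonal projection" is meant to remove.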

  19. Anisotropic pitch angle distribution of ~100 keV microburst electrons in the loss cone: measurements from STSAT-1

    Directory of Open Access Journals (Sweden)

    J. J. Lee

    2012-11-01

    Full Text Available Electron microburst energy spectra in the range of 170 keV to 360 keV have been measured using two solid-state detectors onboard the low-altitude (680 km), polar-orbiting Korean STSAT-1 (Science and Technology SATellite-1). Applying a unique capability of the spacecraft attitude control system, microburst energy spectra have been accurately resolved into two components: perpendicular to and parallel to the geomagnetic field direction. The former measures trapped electrons and the latter those electrons with pitch angles in the loss cone and precipitating into the atmosphere. It is found that the perpendicular component energy spectra are harder than the parallel component and the loss cone is not completely filled by the electrons in the energy range of 170 keV to 360 keV. These results have been modeled assuming a wave-particle cyclotron resonance mechanism, where higher energy electrons travelling within a magnetic flux tube interact with whistler mode waves at higher latitudes (lower altitudes). Our results suggest that because higher energy (relativistic) microbursts do not fill the loss cone completely, only a small portion of electrons is able to reach the low-altitude (~100 km) atmosphere. Thus, assuming that low energy microbursts and relativistic microbursts are created by cyclotron resonance with chorus elements (but at different locations), the low energy portion of the microburst spectrum will dominate at low altitudes. This explains why relativistic microbursts have not been observed by balloon experiments, which typically float at altitudes of ~30 km and measure only X-ray flux produced by collisions between neutral atmospheric particles and precipitating electrons.

  20. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators. By...

  1. Generating target probability sequences and events

    OpenAIRE

    Ella, Vaignana Spoorthy

    2013-01-01

    Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.
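
    One standard way to generate an event of target probability p from fair coin flips (a sketch of the general idea, not necessarily the paper's oblivious-transfer construction) is to compare the flips against the binary expansion of p:

    ```python
    import random

    def bernoulli_from_coin(p, rng, max_bits=64):
        """Generate one event of probability p using only fair coin flips:
        the flips spell out a uniform binary fraction, and the first flip that
        differs from p's binary expansion decides whether that fraction is
        below p. The expected number of flips is 2, independent of p."""
        for _ in range(max_bits):
            p *= 2
            bit = 1 if p >= 1.0 else 0   # next binary digit of the target p
            if p >= 1.0:
                p -= 1.0
            flip = rng.getrandbits(1)    # one fair coin flip
            if flip != bit:
                return flip < bit        # flip=0 while bit=1 means "below p"
        return False                     # unresolved after max_bits: prob 2^-64

    rng = random.Random(11)
    trials = 200000
    hits = sum(bernoulli_from_coin(0.3, rng) for _ in range(trials))
    freq = hits / trials
    ```

    The empirical frequency converges to the target probability, and the same comparison trick extends to generating sequences with prescribed per-symbol distributions.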

  2. Probability distribution functions of 12CO(J = 1 → 0) brightness and integrated intensity in M51: The PAWS view

    International Nuclear Information System (INIS)

    We analyze the distribution of CO brightness temperature and integrated intensity in M51 at ∼40 pc resolution using new 12CO(J = 1 → 0) data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field of view, which covers the inner ∼11 × 7 kpc of M51. We find clear variations in the shape of CO PDFs both within different M51 environments, defined according to dynamical criteria, and among M51 and two nearby low-mass galaxies, M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover ∼1-2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities, consistent with their lower gas surface densities. However, the CO PDFs for different dynamical environments within the PAWS field depart significantly from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO emission. The observed environmental dependence on the shape of the CO PDFs is qualitatively consistent with changes that would be expected if molecular gas in the spiral arms is characterized by a larger range of average densities, gas temperatures, and velocity fluctuations, although further work is required to disentangle the relative importance of large-scale dynamical effects versus star formation feedback in regulating these properties. We show that the shape of the CO PDFs for different M51 environments is only weakly related to global properties of the CO emission, e.g., the total CO luminosity, but is strongly correlated with properties of the local giant molecular cloud (GMC) and young stellar cluster populations, including the shape of their mass distributions. For galaxies with

  3. Probability distribution functions of 12CO(J = 1 → 0) brightness and integrated intensity in M51: The PAWS view

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Annie; Meidt, Sharon E.; Schinnerer, Eva; Colombo, Dario [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Pety, Jerôme; Dumas, Gaëlle; Schuster, Karl F. [Institut de Radioastronomie Millimétrique, 300 Rue de la Piscine, F-38406 Saint Martin d' Hères (France); Leroy, Adam K. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Dobbs, Clare L. [School of Physics and Astronomy, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); García-Burillo, Santiago [Observatorio Astronómico Nacional, Observatorio de Madrid, Alfonso XII, 3, E-28014 Madrid (Spain); Thompson, Todd A. [Department of Astronomy, The Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Kramer, Carsten [Instituto Radioastronomía Milimétrica, Avenida Divina Pastora 7, Nucleo Central, E-18012 Granada (Spain)

    2013-12-10

    We analyze the distribution of CO brightness temperature and integrated intensity in M51 at ∼40 pc resolution using new 12CO(J = 1 → 0) data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field of view, which covers the inner ∼11 × 7 kpc of M51. We find clear variations in the shape of CO PDFs both within different M51 environments, defined according to dynamical criteria, and among M51 and two nearby low-mass galaxies, M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover ∼1-2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities, consistent with their lower gas surface densities. However, the CO PDFs for different dynamical environments within the PAWS field depart significantly from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO emission. The observed environmental dependence on the shape of the CO PDFs is qualitatively consistent with changes that would be expected if molecular gas in the spiral arms is characterized by a larger range of average densities, gas temperatures, and velocity fluctuations, although further work is required to disentangle the relative importance of large-scale dynamical effects versus star formation feedback in regulating these properties. We show that the shape of the CO PDFs for different M51 environments is only weakly related to global properties of the CO emission, e.g., the total CO luminosity, but is strongly correlated with properties of the local giant molecular cloud (GMC) and young stellar cluster populations, including the shape of their mass distributions. For

  4. The neolithic demographic transition in Europe: correlation with juvenility index supports interpretation of the summed calibrated radiocarbon date probability distribution (SCDPD) as a valid demographic proxy.

    Directory of Open Access Journals (Sweden)

    Sean S Downey

    Full Text Available Analysis of the proportion of immature skeletons recovered from European prehistoric cemeteries has shown that the transition to agriculture after 9000 BP triggered a long-term increase in human fertility. Here we compare the largest analysis of European cemeteries to date with an independent line of evidence, the summed calibrated date probability distribution of radiocarbon dates (SCDPD) from archaeological sites. Our cemetery reanalysis confirms increased growth rates after the introduction of agriculture; the radiocarbon analysis also shows this pattern, and a significant correlation between both lines of evidence confirms the demographic validity of SCDPDs. We analyze the areal extent of Neolithic enclosures and demographic data from ethnographically known farming and foraging societies and we estimate differences in population levels at individual sites. We find little effect on the overall shape and precision of the SCDPD and we observe a small increase in the correlation with the cemetery trends. The SCDPD analysis supports the hypothesis that the transition to agriculture dramatically increased demographic growth, but it was followed within centuries by a general pattern of collapse even after accounting for higher settlement densities during the Neolithic. The study supports the unique contribution of SCDPDs as a valid demographic proxy for the demographic patterns associated with early agriculture.

  5. Evaluation of the mercury contamination in mushrooms of genus Leccinum from two different regions of the world: Accumulation, distribution and probable dietary intake.

    Science.gov (United States)

    Falandysz, Jerzy; Zhang, Ji; Wang, Yuanzhong; Krasińska, Grażyna; Kojta, Anna; Saba, Martyna; Shen, Tao; Li, Tao; Liu, Honggao

    2015-12-15

    This study focused on investigation of the accumulation and distribution of mercury (Hg) in mushrooms of the genus Leccinum that emerged on soils of totally different geochemical bedrock composition. Hg in 6 species from geographically diverse regions of the mercuriferous belt areas in Yunnan of SW China, and 8 species from the non-mercuriferous regions of Poland in Europe was measured. Also assessed was the probable dietary intake of Hg from consumption of Leccinum spp., which are traditional organic food items in SW China and Poland. The results showed that L. chromapes, L. extremiorientale, L. griseum and L. rugosicepes are good accumulators of Hg and the sequestered Hg in caps were up to 4.8, 3.5, 3.6 and 4.7 mg Hg kg⁻¹ dry matter, respectively. Leccinum mushrooms from Poland also efficiently accumulated Hg with their average Hg content being an order of magnitude lower due to low concentrations of Hg in forest topsoil of Poland compared to the elevated contents in Yunnan. Consumption of Leccinum mushrooms with elevated Hg contents in Yunnan at rates of up to 300 g fresh product per week during the foraging season would not result in Hg intake that exceeds the provisional weekly tolerance limit of 0.004 mg kg⁻¹ body mass, assuming no Hg ingestion from other foods. PMID:26322595

  6. Summed Probability Distribution of 14C Dates Suggests Regional Divergences in the Population Dynamics of the Jomon Period in Eastern Japan.

    Science.gov (United States)

    Crema, Enrico R; Habu, Junko; Kobayashi, Kenichi; Madella, Marco

    2016-01-01

    Recent advances in the use of summed probability distribution (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes. PMID:27128032
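
    The SPD construction can be sketched with Gaussians standing in for calibrated date densities (real analyses calibrate each 14C date against a curve such as IntCal; the dates and grid below are invented):

    ```python
    import math

    def summed_probability_distribution(dates, grid):
        """Toy SPD: each date contributes a normalized (here Gaussian) calibrated
        density on the calendar grid, so every date carries unit mass; the SPD is
        their normalized sum."""
        spd = [0.0] * len(grid)
        for mean, sd in dates:
            dens = [math.exp(-0.5 * ((g - mean) / sd)**2) for g in grid]
            total = sum(dens)
            for i, d in enumerate(dens):
                spd[i] += d / total
        s = sum(spd)
        return [v / s for v in spd]

    grid = list(range(3000, 7001, 10))   # calendar grid, cal BP
    # invented (mean, sd) pairs standing in for calibrated 14C dates
    dates = [(5500, 80), (5450, 60), (4100, 90), (4150, 70), (6200, 100)]
    spd = summed_probability_distribution(dates, grid)
    peak_cal_bp = grid[spd.index(max(spd))]
    ```

    Clusters of overlapping dates reinforce each other in the sum, which is why SPD peaks are read as periods of higher dated activity; permutation tests then ask whether regional SPDs diverge more than random reshuffling of dates would allow.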

  7. Summed Probability Distribution of 14C Dates Suggests Regional Divergences in the Population Dynamics of the Jomon Period in Eastern Japan.

    Directory of Open Access Journals (Sweden)

    Enrico R Crema

    Full Text Available Recent advances in the use of summed probability distribution (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes.

  8. Summed Probability Distribution of 14C Dates Suggests Regional Divergences in the Population Dynamics of the Jomon Period in Eastern Japan

    Science.gov (United States)

    Habu, Junko; Kobayashi, Kenichi; Madella, Marco

    2016-01-01

    Recent advances in the use of summed probability distribution (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlation with climatic changes. PMID:27128032

  9. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    Science.gov (United States)

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
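    The "optimal local combination" underlying a hyper-ensemble can be illustrated with a least-squares toy model: past observations are regressed on co-located forecasts from several models, and the fitted weights combine the models' current forecasts. The data and weights below are synthetic assumptions, not the NURC implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, n_past = 3, 50

# Synthetic training window: past forecasts from each model and the
# drifter velocities actually observed at those times/locations.
past_forecasts = rng.normal(size=(n_past, n_models))
true_weights = np.array([0.5, 0.3, 0.2])
past_obs = past_forecasts @ true_weights + rng.normal(scale=0.01, size=n_past)

# Fit local combination weights by least squares, then apply them
# to the models' current forecasts.
weights, *_ = np.linalg.lstsq(past_forecasts, past_obs, rcond=None)
current_forecasts = rng.normal(size=n_models)
combined = current_forecasts @ weights
```

    In practice the regression is redone locally in space and time, which is what allows the ensemble to correct region-dependent model biases.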

  10. The Chandra COSMOS Legacy Survey: Clustering of X-ray selected AGN at 2.9 ≤ z ≤ 5.5 using photometric redshift Probability Distribution Functions

    CERN Document Server

    Allevato, V; Finoguenov, A; Marchesi, S; Zamorani, G; Hasinger, G; Salvato, M; Miyaji, T; Gilli, R; Cappelluti, N; Brusa, M; Suh, H; Lanzuisi, G; Trakhtenbrot, B; Griffiths, R; Vignali, C; Schawinski, K; Karim, A

    2016-01-01

    We present the measurement of the projected and redshift-space 2-point correlation function (2pcf) of the new catalog of Chandra COSMOS-Legacy AGN at 2.9 ≤ z ≤ 5.5 (⟨L_bol⟩ ~ 10^46 erg/s) using the generalized clustering estimator based on phot-z probability distribution functions (Pdfs) in addition to any available spec-z. We model the projected 2pcf estimated using π_max = 200 h^-1 Mpc with the 2-halo term and we derive a bias at z ~ 3.4 equal to b = 6.6 (+0.60/−0.55), which corresponds to a typical mass of the hosting halos of log M_h = 12.83 (+0.12/−0.11) h^-1 M_⊙. A similar bias is derived using the redshift-space 2pcf, modelled including the typical phot-z error σ_z = 0.052 of our sample at z ≥ 2.9. Once we integrate the projected 2pcf up to π_max = 200 h^-1 Mpc, the bias of XMM and Chandra COSMOS at z = 2.8 used in Allevato et al. (2014) is consistent with our results at higher redshift. The results suggest only...

  11. Moments of the Hilbert-Schmidt probability distributions over determinants of real two-qubit density matrices and of their partial transposes

    CERN Document Server

    Slater, Paul B

    2010-01-01

    The nonnegativity of the determinant of the partial transpose of a two-qubit (4 x 4) density matrix is both a necessary and sufficient condition for its separability. While the determinant is restricted to the interval [0,1/256], the determinant of the partial transpose can range over [-1/16,1/256], with negative values corresponding to entangled states. We report here the exact values of the first nine moments of the probability distribution of the partial transpose over this interval, with respect to the Hilbert-Schmidt (metric volume element) measure on the nine-dimensional convex set of real two-qubit density matrices. Rational functions C_{2 j}(m), yielding the coefficients of the 2j-th power of even polynomials occurring at intermediate steps in our derivation of the m-th moment, emerge. These functions possess poles at finite series of consecutive half-integers (m=-3/2,-1/2,...,(2j-1)/2), and certain (trivial) roots at finite series of consecutive natural numbers (m=0, 1,...). Additionally, the (nontri...

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
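    A concrete instance of the construction, stated here for illustration: in the multinomial logit case the CPGF is the familiar log-sum of exponentiated utilities, and its gradient recovers the logit choice probabilities.

```latex
G(u_1,\dots,u_J) = \log \sum_{j=1}^{J} e^{u_j},
\qquad
P_i = \frac{\partial G}{\partial u_i} = \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}} .
```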

  13. Angle-resolved intensity and energy distributions of positive and negative hydrogen ions released from tungsten surface by molecular hydrogen ion impact

    Science.gov (United States)

    Kato, S.; Tanaka, N.; Sasao, M.; Kisaki, M.; Tsumori, K.; Nishiura, M.; Matsumoto, Y.; Kenmotsu, T.; Wada, M.; Yamaoka, H.

    2015-08-01

    Hydrogen ion reflection properties have been investigated following the injection of H+, H2+ and H3+ ions onto a polycrystalline W surface. Angle- and energy-resolved intensity distributions of both scattered H+ and H- ions are measured by a magnetic momentum analyzer. We have detected atomic hydrogen ions reflected from the surface, while molecular hydrogen ions are unobserved within our detection limit. The reflected hydrogen ion energy is approximately less than one-third of the incident beam energy for H3+ ion injection and less than a half of that for H2+ ion injection. Other reflection properties are very similar to those of monoatomic H+ ion injection. Experimental results are compared to the classical trajectory simulations using the ACAT code based on the binary collision approximation.

  14. Angle-resolved intensity and energy distributions of positive and negative hydrogen ions released from tungsten surface by molecular hydrogen ion impact

    International Nuclear Information System (INIS)

    Hydrogen ion reflection properties have been investigated following the injection of H+, H2+ and H3+ ions onto a polycrystalline W surface. Angle- and energy-resolved intensity distributions of both scattered H+ and H− ions are measured by a magnetic momentum analyzer. We have detected atomic hydrogen ions reflected from the surface, while molecular hydrogen ions are unobserved within our detection limit. The reflected hydrogen ion energy is approximately less than one-third of the incident beam energy for H3+ ion injection and less than a half of that for H2+ ion injection. Other reflection properties are very similar to those of monoatomic H+ ion injection. Experimental results are compared to the classical trajectory simulations using the ACAT code based on the binary collision approximation

  15. Angle-resolved intensity and energy distributions of positive and negative hydrogen ions released from tungsten surface by molecular hydrogen ion impact

    Energy Technology Data Exchange (ETDEWEB)

    Kato, S., E-mail: eun1302@mail4.doshsha.ac.jp [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Tanaka, N. [Institute of Laser Engineering, Osaka University, Suita, Osaka 565-0871 (Japan); Sasao, M. [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Kisaki, M.; Tsumori, K. [National Institute for Fusion Science, Toki, Gifu 509-5292 (Japan); Nishiura, M. [University of Tokyo, Kashiwa, Chiba 277-8568 (Japan); Matsumoto, Y. [Tokushima Bunri University, Yamashiro, Tokushima 770-8514 (Japan); Kenmotsu, T.; Wada, M. [Doshisha University, Kyotanabe, Kyoto 610-0321 (Japan); Yamaoka, H. [RIKEN SPring-8 Center, Sayo, Hyogo 679-5148 (Japan)

    2015-08-15

    Hydrogen ion reflection properties have been investigated following the injection of H+, H2+ and H3+ ions onto a polycrystalline W surface. Angle- and energy-resolved intensity distributions of both scattered H+ and H− ions are measured by a magnetic momentum analyzer. We have detected atomic hydrogen ions reflected from the surface, while molecular hydrogen ions are unobserved within our detection limit. The reflected hydrogen ion energy is approximately less than one-third of the incident beam energy for H3+ ion injection and less than a half of that for H2+ ion injection. Other reflection properties are very similar to those of monoatomic H+ ion injection. Experimental results are compared to the classical trajectory simulations using the ACAT code based on the binary collision approximation.

  16. Winding angle distribution for planar random walk, polymer ring entangled with an obstacle, and all that: Spitzer-Edwards-Prager-Frisch model revisited

    International Nuclear Information System (INIS)

    Using a general Green function formulation, we re-derive both (i) the results of Spitzer and his followers for the winding angle distribution of planar Brownian motion, and (ii) the Edwards-Prager-Frisch results on the statistical mechanics of a ring polymer entangled with a straight bar. In the statistical mechanics part, we consider both cases of quenched and annealed topology. Among new results, we compute exactly the expectation value of the surface area of the locus of points such that each of them has linking number n with a given closed random walk trajectory (ring polymer). We also consider generalizations of the problem to a finite-diameter (disc-like) obstacle and to winding within a cavity
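    The Spitzer result referred to above can be stated compactly: for planar Brownian motion started away from the origin, the winding angle θ(t), rescaled by ln t, converges in distribution to a standard Cauchy law,

```latex
\lim_{t \to \infty} \Pr\!\left[ \frac{2\,\theta(t)}{\ln t} \le x \right]
= \int_{-\infty}^{x} \frac{\mathrm{d}y}{\pi \left( 1 + y^{2} \right)} .
```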

  17. Distribution of the Reynolds stress tensor inside tip leakage vortex of a linear compressor cascade (I): effect of inlet flow angle

    International Nuclear Information System (INIS)

    A steady-state Reynolds-averaged Navier-Stokes simulation was conducted to investigate the distribution of the Reynolds stress tensor inside the tip leakage vortex of a linear compressor cascade. Two different inlet flow angles, β = 29.3° (design condition) and 36.5° (off-design condition), at a constant tip clearance size of 1% blade span were considered. Classical methods of solid mechanics, applied to view the Reynolds stress tensor in the principal direction system, clearly showed that the highly anisotropic feature of the turbulent flow field was dominant at the outer part of the tip leakage vortex near the suction side of the blade and in the endwall flow separation region, whereas nearly isotropic turbulence was found at the center of the tip leakage vortex. There was no significant difference in the anisotropy of the Reynolds normal stresses inside the tip leakage vortex between the design and off-design conditions
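    The principal-axis view of the Reynolds stress tensor used in the abstract can be sketched in a few lines; the tensor values below are toy numbers, not data from the study.

```python
import numpy as np

# Symmetric Reynolds stress tensor (toy values, units of m^2/s^2)
R = np.array([[4.0, 1.0, 0.2],
              [1.0, 2.0, 0.1],
              [0.2, 0.1, 1.0]])

k = 0.5 * np.trace(R)                 # turbulent kinetic energy
b = R / (2.0 * k) - np.eye(3) / 3.0   # anisotropy tensor (zero if isotropic)
principal = np.linalg.eigvalsh(R)     # Reynolds stresses in principal axes
```

    The eigenvalues give the normal stresses in the principal direction system, and the spread of the anisotropy tensor's eigenvalues quantifies how far the local turbulence is from isotropy.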

  18. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving a...

  19. Review on Gene Expression Models: Probability Distribution

    Institute of Scientific and Technical Information of China (English)

    周天寿

    2012-01-01

    Quantifying gene expression (including mathematical modeling and qualitative and quantitative analysis) is not only an important step toward understanding intracellular processes but also a core topic of current systems biology. Gene expression models have developed from the initial simple single-state models into complicated multi-state models that account for detailed biological processes and numerous biological factors. Based on the central dogma of biology, recent progress in the study of gene expression models is briefly reviewed, focusing on the refinement of mathematical models and on the probability distributions of mRNA and protein numbers. Some general laws of gene expression are summarized, issues deserving further study are discussed, and potential research directions are pointed out.

  20. Multivariate joint probability distribution of droughts in the Wei River basin

    Institute of Scientific and Technical Information of China (English)

    马明卫; 宋松柏; 于艺; 张雨; 李扬

    2012-01-01

    This study models the dependence structure of multivariate drought variables using elliptical copulas. Bivariate dependence was measured with Pearson's classical correlation coefficient γn, Spearman's ρn and Kendall's τn, together with rank scatter plots, Chi-plots and K-plots, while the parameters of the trivariate copulas were estimated with the maximum likelihood method. Goodness of fit was assessed with the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the root-mean-square error (RMSE), and a bootstrap version of Rosenblatt's transformation was used to test the fit of the Gaussian and Student t copulas. For the Wei River basin, the Gaussian copula was selected to model the multivariate joint probability distribution of drought duration, drought severity and severity peak, and the spatial distribution of drought return periods was determined. The results show that both the Gaussian and Student t copulas are applicable, with the Gaussian copula giving the better fit. Prolonged droughts have frequently broken out in the basin with rather short return periods, so more emphasis should be placed on drought forecasting and management in the future.
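    The Gaussian-copula fit described above can be sketched with the normal-scores method, a common approximation to full maximum likelihood. The function name and the synthetic duration/severity/peak data are assumptions for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

def fit_gaussian_copula(data):
    """Fit a Gaussian copula by normal scores: rank-transform each
    margin to (0,1), map through the standard-normal quantile
    function, and estimate the correlation matrix of the scores."""
    n, _ = data.shape
    u = stats.rankdata(data, axis=0) / (n + 1)  # pseudo-observations
    z = stats.norm.ppf(u)                       # normal scores
    return np.corrcoef(z, rowvar=False)

rng = np.random.default_rng(1)
# Synthetic "duration, severity, peak" sample with built-in dependence.
sigma = np.array([[1.0, 0.7, 0.5],
                  [0.7, 1.0, 0.6],
                  [0.5, 0.6, 1.0]])
base = rng.normal(size=(500, 3)) @ np.linalg.cholesky(sigma).T
corr = fit_gaussian_copula(np.exp(base))  # monotone transform leaves the copula unchanged
```

    Because the copula depends only on ranks, any monotone change of the margins (here the exponential) leaves the fitted correlation matrix essentially unchanged.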

  1. Analysis of biopsy outcome after three-dimensional conformal radiation therapy of prostate cancer using dose-distribution variables and tumor control probability models

    International Nuclear Information System (INIS)

    Purpose: To investigate tumor control following three-dimensional conformal radiation therapy (3D-CRT) of prostate cancer and to identify dose-distribution variables that correlate with local control assessed through posttreatment prostate biopsies. Methods and Materials: Data from 132 patients, treated at Memorial Sloan-Kettering Cancer Center (MSKCC), who had a prostate biopsy 2.5 years or more after 3D-CRT for T1c-T3 prostate cancer with prescription doses of 64.8-81 Gy were analyzed. Variables derived from the dose distribution in the PTV included: minimum dose (Dmin), maximum dose (Dmax), mean dose (Dmean), and dose to n% of the PTV (Dn), where n = 1%, ..., 99%. The concept of the equivalent uniform dose (EUD) was evaluated for different values of the surviving fraction at 2 Gy (SF2). Four tumor control probability (TCP) models (one phenomenologic model using a logistic function and three Poisson cell kill models) were investigated using two sets of input parameters, one for low and one for high T-stage tumors. Application of both sets to all patients was also investigated. In addition, several tumor-related prognostic variables were examined (including T-stage and Gleason score). Univariate and multivariate logistic regression analyses were performed. The ability of the logistic regression models (univariate and multivariate) to predict the biopsy result correctly was tested by performing cross-validation analyses and evaluating the results in terms of receiver operating characteristic (ROC) curves. Results: In univariate analysis, prescription dose (Dprescr), Dmax, Dmean, and dose to n% of the PTV with n of 70% or less correlate with outcome (p < 0.05). EUD correlates significantly with outcome for SF2 of 0.4 or more, but not for lower SF2 values. Using either of the two input parameter sets, all TCP models correlate with outcome (p < 0.05). The utility of variables that emphasize the low-dose region, such as EUD with low SF2, is limited because the low dose region may not coincide with the tumor location. Instead, for MSKCC prostate cancer patients with their
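    The EUD concept evaluated in the study can be illustrated with its SF2-based definition: the uniform dose that yields the same expected clonogen survival as the actual inhomogeneous dose distribution. This is a sketch of the standard Niemierko-style formula under an assumed 2-Gy fractionation, not the authors' exact implementation.

```python
import numpy as np

def eud_from_sf2(doses, sf2):
    """SF2-based equivalent uniform dose: the uniform dose giving the
    same mean clonogen survival as the inhomogeneous distribution,
    assuming 2-Gy fractions (illustrative sketch)."""
    doses = np.asarray(doses, dtype=float)
    mean_survival = np.mean(sf2 ** (doses / 2.0))
    return 2.0 * np.log(mean_survival) / np.log(sf2)

uniform = eud_from_sf2([70.0, 70.0, 70.0], sf2=0.5)   # ≈ 70 Gy: a uniform dose maps to itself
cold_spot = eud_from_sf2([60.0, 80.0], sf2=0.5)       # pulled below the 70 Gy arithmetic mean
```

    Because survival is exponential in dose, cold spots dominate the average, which is exactly why low-SF2 EUD tracks the minimum dose region discussed in the abstract.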

  2. Extracting magnetic cluster size and its distributions in advanced perpendicular recording media with shrinking grain size using small angle x-ray scattering

    International Nuclear Information System (INIS)

    We analyze the magnetic cluster size (MCS) and magnetic cluster size distribution (MCSD) in a variety of perpendicular magnetic recording (PMR) media designs using resonant small angle x-ray scattering at the Co L3 absorption edge. The different PMR media flavors considered here vary in grain size between 7.5 and 9.5 nm as well as in lateral inter-granular exchange strength, which is controlled via the segregant amount. While for high inter-granular exchange, the MCS increases rapidly for grain sizes below 8.5 nm, we show that for increased amount of segregant with less exchange the MCS remains relatively small, even for grain sizes of 7.5 and 8 nm. However, the MCSD still increases sharply when shrinking grains from 8 to 7.5 nm. We show evidence that recording performance such as signal-to-noise-ratio on the spin stand correlates well with the product of magnetic cluster size and magnetic cluster size distribution

  3. Assessment of evaluated (n,d) energy-angle elastic scattering distributions using MCNP simulations of critical measurements and simplified calculation benchmarks

    International Nuclear Information System (INIS)

    Different evaluated (n,d) energy-angle elastic scattering distributions produce k-effective differences in MCNP5 simulations of critical experiments involving heavy water (D2O) of sufficient magnitude to suggest a need for new (n,d) scattering measurements and/or distributions derived from modern theoretical nuclear models, especially at neutron energies below a few MeV. The present work focuses on the small reactivity change of the D2O coolant-void-reactivity (CVR) calculation bias for simulations of two pairs of critical experiments performed in the ZED-2 reactor at the Chalk River Laboratories when different nuclear data libraries are used for deuterium. The deuterium data libraries tested include ENDF/B-VII.0, ENDF/B-VI.4, JENDL-3.3 and a new evaluation, labelled Bonn-B, which is based on recent theoretical nuclear-model calculations. Comparison calculations were also performed for a simplified, two-region spherical model having an inner 250-cm-radius homogeneous sphere of UO2, without and with deuterium, and an outer 20-cm-thick deuterium reflector. A notable observation from this work is the reduction of about 0.4 mk in the MCNP5 ZED-2 CVR calculation bias that is obtained when the O-in-UO2 thermal scattering data comes from ENDF/B-VII.0. (author)

  4. Extracting magnetic cluster size and its distributions in advanced perpendicular recording media with shrinking grain size using small angle x-ray scattering

    Energy Technology Data Exchange (ETDEWEB)

    Mehta, Virat; Ikeda, Yoshihiro; Takano, Ken; Terris, Bruce D.; Hellwig, Olav [San Jose Research Center, HGST a Western Digital company, 3403 Yerba Buena Rd., San Jose, California 95135 (United States); Wang, Tianhan [Department of Materials Science and Engineering, Stanford University, Stanford, California 94035 (United States); Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Wu, Benny; Graves, Catherine [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); Department of Applied Physics, Stanford University, Stanford, California 94035 (United States); Dürr, Hermann A.; Scherz, Andreas; Stöhr, Jo [Stanford Institute for Materials and Energy Science (SIMES), SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States)

    2015-05-18

    We analyze the magnetic cluster size (MCS) and magnetic cluster size distribution (MCSD) in a variety of perpendicular magnetic recording (PMR) media designs using resonant small angle x-ray scattering at the Co L3 absorption edge. The different PMR media flavors considered here vary in grain size between 7.5 and 9.5 nm as well as in lateral inter-granular exchange strength, which is controlled via the segregant amount. While for high inter-granular exchange, the MCS increases rapidly for grain sizes below 8.5 nm, we show that for increased amount of segregant with less exchange the MCS remains relatively small, even for grain sizes of 7.5 and 8 nm. However, the MCSD still increases sharply when shrinking grains from 8 to 7.5 nm. We show evidence that recording performance such as signal-to-noise-ratio on the spin stand correlates well with the product of magnetic cluster size and magnetic cluster size distribution.

  5. Solar proton event of April 16, 1970. 3. Evolution of pitch angle distribution as ≲1-MeV protons propagate into the high-latitude magnetotail

    International Nuclear Information System (INIS)

    The solar proton event of April 16, 1970 was monitored by Vela satellites, of orbit r = 18 R_E, in the solar wind and high-latitude magnetotail (lobe). Intensity structure at ≲1 MeV indicates a delay of 85-102 min in access of protons to near the center of the north lobe, corresponding to entry points at 340-370 R_E from the earth. In three sequential periods, of 16, 181, and 124 min duration, the average intensity in the north lobe was lower, higher, and lower, respectively, than that in interplanetary space, by factors which varied from 2 to 5. These reversals were a consequence of reversals in field-aligned anisotropy in interplanetary space, the interplanetary magnetic field remaining southward. Pitch angle distributions were measured in three dimensions in interplanetary space and in the north lobe. In the lobe the distributions were essentially isotropic at r = 18 R_E. Comparison is made with theoretical propagation of solar particles along field lines in an open tail model, under the following conditions along a trajectory: (1) adiabatic motion all the way (Liouville theorem): the 'adiabatic access model'; (2) isotropization at the magnetopause followed by adiabatic motion: the 'nonadiabatic access model'. Neither mode of access explains the observations adequately. A hybrid mode is proposed, in which a minimal amount of scattering occurs as particles enter the tail, followed by amplification (attenuation) of intensity as the pitch distribution is transformed to near 18 R_E in the favored (unfavored) lobe. In this mode a large part of the isotropization at Vela orbit is accomplished by the Liouville transformation, since particles entering the tail beyond ≈100 R_E will see an increase in magnetic field by a factor of 3 as they propagate along the tail. The amount of scatter at the magnetopause is estimated to be Δμ(rms) = 0.3, where μ is the cosine of pitch angle
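    The Liouville transformation invoked above follows from conservation of the first adiabatic invariant, sin²α/B = const along a field line; a small sketch (function name and example values are illustrative):

```python
import math

def mapped_pitch_angle(alpha_deg, b_ratio):
    """Adiabatic (Liouville) mapping of pitch angle along a field line:
    sin^2(alpha)/B is conserved, so a particle with pitch angle alpha
    where the field is B1 appears at asin(sqrt(B2/B1) * sin(alpha))
    where the field is B2 = b_ratio * B1. Returns None when the
    particle mirrors before reaching B2."""
    s = math.sqrt(b_ratio) * math.sin(math.radians(alpha_deg))
    if s > 1.0:
        return None
    return math.degrees(math.asin(s))

folded = mapped_pitch_angle(30.0, 3.0)   # ≈ 60 deg for the factor-of-3 field increase
```

    The folding of the distribution toward 90° as B grows is what lets a modest field increase do much of the observed isotropization without scattering.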

  6. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation, with hypothesis testing, are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  7. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  8. Evolution of the plasma sheet electron pitch angle distribution by whistler-mode chorus waves in non-dipole magnetic fields

    Directory of Open Access Journals (Sweden)

    R. M. Thorne

    2012-04-01

    Full Text Available We present a detailed numerical study of the effects of a non-dipole magnetic field on the Earth's plasma sheet electron distribution and its implication for diffuse auroral precipitation. Use of the modified bounce-averaged Fokker-Planck equation developed in the companion paper by Ni et al. (2012) for 2-D non-dipole magnetic fields suggests that we can adopt a numerical scheme similar to that used for a dipole field, but should evaluate bounce-averaged diffusion coefficients and bounce-period related terms in non-dipole magnetic fields. Focusing on nightside whistler-mode chorus waves at L = 6, and using various Dungey magnetic models, we calculate and compare the bounce-averaged diffusion coefficients in each case. Using the Alternating Direction Implicit (ADI) scheme to numerically solve the 2-D Fokker-Planck diffusion equation, we demonstrate that chorus-driven resonant scattering causes plasma sheet electrons to be scattered into the loss cone much faster in a non-dipole field than in a dipole field. The electrons subject to such scattering extend to lower energies and higher equatorial pitch angles when the southward interplanetary magnetic field (IMF) increases in the Dungey magnetic model. Furthermore, we find that changes in the diffusion coefficients are the dominant factor responsible for variations in the modeled temporal evolution of the plasma sheet electron distribution. Our study demonstrates that the effects of realistic ambient magnetic fields need to be incorporated into both the evaluation of resonant diffusion coefficients and the solution of the Fokker-Planck diffusion equation in order to understand quantitatively the evolution of the plasma sheet electron distribution and the occurrence of diffuse aurora, in particular at L > 5 during geomagnetically disturbed periods when the ambient magnetic field deviates considerably from a magnetic dipole.
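    The diffusion solve described above can be miniaturized to one dimension for illustration. The following explicit sketch (toy grid, toy diffusion coefficient, reflecting boundaries) stands in for the 2-D bounce-averaged ADI solution and is not the authors' code.

```python
import numpy as np

# 1-D pitch-angle diffusion, df/dt = d/da ( D df/da ), solved with a
# conservative explicit scheme as a stand-in for the 2-D ADI solve.
n = 101
a = np.linspace(0.05, np.pi / 2, n)          # equatorial pitch angle (rad)
da = a[1] - a[0]
D = 1e-3 * np.ones(n)                        # toy diffusion coefficient
f = np.exp(-((a - 1.0) ** 2) / 0.05)         # initial distribution
dt = 0.2 * da ** 2 / D.max()                 # within the explicit stability limit

for _ in range(1000):
    flux = 0.5 * (D[:-1] + D[1:]) * np.diff(f) / da   # flux at cell faces
    f[1:-1] += dt * np.diff(flux) / da
    f[0], f[-1] = f[1], f[-2]                # zero-gradient (reflecting) walls
```

    An ADI scheme replaces the explicit step with alternating implicit sweeps along each coordinate, which removes the dt ∝ da² stability restriction that makes explicit stepping expensive in 2-D.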

  9. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    International Nuclear Information System (INIS)

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient’s mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors
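    The KL-divergence reproducibility measure Rn can be sketched directly on histogram-discretized motion PDFs; the eps guard for empty bins is an assumption of this sketch, not the authors' implementation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence between two discretized motion PDFs
    (histograms): D_KL(p || q) = sum p * log(p / q). eps guards
    against empty bins; this is a sketch, not the authors' code."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))

identical = kl_divergence([1, 4, 6, 4, 1], [1, 4, 6, 4, 1])   # 0.0
shifted = kl_divergence([1, 4, 6, 4, 1], [4, 6, 4, 1, 1])     # > 0
```

    A value of zero means the fractional PDF exactly reproduces the first fraction's PDF; larger values mean poorer fraction-to-fraction reproducibility.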

  10. Interpretations of Negative Probabilities

    OpenAIRE

    Burgin, Mark

    2010-01-01

    In this paper, we give a frequency interpretation of negative probability, as well as of extended probability, demonstrating that to a great extent these new types of probabilities behave as conventional probabilities. Extended probability comprises both conventional probability and negative probability. The frequency interpretation of negative probabilities gives supportive evidence to the axiomatic system built in (Burgin, 2009; arXiv:0912.4767) for extended probability, as it is demonstra...

  11. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...

  12. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  13. Accurate Size and Size-Distribution Determination of Polystyrene Latex Nanoparticles in Aqueous Medium Using Dynamic Light Scattering and Asymmetrical Flow Field Flow Fractionation with Multi-Angle Light Scattering

    Directory of Open Access Journals (Sweden)

    Shinichi Kinugasa

    2012-01-01

    Full Text Available Accurate determination of the intensity-average diameter of polystyrene latex (PS-latex) by dynamic light scattering (DLS) was carried out through extrapolation of both the concentration of PS-latex and the observed scattering angle. The intensity-average diameter and size distribution were reliably determined by asymmetric flow field flow fractionation (AFFFF) using multi-angle light scattering (MALS) with consideration of band broadening in the AFFFF separation. The intensity-average diameters determined by DLS and AFFFF-MALS agreed well within the estimated uncertainties, although the size distribution of PS-latex determined by DLS was less reliable in comparison with that determined by AFFFF-MALS.
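    DLS sizing rests on the Stokes-Einstein relation, which converts the measured diffusion coefficient into a hydrodynamic diameter. A minimal sketch, in which the default temperature and viscosity (water at 25 °C) and the example diffusion coefficient are illustrative assumptions:

```python
import math

def hydrodynamic_diameter(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein relation behind DLS sizing: converts a measured
    translational diffusion coefficient D (m^2/s) into a hydrodynamic
    diameter (m). T in kelvin; eta is the solvent viscosity in Pa*s
    (defaults approximate water at 25 C and are assumptions)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (3.0 * math.pi * eta * D)

d = hydrodynamic_diameter(4.9e-12)   # ≈ 1e-7 m, i.e. a ~100 nm particle
```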

  14. Distribution of functional groups in periodic mesoporous organosilica materials studied by small-angle neutron scattering with in situ adsorption of nitrogen

    Directory of Open Access Journals (Sweden)

    Monir Sharifi

    2012-05-01

    Full Text Available Periodic mesoporous materials of the type (R′O3Si-R-Si(OR′3 with benzene as an organic bridge and a crystal-like periodicity within the pore walls were functionalized with SO3H or SO3− groups and investigated by small-angle neutron scattering (SANS with in situ nitrogen adsorption at 77 K. If N2 is adsorbed in the pores the SANS measurements show a complete matching of all of the diffraction signals that are caused by the long-range ordering of the mesopores in the benzene-PMO, due to the fact that the benzene-PMO walls possess a neutron scattering length density (SLD similar to that of nitrogen in the condensed state. However, signals at higher q-values (>1 1/Å are not affected with respect to their SANS intensity, even after complete pore filling, confirming the assumption of a crystal-like periodicity within the PMO material walls due to π–π interactions between the organic bridges. The SLD of pristine benzene-PMO was altered by functionalizing the surface with different amounts of SO3H-groups, using the grafting method. For a low degree of functionalization (0.81 mmol SO3H·g−1 and/or an inhomogeneous distribution of the SO3H-groups, the SLD changes only negligibly, and thus, complete contrast matching is still found. However, for higher amounts of SO3H-groups (1.65 mmol SO3H·g−1 being present in the mesopores, complete matching of the neutron diffraction signals is no longer observed proving that homogeneously distributed SO3H-groups on the inner pore walls of the benzene-PMO alter the SLD in a way that it no longer fits to the SLD of the condensed N2.

  15. Three-dimensional shapes and distribution of FePd nanoparticles observed by electron tomography using high-angle annular dark-field scanning transmission electron microscopy

    Science.gov (United States)

    Sato, Kazuhisa; Aoyagi, Kenta; Konno, Toyohiko J.

    2010-01-01

    We have studied three-dimensional shapes and distribution of FePd nanoparticles, prepared by electron beam deposition and postdeposition annealing, by means of single-axis tilt tomography using atomic number contrasts obtained by high-angle annular dark-field scanning transmission electron microscopy. Particle size, shape, and locations were reconstructed by weighted backprojection (WBP), as well as by simultaneous iterative reconstruction technique (SIRT). We have also estimated the particle size by simple extrapolation of tilt-series original data sets, which proved to be quite powerful. The results of the two algorithms for reconstruction have been compared quantitatively with those obtained by the extrapolation method and those independently reported by electron holography. It was found that the reconstructed intensity map by WBP contains a small amount of dotlike artifacts, which do not exist in the results by SIRT, and that the particle surface obtained by WBP is rougher than that by SIRT. We demonstrate, on the other hand, that WBP yields a better estimation of the particle size in the z direction than SIRT does, most likely due to the presence of a "missing wedge" in the original data set.

  16. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The Author goes back to the Founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The Author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  17. Asymptotic Expansions of the Probability Density Function and the Distribution Function of Chi-Square Distribution

    Institute of Scientific and Technical Information of China (English)

    陈刚; 王梦婕

    2014-01-01

    By standardizing the independent variable of the χ² probability density function with n degrees of freedom (t = (x − n)/√(2n)), the density f(x;n) can be expanded as √(2n)·f(x;n) = [1 + r1(t)/√n + r2(t)/n + r3(t)/(n√n) + r4(t)/n^2]·φ(t) + o(1/n^2), where φ(t) is the density of the standard normal distribution and each ri(t) (1 ≤ i ≤ 4) is a polynomial of degree 3i in t. An approximate formula for the χ² density follows from this expansion. Integral recurrence relations for the power coefficients of φ(t) are then established, yielding an asymptotic expansion of the χ² distribution function. Finally, numerical calculations verify the effectiveness of these results in practical applications.
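
As a numerical illustration of the leading term of such an expansion, the sketch below (plain Python; the test point t = 0.5 is an arbitrary choice) standardizes the χ² density via t = (x − n)/√(2n) and checks that √(2n)·f(x;n) approaches the standard normal density φ(t) as n grows. The correction polynomials ri(t) from the paper are not reproduced here.

```python
import math

def chi2_pdf(x, n):
    """Chi-square probability density with n degrees of freedom."""
    return x**(n/2 - 1) * math.exp(-x/2) / (2**(n/2) * math.gamma(n/2))

def phi(t):
    """Standard normal density."""
    return math.exp(-t*t/2) / math.sqrt(2*math.pi)

def leading_term_error(n, t=0.5):
    """|sqrt(2n)*f(x;n) - phi(t)| at the point x whose standardized value is t."""
    x = n + t * math.sqrt(2*n)
    return abs(math.sqrt(2*n) * chi2_pdf(x, n) - phi(t))

# the error of the leading (normal) term shrinks as the degrees of freedom grow
print(leading_term_error(20), leading_term_error(200))
```

The residual error is of order 1/√n, which is precisely what the first correction term r1(t)/√n accounts for.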

  18. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.; Hole, Arna Risa; Rutström, E. Elisabeth

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  19. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  20. The use of small-angle scattering and the maximum-entropy method for shape-model determination from distance-distribution functions

    International Nuclear Information System (INIS)

    The maximum-entropy method is well established for the analysis of scattering data [Bricogne (1993)]. For this method, prior structure knowledge can be included in the structure determination. This prior estimate is an essential element for a successful application of the maximum-entropy method. The most likely prior estimate can be found by maximization of the entropy. With the assumption a priori of a special type of structure model, the unknown parameters can be calculated from real-space functions. For practical use, analytical expressions for the Fourier transform of model scattering curves, the distance-distribution function of the models, are of interest. Formulas are presented for rotational ellipsoids, Gaussian chains and two-phase spheres, and a parameter estimation by the program MAXENT is demonstrated for the ellipsoidal shape of cytochrome c using theoretical X-ray scattering curves calculated from atomic coordinates. The calculated dimensions of prolate and oblate ellipsoids agree within the error limits with the direct structure-related inertia-equivalent ellipsoid of the molecule. Furthermore, error limits have been determined from the a posteriori probability or 'evidence' function for the model parameters. To avoid over-interpretation of the scattering data, the real number of degrees of freedom is calculated for noisy data. This measure of information content is almost independent of the collimation distortion but strongly influenced by the statistical noise in the scattering data. The numerical value is smaller than the ideal number of degrees of freedom provided by the information theory. (orig.)

  1. The use of small-angle scattering and the maximum-entropy method for shape-model determination from distance-distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, J.J. [Max-Delbrueck-Centrum fuer Molekulare Medizin Berlin-Buch SdoeR, Berlin (Germany); Hansen, S. [Royal Veterinary and Agricultural Univ., Copenhagen (Denmark). Dept. of Mathematics and Physics; Puerschel, H.-V. [Max-Delbrueck-Centrum fuer Molekulare Medizin Berlin-Buch SdoeR, Berlin (Germany)

    1996-10-01

    The maximum-entropy method is well established for the analysis of scattering data [Bricogne (1993)]. For this method, prior structure knowledge can be included in the structure determination. This prior estimate is an essential element for a successful application of the maximum-entropy method. The most likely prior estimate can be found by maximization of the entropy. With the assumption a priori of a special type of structure model, the unknown parameters can be calculated from real-space functions. For practical use, analytical expressions for the Fourier transform of model scattering curves, the distance-distribution function of the models, are of interest. Formulas are presented for rotational ellipsoids, Gaussian chains and two-phase spheres, and a parameter estimation by the program MAXENT is demonstrated for the ellipsoidal shape of cytochrome c using theoretical X-ray scattering curves calculated from atomic coordinates. The calculated dimensions of prolate and oblate ellipsoids agree within the error limits with the direct structure-related inertia-equivalent ellipsoid of the molecule. Furthermore, error limits have been determined from the a posteriori probability or `evidence` function for the model parameters. To avoid over-interpretation of the scattering data, the real number of degrees of freedom is calculated for noisy data. This measure of information content is almost independent of the collimation distortion but strongly influenced by the statistical noise in the scattering data. The numerical value is smaller than the ideal number of degrees of freedom provided by the information theory. (orig.).

  2. Limit Theorems in Free Probability Theory I

    OpenAIRE

    Chistyakov, G. P.; Götze, F.

    2006-01-01

    Based on a new analytical approach to the definition of additive free convolution on probability measures on the real line we prove free analogs of limit theorems for sums for non-identically distributed random variables in classical Probability Theory.

  3. Multimodal particle size distribution or fractal surface of acrylic acid copolymer nanoparticles: A small-angle X-ray scattering study using direct Fourier and indirect maximum entropy methods

    OpenAIRE

    Mueller, J.J.; Hansen, S; Lukowski, G.; Gast, K.

    1995-01-01

    Acrylic acid copolymers are potential carriers for drug delivery. The surface, surface rugosity and the absolute dimension of the particles are parameters that determine the binding of drugs or detergents, diffusion phenomena at the surface and the distribution of the carrier within the human body. The particle-size distribution and surface rugosity of the particles have been investigated by small-angle X-ray scattering and dynamic light scattering. Direct Fourier transform as well as a new s...

  4. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  5. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  6. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    2014-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  7. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  8. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as the Zadeh distribution, the Mamdani distribution, the Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, from some properties of the probability distributions of fuzzy systems, it is also demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. Besides, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimization of reasoning, it promises a broad scope of application.
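
The COG (center-of-gravity) defuzzification discussed above is simply the mean of the membership function viewed as an unnormalized probability density, which is why it is optimal in the mean-square sense. A minimal sketch with a hypothetical triangular membership function:

```python
import numpy as np

def cog_defuzzify(x, mu):
    """Center-of-gravity defuzzification: the mean of mu(x) treated as an
    (unnormalized) probability density over the discretized universe x."""
    return float(np.sum(x * mu) / np.sum(mu))

# triangular membership function centred at 5 on the universe [0, 10]
x = np.linspace(0.0, 10.0, 101)
mu = np.maximum(0.0, 1.0 - np.abs(x - 5.0) / 3.0)
print(cog_defuzzify(x, mu))  # symmetric set, so the crisp output is its centre
```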

  9. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the claim-occurrence process is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of some techniques from martingale theory.
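
For the special case of exponential claim sizes mentioned above, the classical Cramér-Lundberg model (fixed premium rate, no p-thinning, so this is the textbook model rather than the paper's) admits the closed form ψ(u) = (λμ/c)·exp(−(1/μ − λ/c)·u). The sketch below, with purely illustrative parameters, compares it against a crude Monte Carlo estimate over a long but finite horizon.

```python
import math, random

def ruin_prob_exact(u, lam, mu, c):
    """Ruin probability for the classical risk model with Poisson(lam) claim
    arrivals, exponential claims of mean mu, premium rate c (c > lam*mu)."""
    rho = lam * mu / c
    return rho * math.exp(-(1.0/mu - lam/c) * u)

def ruin_prob_mc(u, lam, mu, c, horizon=200.0, n=5000, seed=1):
    """Monte Carlo estimate; checking the surplus only at claim instants
    suffices, since the surplus increases between claims."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)          # next claim arrival time
            if t > horizon:
                break
            claims += rng.expovariate(1.0/mu)  # exponential claim size, mean mu
            if u + c*t - claims < 0.0:         # surplus below zero -> ruin
                ruined += 1
                break
    return ruined / n

u, lam, mu, c = 5.0, 1.0, 1.0, 1.2
print(ruin_prob_exact(u, lam, mu, c), ruin_prob_mc(u, lam, mu, c))
```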

  10. On Probability Leakage

    OpenAIRE

    Briggs, William M

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  11. The Probability Distribution of the Maximum Amount of Daily Precipitation During 20 Days in Summer of the Huaihe Basins

    Institute of Scientific and Technical Information of China (English)

    梁莉; 赵琳娜; 巩远发; 包红军; 王成鑫; 王志

    2011-01-01

    The daily precipitation records of 158 meteorological rain gauges over the Huaihe Basins make it possible to analyze the probability distribution, using the gamma distribution, of precipitation during the summers of 1980-2007 by distinguishing rainy days following a dry or a wet preceding day. Five rain gauge stations, namely Xixian, Fuyang, Shangqiu, Huaian and Lianyungang, are investigated as representative of five catchments: the upper stream of the Huaihe River, the reach between Wangjiaba Dam and Bengbu Station, the reach between Bengbu Station and Hongze Lake, the downstream area below Hongze Lake, and the Yishu River watershed. The Kolmogorov-Smirnov (K-S) test and a comparison between the gamma probability density function and the sample frequencies at the five representative stations show that the gamma distribution fits the conditional probability distribution of summer rainy days well. The probability distributions of the maximum daily precipitation within 1, 10 and 20 days, derived recursively from the fitted distribution, are regular and reasonable. Among the five catchments, the upper Huaihe, the middle-lower Huaihe and the Yishu River watershed are the most likely to see maximum daily precipitation of no less than 10 mm, 25 mm and 50 mm within 10 and 20 days.

  12. A random matrix/transition state theory for the probability distribution of state-specific unimolecular decay rates: Generalization to include total angular momentum conservation and other dynamical symmetries

    International Nuclear Information System (INIS)

    A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D2CO→D2+CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states

  13. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  14. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  15. Local Percolation Probabilities for a Natural Sandstone

    OpenAIRE

    Hilfer, R.; Rag, T.; Virgi, B.

    1996-01-01

    Local percolation probabilities are used to characterize the connectivity in porous and heterogeneous media. Together with local porosity distributions they allow the prediction of transport properties \cite{hil91d}. While local porosity distributions are readily obtained, measurements of the local percolation probabilities are more difficult and have not been attempted previously. First measurements of three dimensional local porosity distributions and percolation probabilities from a pore space re...

  16. Measurement of the weak mixing angle and the spin of the gluon from angular distributions in the reaction pp → Z/γ* + X → μ+μ- + X with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schmieden, Kristof

    2013-04-15

    The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp → Z/γ* + X → μ+μ- + X. In total 4.7 fb-1 of proton-proton collisions at √(s) = 7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system and are compared to several simulations as well as recent results obtained in p anti p collisions. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.

  17. Measurement of the angular distributions in the reaction pp → Z/γ* + X → μ+μ- + X and extraction of the weak mixing angle and the spin of the gluon

    International Nuclear Information System (INIS)

    The measurement of the effective weak mixing angle with the ATLAS experiment at the LHC is presented. It is extracted from the forward-backward asymmetry in the polar angle distribution of the muons originating from Z boson decays in the reaction pp → Z/γ* + X → μ+μ- + X. In total 4.7 fb-1 of proton-proton collisions at √(s) = 7 TeV are analysed. In addition, the full polar and azimuthal angular distributions are measured as a function of the transverse momentum of the Z/γ* system. The comparisons to several simulations as well as recent results obtained in p anti p collisions are presented. Finally, the angular distributions are used to confirm the spin of the gluon using the Lam-Tung relation.
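
The forward-backward asymmetry underlying such a measurement is, at its core, a counting asymmetry in cosθ. The toy sketch below (not the ATLAS analysis) draws events from a hypothetical polar-angle distribution dN/dcosθ ∝ 1 + cos²θ + (8/3)·A_FB·cosθ by rejection sampling and recovers A_FB from the forward and backward event counts; with this parametrization the counting asymmetry (N_F − N_B)/(N_F + N_B) equals A_FB exactly.

```python
import numpy as np

rng = np.random.default_rng(7)
A_FB_true = 0.05  # illustrative asymmetry, not a measured value

def sample_cos_theta(n, afb, rng):
    """Rejection-sample cos(theta) from f(c) = 1 + c^2 + (8/3)*afb*c on [-1, 1]."""
    out = []
    fmax = 2.0 + (8.0/3.0) * abs(afb)   # upper bound of f on [-1, 1]
    while len(out) < n:
        cand = rng.uniform(-1.0, 1.0, size=n)
        u = rng.uniform(0.0, fmax, size=n)
        keep = u < 1.0 + cand**2 + (8.0/3.0) * afb * cand
        out.extend(cand[keep])
    return np.array(out[:n])

c = sample_cos_theta(200_000, A_FB_true, rng)
n_f, n_b = int((c > 0).sum()), int((c < 0).sum())
afb_est = (n_f - n_b) / (n_f + n_b)   # counting estimate of A_FB
print(afb_est)
```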

  18. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality, the probability of the empty box is always the highest. This is in contradistinction to a sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
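
The postulate that all configurations of indistinguishable balls are equally probable can be checked by simulation. The sketch below (with illustrative parameters) samples configurations uniformly via the stars-and-bars construction and confirms that occupancy 0 occurs more often than the average occupancy P/L in the dense regime.

```python
import random
from collections import Counter

def random_configuration(P, L, rng):
    """Uniform sample over all ways to place P indistinguishable balls in L
    distinguishable boxes (stars and bars: choose L-1 divider positions
    among P+L-1 slots; the gaps between dividers are the occupancies)."""
    dividers = sorted(rng.sample(range(P + L - 1), L - 1))
    edges = [-1] + dividers + [P + L - 1]
    return [edges[i + 1] - edges[i] - 1 for i in range(L)]

rng = random.Random(0)
P, L = 100, 10          # dense system: P >> L
counts = Counter()
for _ in range(5000):
    counts.update(random_configuration(P, L, rng))

# empty boxes (occupancy 0) beat the average occupancy P/L = 10
print(counts[0], counts[P // L])
```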

  19. Comparison of probability distribution models and estimation of the probable rainfall for the Barbacena County, MG

    Directory of Open Access Journals (Sweden)

    Bruno Teixeira Ribeiro

    2007-10-01

    Full Text Available Probabilistic studies involving climatic variables are of extreme importance for farming activities, construction, tourism and transport, among others. Seeking to contribute to the planning of irrigated agriculture, this work compares probability distribution models fitted to decennial and monthly historical series and estimates the probable rainfall for the Barbacena County, Minas Gerais State, Brazil. Rainfall data for December, January and February from 1942 to 2003 were studied, constituting historical series with 62 years of observations. Daily rainfall depths were totalled over 10- and 30-day periods, and the two-parameter log-Normal, three-parameter log-Normal and Gamma distributions were applied. The adequacy of the distributions over the studied periods was evaluated with the chi-square (χ2) test at the 5% significance level. The probable rainfall was estimated for each period using the distribution with the smallest χ2 value, at exceedance probability levels of 75, 90 and 98%. The Gamma distribution provided the best fit to the data. The study of probable rainfall is a good tool to support decision making in irrigation planning and use.
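
The fit-and-quantile procedure described above can be sketched as follows. This is a minimal illustration: synthetic rainfall totals stand in for the historical series, a method-of-moments gamma fit replaces whatever estimator the authors used, and the quantiles are obtained by Monte Carlo inversion of the fitted distribution.

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical decennial rainfall totals (mm); stands in for a 62-year series
rain = rng.gamma(2.0, 30.0, size=62)

# method-of-moments fit of the gamma distribution
mean, var = rain.mean(), rain.var()
shape = mean**2 / var
scale = var / mean

# "probable rainfall" at exceedance level p is the depth exceeded with
# probability p, i.e. the (1 - p) quantile of the fitted distribution
sample = rng.gamma(shape, scale, size=200_000)   # Monte Carlo inversion
for p in (0.75, 0.90, 0.98):
    print(f"{p:.0%} exceedance: {np.quantile(sample, 1 - p):.1f} mm")
```

Higher exceedance levels give lower depths, which is why the 98% value is the most conservative design figure.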

  20. Ruin probabilities in tough times - Part 1 - Heavy-traffic approximation for fractionally integrated random walks in the domain of attraction of a nonGaussian stable distribution

    CERN Document Server

    Barbe, Ph

    2011-01-01

    Motivated by applications to insurance mathematics, we prove some heavy-traffic limit theorems for processes which encompass the fractionally integrated random walk as well as some FARIMA processes, when the innovations are in the domain of attraction of a nonGaussian stable distribution.

  1. Ruin probabilities in tough times - Part 2 - Heavy-traffic approximation for fractionally differentiated random walks in the domain of attraction of a nonGaussian stable distribution

    CERN Document Server

    Barbe, Ph

    2011-01-01

    Motivated by applications to insurance mathematics, we prove some heavy-traffic limit theorems for processes which encompass the fractionally differentiated random walk as well as some FARIMA processes, when the innovations are in the domain of attraction of a nonGaussian stable distribution.

  2. Convergence of the probability of large deviations in a model of correlated random variables having compact-support Q-Gaussians as limiting distributions

    Science.gov (United States)

    Jauregui, Max; Tsallis, Constantino

    2015-02-01

    We consider correlated random variables X1, …, Xn taking values in {0, 1} such that, for any permutation π of {1, …, n}, the random vectors (X1, …, Xn) and (Xπ(1), …, Xπ(n)) have the same distribution. This distribution, which was introduced by Rodríguez et al. [J. Stat. Mech. 2008, P09006] and then generalized by Hanel et al. [Eur. Phys. J. B 72, 263 (2009)], is scale-invariant and depends on a real parameter ν > 0 (ν → ∞ implies independence). Putting Sn = X1 + ⋯ + Xn, the distribution of Sn - n/2 approaches a Q-Gaussian distribution with compact support (Q = 1 - 1/(ν - 1) < 1). For ν > 0, we show that ℙ(Sn = 0) decays to zero like a power law of the form 1/nν with a subdominant term of the form 1/nν+1. If ν > 0 is an integer, we show that we can analytically find upper and lower bounds for the difference between ℙ(Sn/n ≤ x) and its (n → ∞) limit. We also show that these bounds vanish like a power law of the form 1/n with a subdominant term of the form 1/n2.

  3. Effects of temperature distribution on failure probability of coated particles in spherical fuel elements

    Institute of Scientific and Technical Information of China (English)

    张永栋; 林俊; 朱天宝; 张海青; 朱智勇

    2016-01-01

    Background: TRISO (Tristructural isotropic) coated particles embedded in spherical fuel elements are used in the solid-fuel molten salt reactor. The temperature distribution during operation can affect the failure probability of TRISO particles embedded in different parts of the fuel elements. Purpose: This study aims to investigate the effects of the temperature distribution on the failure probability of coated fuel particles. Methods: A micro-volume element analysis of the temperature distribution effect on the failure probability of coated particles was carried out for the first time, and the impact of spherical fuel element size on the average failure probability of TRISO particles was also evaluated. Results: At a given power density, the calculated failure probability of TRISO particles deviated by an order of magnitude depending on whether the core temperature or the average temperature of the fuel element was used to compute the average failure probability. At the same power density and burnup, the average failure probability of coated particles could be lowered by two orders of magnitude by reducing the diameter of the fuel element by 1 cm. Conclusion: It is necessary to take the temperature distribution into account when calculating the failure probability of coated fuel particles. In addition, the average failure probability of coated fuel particles can be lowered by reducing the size of the fuel element, which may be a proper way to secure fuel elements working at high power densities.

  4. Effect of elbow flexion angles on stress distribution of the proximal ulnar and radius bones under a vertical load: measurement using resistance strain gauges

    OpenAIRE

    Rao, Zhi-Tao; Yuan, Feng; Li, Bing; Ma, Ning

    2014-01-01

    Objectives This study aimed to explore the surface stress at the proximal ends of the ulna and radius at different elbow flexion angles using the resistance strain method. Methods Eight fresh adult cadaveric elbows were tested. The forearms were fixed in a neutral position. Axial load increment experiments were conducted at four different elbow flexion angles (0°, 15°, 30°, and 45°). Surface strain was measured at six sites (tip, middle, and base of the coronoid process; back ulnar notch; olec...

  5. Exploring non-signalling polytopes with negative probability

    OpenAIRE

    Oas, G.; de Barros, J. Acacio; Carvalhaes, C.

    2014-01-01

    Bipartite and tripartite EPR-Bell type systems are examined via joint quasi-probability distributions where elementary probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alterna...

  6. Non-Archimedean Probability

    OpenAIRE

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization o...

  7. Probability and paternity testing.

    OpenAIRE

    Elston, R C

    1986-01-01

    A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of p...

  8. Logical Probability Preferences

    OpenAIRE

    Saad, Emad

    2013-01-01

    We present a unified logical framework for representing and reasoning about both quantitative and qualitative probability preferences in probability answer set programming, called probability answer set optimization programs. The proposed framework makes it possible to define quantitative probability preferences over the possible outcomes of qualitative preferences. We show the application of probability answer set optimization programs to a variant of the well-known nurse rostering problem, c...

  9. Monte Carlo method of macroscopic modulation of small-angle charged particle reflection from solid surfaces

    CERN Document Server

    Bratchenko, M I

    2001-01-01

    A novel method of Monte Carlo simulation of small-angle reflection of charged particles from solid surfaces has been developed. Instead of atomic-scale simulation of particle-surface collisions, the method treats the reflection macroscopically as a 'condensed history' event. Statistical parameters of the reflection are sampled from theoretical distributions over energy and angle. An efficient sampling algorithm based on a combination of the inverse probability distribution function method and the rejection method has been proposed and tested. As an example of application, the results of statistical modeling of particle flux enhancement near the bottom of a vertical Wehner cone are presented and compared with a simple geometrical model of specular reflection.
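    The two-stage sampling scheme described above (closed-form inversion where available, rejection otherwise) can be sketched as follows. The cosine-type angular pdf and its closed-form inverse CDF are illustrative assumptions, not the theoretical reflection distributions used in the paper.

    ```python
    import math
    import random

    def sample_inverse_cdf(u=None):
        """Sample a polar angle theta from an illustrative cosine-type
        angular pdf, p(theta) = sin(2*theta) on [0, pi/2], whose CDF
        sin(theta)**2 inverts in closed form: theta = asin(sqrt(u))."""
        if u is None:
            u = random.random()
        return math.asin(math.sqrt(u))

    def sample_rejection(pdf, pdf_max, lo, hi):
        """Generic rejection sampler for a bounded pdf on [lo, hi];
        used when the CDF cannot be inverted analytically."""
        while True:
            x = random.uniform(lo, hi)
            if random.random() * pdf_max <= pdf(x):
                return x
    ```

    In a combined scheme, the inverse-CDF branch handles distributions with analytic inverses, and the rejection branch covers the rest at the cost of some wasted draws.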

  10. Estimating Long GRB Jet Opening Angles and Rest-Frame Energetics

    Science.gov (United States)

    Goldstein, Adam; Connaughton, Valerie; Briggs, Michael Stephen; Burns, Eric

    2016-04-01

    We present a method to estimate the jet opening angles of long duration Gamma-Ray Bursts (GRBs) using the prompt gamma-ray energetics and a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma rays. The derived jet opening angles using this method match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.
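    The inversion step can be sketched as follows, under a Ghirlanda-type power-law correlation between the peak energy and the collimation-corrected energy. The normalization and index below are hypothetical placeholders, not the fitted coefficients from the paper.

    ```python
    import math

    def jet_opening_angle(e_peak, e_iso, norm=100.0, index=0.7):
        """Invert an assumed correlation E_peak = norm * (E_gamma / 1e51 erg)**index
        (norm in keV; both 'norm' and 'index' are illustrative, not fitted values)
        to get the collimation-corrected energy E_gamma, then recover the jet
        half-opening angle in degrees from E_gamma = (1 - cos(theta)) * E_iso."""
        e_gamma = 1e51 * (e_peak / norm) ** (1.0 / index)  # erg
        beaming = e_gamma / e_iso                          # = 1 - cos(theta)
        if not 0.0 < beaming < 1.0:
            raise ValueError("inconsistent energetics: beaming fraction outside (0, 1)")
        return math.degrees(math.acos(1.0 - beaming))
    ```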

  11. VizieR Online Data Catalog: Jet angles and gamma-ray energetics estimations (Goldstein+, 2016)

    Science.gov (United States)

    Goldstein, A.; Connaughton, V.; Briggs, M. S.; Burns, E.

    2016-04-01

    We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies. (1 data file).

  12. Agreeing Probability Measures for Comparative Probability Structures

    OpenAIRE

    Wakker, Peter

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a $\sigma$-algebra) have agreeing probability measures. Although this has often been claimed in the literature, none of the proofs the author encountered is valid for the general case; they hold only for $\sigma$-algebras. Here the proof of Niiniluoto (1972) is supplemented. Furthermore, an example is presented that reveals many misunderstandings in the literature. At the...

  13. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration include the natural city background and internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
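    The Gaussian exceedance step can be illustrated as follows. This is a one-dimensional sketch assuming a zero-mean Gaussian displacement, not the paper's full search over damping and natural frequency.

    ```python
    import math

    def exceedance_probability(sigma, criterion):
        """Probability that the absolute displacement of the isolated mass
        exceeds the vibration criterion, assuming a zero-mean Gaussian
        displacement with standard deviation 'sigma' (same units as
        'criterion'). Uses P(|X| > c) = erfc(c / (sigma * sqrt(2)))."""
        return math.erfc(criterion / (sigma * math.sqrt(2.0)))
    ```

    For example, with a criterion at 1.96 standard deviations the exceedance probability is about 0.05, so meeting a 0.04 target requires a slightly larger margin between the criterion and the rms displacement.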

  14. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  15. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
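    The classic quadratic scoring rule alluded to above (the Brier score) can be computed as follows. This is a minimal illustration of scoring a forecast record, not the martingale-based evaluation developed in the paper.

    ```python
    def brier_score(forecasts, outcomes):
        """Mean squared deviation between forecast probabilities and the
        observed 0/1 outcomes: lower is better, 0 is a perfect record."""
        if len(forecasts) != len(outcomes):
            raise ValueError("forecasts and outcomes must have equal length")
        return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)
    ```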

  16. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  17. Probability Distribution of Precipitation Extremes over the Yangtze River Basin, 1960-2005

    Institute of Scientific and Technical Information of China (English)

    苏布达; Marco Gemmer; 姜彤

    2008-01-01

    Based on the daily observational precipitation data of 147 stations in the Yangtze River basin for 1960-2005, and the projected daily data of 79 grids from ECHAM5/MPI-OM for the 20th century, time series of precipitation extremes comprising the annual maximum (AM) and the Munger index (MI) were constructed. The distribution of precipitation extremes was analyzed based on the two index series. The results show that (1) the intensity and probability of extreme heavy precipitation are higher in the middle Mintuo River sub-catchment, the Dongting Lake area, the mid-lower main stream section of the Yangtze River, and the southeastern Poyang Lake sub-catchment, whereas the intensity and probability of drought events are higher in the mid-lower Jinsha River sub-catchment and the Jialing River sub-catchment; (2) compared with observational data, the averaged value of AM is higher but the deviation coefficient is lower in the projected data, and the center of precipitation extremes moves northwards; (3) despite certain differences in the spatial distributions of observed and projected precipitation extremes, applying the General Extreme Value (GEV) and Wakeby (WAK) models with the L-Moment Estimator (LME) method to the precipitation extremes shows that WAK can simulate the probability distribution of precipitation extremes calculated from both observed and projected data quite well. The WAK model could thus be an important tool for estimating extreme precipitation events in the Yangtze River basin under future climate scenarios.
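    The L-moment step can be sketched for the GEV case. The code below computes sample L-moments from an annual-maximum series and applies Hosking's standard approximation for the GEV shape parameter; it is a generic illustration of the LME approach, not the paper's WAK fit.

    ```python
    import math

    def gev_lmoments(data):
        """Estimate GEV parameters (location xi, scale alpha, shape k)
        from sample L-moments, using Hosking's rational approximation
        for the shape parameter. Requires at least 3 observations."""
        x = sorted(data)
        n = len(x)
        # Probability-weighted moments b0, b1, b2 (0-indexed ranks).
        b0 = sum(x) / n
        b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
        b2 = sum((i * (i - 1) / ((n - 1) * (n - 2))) * x[i] for i in range(n)) / n
        l1, l2 = b0, 2 * b1 - b0
        t3 = (6 * b2 - 6 * b1 + b0) / l2          # L-skewness
        c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
        k = 7.8590 * c + 2.9554 * c * c           # Hosking's shape approximation
        g = math.gamma(1 + k)
        alpha = l2 * k / ((1 - 2 ** (-k)) * g)
        xi = l1 - alpha * (1 - g) / k
        return xi, alpha, k
    ```

    L-moment estimators are popular in hydrology because they are less sensitive to outliers than ordinary moments and remain reliable for short records.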

  18. Probability Representation of Quantum Mechanics: Comments and Bibliography

    OpenAIRE

    Man'ko, V. I.; Pilyavets, O. V.; Zborovskii, V. G.

    2006-01-01

    The probability representation of states in standard quantum mechanics, where quantum states are associated with fair probability distributions (instead of a wave function or density matrix), is briefly reviewed, and a bibliography related to the probability representation is given.

  19. A lattice determination of gA and ⟨x⟩ from overlap fermions

    International Nuclear Information System (INIS)

    We present results for the nucleon's axial charge gA and the first moment ⟨x⟩ of the unpolarized parton distribution function from a simulation of quenched overlap fermions. (orig.)

  20. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an