WorldWideScience

Sample records for cumulative probability distribution

  1. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

CUMBIN, a cumulative-binomial computer program and one of a set of three, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) can be used independently of one another. The program analyzes reliabilities and availabilities of k-out-of-n systems and serves statisticians, users of statistical procedures, test planners, designers, and numerical analysts in calculations of reliability and availability. Program written in C.
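
The record does not include CUMBIN's source, but the underlying computation is straightforward. Below is a minimal Python sketch (the function names are my own, not CUMBIN's) of a cumulative binomial probability and its use for k-out-of-n system reliability, the application named in the abstract.

```python
from math import comb

def cumulative_binomial(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p), summed directly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def k_out_of_n_reliability(n: int, k: int, component_reliability: float) -> float:
    """A k-out-of-n system works iff at least k of its n components work."""
    return cumulative_binomial(n, k, component_reliability)

# Example: a 2-out-of-3 system built from components with reliability 0.95.
print(k_out_of_n_reliability(3, 2, 0.95))  # 0.99275
```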

  2. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    Energy Technology Data Exchange (ETDEWEB)

    O'Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-10-27

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  3. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses three conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  4. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

Keywords: binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard.

  5. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M.S.

    1991-11-01

The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system, based on the possible occurrence of three events and the presence of one feature: the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The WIPP system's overall probability distribution is calculated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
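
To illustrate the scenario-combination idea, the overall exceedance probability is the probability-weighted mixture of the conditional, per-scenario exceedance probabilities estimated by Monte Carlo. This is a toy sketch, not the report's WIPP calculation; the scenario names, probabilities, and release distributions below are invented.

```python
import numpy as np

# Toy illustration of combining scenarios: hypothetical probabilities and
# per-scenario Monte Carlo samples of cumulative release (arbitrary units).
rng = np.random.default_rng(0)
scenarios = {
    "base case":          (0.90, rng.lognormal(-3.0, 1.0, size=10_000)),
    "borehole intrusion": (0.08, rng.lognormal(-1.0, 1.2, size=10_000)),
    "withdrawal wells":   (0.02, rng.lognormal(0.0, 1.5, size=10_000)),
}

def overall_exceedance(level: float) -> float:
    """P(release > level) as the probability-weighted mixture of the
    conditional (per-scenario) exceedance probabilities."""
    return sum(p * np.mean(samples > level) for p, samples in scenarios.values())

for level in (0.01, 0.1, 1.0):
    print(f"P(release > {level}) = {overall_exceedance(level):.4f}")
```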

  6. Probability distributions for magnetotellurics

    Energy Technology Data Exchange (ETDEWEB)

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
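
A quick Monte Carlo check of the abstract's central claim is easy to set up. This is an illustrative sketch, not the paper's analytic derivation; the 50% error level and the equal real/imaginary variance split are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rel_err = 200_000, 0.5   # 50% relative error in numerator and denominator

def complex_normal(mean, rel_err, size):
    """Complex normal with the given relative error, variance split
    equally between real and imaginary parts (an assumption)."""
    s = rel_err * abs(mean) / np.sqrt(2.0)
    return mean + s * (rng.standard_normal(size) + 1j * rng.standard_normal(size))

ratio = complex_normal(1.0, rel_err, n) / complex_normal(1.0, rel_err, n)
log_sq = np.log(np.abs(ratio) ** 2)

mu, sd = log_sq.mean(), log_sq.std()
skew = np.mean(((log_sq - mu) / sd) ** 3)
print(f"mean={mu:.3f}  sd={sd:.3f}  skewness={skew:.3f}")  # small skew: near-normal
```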

  7. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  8. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first concerns the continuous distributions and their relations. The second presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.

  9. INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS

    KAUST Repository

    Potter, Kristin

    2012-01-01

    The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.

  10. Recursive Numerical Evaluation of the Cumulative Bivariate Normal Distribution

    CERN Document Server

    Meyer, Christian

    2010-01-01

    We propose an algorithm for evaluation of the cumulative bivariate normal distribution, building upon Marsaglia's ideas for evaluation of the cumulative univariate normal distribution. The algorithm is mathematically transparent, delivers competitive performance and can easily be extended to arbitrary precision.
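
The algorithm itself is not reproduced in this record. As a point of comparison, a compact way to evaluate the cumulative bivariate normal is Plackett's identity, which states that the derivative of Phi2(h, k; rho) with respect to rho equals the bivariate normal density at (h, k); integrating it in rho gives the sketch below (a simple alternative, not the Marsaglia-style recursion of the paper).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def bvn_cdf(h: float, k: float, rho: float) -> float:
    """Phi2(h, k; rho) via Plackett's identity:
    d(Phi2)/d(rho) = bivariate normal density at (h, k; rho)."""
    def density(r):
        return np.exp(-(h*h - 2*r*h*k + k*k) / (2*(1 - r*r))) \
               / (2*np.pi*np.sqrt(1 - r*r))
    correction, _ = quad(density, 0.0, rho)
    return norm.cdf(h) * norm.cdf(k) + correction

# Exact reference value: Phi2(0, 0; rho) = 1/4 + arcsin(rho)/(2*pi).
print(bvn_cdf(0.0, 0.0, 0.5))                # ~0.333333
print(0.25 + np.arcsin(0.5) / (2 * np.pi))   # 0.333333...
```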

  11. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a "sidescene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, depending on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  12. Semi-stable distributions in free probability theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Semi-stable distributions, in classical probability theory, are characterized as limiting distributions of subsequences of normalized partial sums of independent and identically distributed random variables. We establish the noncommutative counterpart of semi-stable distributions. We study the characterization of noncommutative semi-stability through free cumulant transform and develop the free semi-stability and domain of semi-stable attraction in free probability theory.

  13. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
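
For alphabet and word sizes small enough to enumerate, the average number of guesses under the stated strategy can be computed exactly; the toy letter probabilities below are invented for illustration.

```python
from itertools import product

# Toy first-order language model: invented letter probabilities, short words.
letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}
word_len = 4

word_probs = []
for letters in product(letter_probs, repeat=word_len):
    p = 1.0
    for ch in letters:
        p *= letter_probs[ch]
    word_probs.append(p)

word_probs.sort(reverse=True)   # guess the most probable words first
avg_guesses = sum(rank * p for rank, p in enumerate(word_probs, start=1))
print(f"average guesses: {avg_guesses:.3f} out of {len(word_probs)} words")
```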

  14. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  15. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
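
As a concrete instance of MoLC (a sketch for the gamma family only, not the paper's general treatment): for Gamma(a, theta), the first log-cumulant is digamma(a) + ln(theta) and the second is trigamma(a), so matching them to sample log-moments reduces to a one-dimensional root find.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import digamma, polygamma

def molc_gamma(x: np.ndarray):
    """MoLC estimates (shape, scale) for Gamma(a, theta):
    E[ln X] = digamma(a) + ln(theta),  Var[ln X] = trigamma(a)."""
    log_x = np.log(x)
    k1, k2 = log_x.mean(), log_x.var()
    # trigamma is strictly decreasing on (0, inf), so the root is unique.
    a = brentq(lambda s: polygamma(1, s) - k2, 1e-6, 1e6)
    theta = np.exp(k1 - digamma(a))
    return a, theta

rng = np.random.default_rng(2)
sample = rng.gamma(shape=3.0, scale=2.0, size=50_000)
print(molc_gamma(sample))   # close to (3.0, 2.0)
```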

  16. On Interpreting and Extracting Information from the Cumulative Distribution Function Curve: A New Perspective with Applications

    Science.gov (United States)

    Balasooriya, Uditha; Li, Jackie; Low, Chan Kee

    2012-01-01

    For any density function (or probability function), there always corresponds a "cumulative distribution function" (cdf). It is a well-known mathematical fact that the cdf is more general than the density function, in the sense that for a given distribution the former may exist without the existence of the latter. Nevertheless, while the density…

  17. The quantum normal form approach to reactive scattering: The cumulative reaction probability for collinear exchange reactions

    NARCIS (Netherlands)

    Goussev, Arseni; Schubert, Roman; Waalkens, Holger; Wiggins, Stephen

    2009-01-01

The quantum normal form approach to quantum transition state theory is used to compute the cumulative reaction probability for collinear exchange reactions. It is shown that for heavy-atom systems such as the nitrogen-exchange reaction, the quantum normal form approach gives excellent results.

  18. Robust user equilibrium model based on cumulative prospect theory under distribution-free travel time

    Institute of Scientific and Technical Information of China (English)

    王伟; 孙会君; 吴建军

    2015-01-01

    The assumption widely used in the user equilibrium model for stochastic network was that the probability distributions of the travel time were known explicitly by travelers. However, this distribution may be unavailable in reality. By relaxing the restrictive assumption, a robust user equilibrium model based on cumulative prospect theory under distribution-free travel time was presented. In the absence of the cumulative distribution function of the travel time, the exact cumulative prospect value (CPV) for each route cannot be obtained. However, the upper and lower bounds on the CPV can be calculated by probability inequalities. Travelers were assumed to choose the routes with the best worst-case CPVs. The proposed model was formulated as a variational inequality problem and solved via a heuristic solution algorithm. A numerical example was also provided to illustrate the application of the proposed model and the efficiency of the solution algorithm.

  19. ASYMPTOTIC QUANTIZATION OF PROBABILITY DISTRIBUTIONS

    Institute of Scientific and Technical Information of China (English)

    Klaus Pötzelberger

    2003-01-01

We give a brief introduction to results on the asymptotics of quantization errors. The topics discussed include the quantization dimension, asymptotic distributions of sets of prototypes, asymptotically optimal quantizations, approximations, and random quantizations.

  1. The Multivariate Gaussian Probability Distribution

    DEFF Research Database (Denmark)

    Ahrendt, Peter

    2005-01-01

This technical report intends to gather information about the multivariate Gaussian distribution that was previously not (at least to my knowledge) to be found in one place and written as a reference manual. Additionally, some useful tips and tricks are collected that may be useful in practice.

  2. Covariate adjustment of cumulative incidence functions for competing risks data using inverse probability of treatment weighting.

    Science.gov (United States)

    Neumann, Anke; Billionnet, Cécile

    2016-06-01

    In observational studies without random assignment of the treatment, the unadjusted comparison between treatment groups may be misleading due to confounding. One method to adjust for measured confounders is inverse probability of treatment weighting. This method can also be used in the analysis of time to event data with competing risks. Competing risks arise if for some individuals the event of interest is precluded by a different type of event occurring before, or if only the earliest of several times to event, corresponding to different event types, is observed or is of interest. In the presence of competing risks, time to event data are often characterized by cumulative incidence functions, one for each event type of interest. We describe the use of inverse probability of treatment weighting to create adjusted cumulative incidence functions. This method is equivalent to direct standardization when the weight model is saturated. No assumptions about the form of the cumulative incidence functions are required. The method allows studying associations between treatment and the different types of event under study, while focusing on the earliest event only. We present a SAS macro implementing this method and we provide a worked example.
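
A compact Python sketch of the weighting idea follows (the paper itself provides a SAS macro, which is not shown in this record). The data below are simulated; complete follow-up is assumed for brevity, whereas with censoring one would weight an Aalen-Johansen estimator instead.

```python
import numpy as np

# Simulated data: one measured confounder, two competing event types.
rng = np.random.default_rng(3)
n = 5_000
x = rng.normal(size=n)
propensity = 1 / (1 + np.exp(-x))           # true treatment probability
treated = rng.random(n) < propensity
time = rng.exponential(scale=np.where(treated, 4.0, 3.0))
event = rng.choice([1, 2], size=n, p=[0.7, 0.3])

# Inverse probability of treatment weights (true model used for brevity;
# in practice the propensity score would be fitted, e.g. by logistic regression).
w = np.where(treated, 1 / propensity, 1 / (1 - propensity))

def weighted_cif(t: float, event_type: int, group: np.ndarray) -> float:
    """Weighted proportion in `group` experiencing `event_type` by time t
    (valid as a cumulative incidence only without censoring)."""
    hit = (time[group] <= t) & (event[group] == event_type)
    return float(np.sum(w[group] * hit) / np.sum(w[group]))

print(weighted_cif(2.0, 1, treated), weighted_cif(2.0, 1, ~treated))
```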

  3. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, along with the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate computations. This paper is the first to consider this kind of multidimensional data analysis (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  4. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  5. A cumulative entropy method for distribution recognition of model error

    Science.gov (United States)

    Liang, Yingjie; Chen, Wen

    2015-02-01

This paper develops a cumulative entropy method (CEM) to recognize the most suitable distribution for model error. In terms of the CEM, the Lévy stable distribution is employed to capture the statistical properties of model error. The strategies are tested on 250 experiments of axially loaded CFT steel stub columns in conjunction with four building codes: Japan (AIJ, 1997), China (DL/T, 1999), Eurocode 4 (EU4, 2004), and the United States (AISC, 2005). The cumulative entropy method is validated as more computationally efficient than the Shannon entropy method. Compared with the Kolmogorov-Smirnov test and the root mean square deviation, the CEM provides an alternative and powerful model selection criterion for recognizing the most suitable distribution for the model error.

  6. Generalized Cumulative Residual Entropy for Distributions with Unrestricted Supports

    Directory of Open Access Journals (Sweden)

    Noomane Drissi

    2008-01-01

We consider the cumulative residual entropy (CRE), a recently introduced measure of entropy. While in previous works distributions with positive support are considered, we generalize the definition of CRE to the case of distributions with general support. We show that several interesting properties of the earlier CRE remain valid and supply further properties and insight into problems such as maximum CRE power moment problems. In addition, we show that this generalized CRE can be used as an alternative to differential entropy to derive information-based optimization criteria for system identification purposes.
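
An empirical version of the CRE is simple to compute from a sample by integrating the empirical survival function between order statistics; the sketch below is my own construction (not the paper's estimator) and also works for samples with negative values, in line with the generalization to unrestricted supports.

```python
import numpy as np

def empirical_cre(sample: np.ndarray) -> float:
    """-Integral of S(x) * ln S(x) dx with S the empirical survival function,
    integrated piecewise between order statistics. Works for samples with
    general (not only positive) support."""
    x = np.sort(sample)
    n = x.size
    surv = (n - np.arange(1, n)) / n   # S(x) on each interval (x_(i), x_(i+1))
    return float(-np.sum(np.diff(x) * surv * np.log(surv)))

rng = np.random.default_rng(4)
print(empirical_cre(rng.exponential(scale=1.0, size=100_000)))
# For Exp(1): -integral of e^(-x) * ln(e^(-x)) dx = integral of x e^(-x) dx = 1.
```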

  7. Exact probability distribution functions for Parrondo's games

    Science.gov (United States)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data; we have now found this phenomenon in model systems and obtained a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  8. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-Committee on Stability".

  9. CDFTBL: A statistical program for generating cumulative distribution functions from data

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, P.W. (Pacific Northwest Lab., Richland, WA (United States))

    1991-06-01

This document describes the theory underlying the CDFTBL code and gives details for using the code. The CDFTBL code provides an automated tool for generating a statistical cumulative distribution function that describes a set of field data. The cumulative distribution function is written in the form of a table of probabilities, which can be used in a Monte Carlo computer code. As a specific application, CDFTBL can be used to analyze field data collected for parameters required by the PORMC computer code. Section 2.0 discusses the mathematical basis of the code. Section 3.0 discusses the code structure. Section 4.0 describes the free-format input command language, while Section 5.0 describes in detail the commands to run the program. Section 6.0 provides example program runs, and Section 7.0 provides references. The Appendix provides a program source listing. 11 refs., 2 figs., 19 tabs.
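
A sketch in the spirit of CDFTBL (not its actual implementation, which is not shown here): tabulate an empirical CDF from field data, then sample from the table by inverse-transform sampling, as a Monte Carlo code would.

```python
import numpy as np

def cdf_table(data: np.ndarray, n_points: int = 20):
    """Tabulated empirical CDF: probabilities and matching quantiles."""
    probs = np.linspace(0.0, 1.0, n_points)
    return probs, np.quantile(data, probs)

def sample_from_table(probs, values, size, rng):
    """Inverse-transform sampling with linear interpolation between rows."""
    return np.interp(rng.random(size), probs, values)

rng = np.random.default_rng(5)
field_data = rng.lognormal(mean=0.0, sigma=0.8, size=1_000)
p, v = cdf_table(field_data)
draws = sample_from_table(p, v, 10_000, rng)
print(field_data.mean(), draws.mean())   # should be roughly equal
```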

  10. Microcanonical thermostatistics analysis without histograms: cumulative distribution and Bayesian approaches

    CERN Document Server

    Alves, Nelson A; Rizzi, Leandro G

    2015-01-01

Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function, and (ii) a Bayesian approach.

  11. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p = (p_1, ..., p_d). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described.
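
One standard construction consistent with the article's goal (though not necessarily its exact "iid method") samples an unbiased probability vector, i.e., uniformly from the probability simplex, by normalizing iid exponential variates, which is equivalent to drawing from Dirichlet(1, ..., 1).

```python
import numpy as np

def random_probability_vector(d: int, rng) -> np.ndarray:
    """Uniform draw from the probability simplex: normalize iid exponentials
    (equivalently Dirichlet(1, ..., 1)). Normalizing iid uniform variates
    instead would NOT give a uniform (unbiased) vector."""
    e = rng.exponential(scale=1.0, size=d)
    return e / e.sum()

rng = np.random.default_rng(6)
p = random_probability_vector(4, rng)
print(p, p.sum())   # nonnegative components summing to 1
```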

  12. Cumulative Interarrival Time Distributions of Freeway Entrance Ramp Traffic for Traffic Simulations

    Directory of Open Access Journals (Sweden)

    Erdinç Öner

    2013-02-01

Cumulative interarrival time (IAT) distributions for signalized and non-signalized freeway entrance ramps were developed to be used in digital computer traffic simulation models. The data from four different non-signalized entrance ramps (three ramps with a single lane, one ramp with two lanes) and two different signalized entrance ramps (both with a single lane) were used for developing the cumulative IAT distributions. The cumulative IAT distributions for the signalized and non-signalized entrance ramps were compared with each other and with the cumulative IAT distributions of the lanes for freeways. The comparative results showed that the cumulative IAT distributions for non-signalized entrance ramps are very close to that of the leftmost lane of a 3-lane freeway, where the maximum absolute difference between the two cumulative IAT distributions was 3%. The cumulative IAT distribution for the signalized entrance ramps was found to be different from the non-signalized entrance ramp cumulative IAT distribution. The approximated cumulative IAT distributions for signalized and non-signalized entrance ramp traffic for any hourly traffic volume from a few vehicles/hour up to 2,500 vehicles/hour can be obtained at http://www.ohio.edu/orite/research/uitds.cfm.

  13. Cumulative overlap distribution function in realistic spin glasses

    Science.gov (United States)

    Billoire, A.; Maiorano, A.; Marinari, E.; Martin-Mayor, V.; Yllanes, D.

    2014-09-01

    We use a sample-dependent analysis, based on medians and quantiles, to analyze the behavior of the overlap probability distribution of the Sherrington-Kirkpatrick and 3D Edwards-Anderson models of Ising spin glasses. We find that this approach is an effective tool to distinguish between replica symmetry breaking-like and droplet-like behavior of the spin-glass phase. Our results are in agreement with a replica symmetry breaking-like behavior for the 3D Edwards-Anderson model.

  14. Fast and accurate calculations for cumulative first-passage time distributions in Wiener diffusion models

    DEFF Research Database (Denmark)

    Blurton, Steven Paul; Kesselmeier, M.; Gondan, Matthias

    2012-01-01

We propose an improved method for calculating the cumulative first-passage time distribution in Wiener diffusion models with two absorbing barriers. This distribution function is frequently used to describe responses and error probabilities in choice reaction time tasks. The present work extends related work on the density of first-passage times [Navarro, D.J., Fuss, I.G. (2009). Fast and accurate calculations for first-passage times in Wiener diffusion models. Journal of Mathematical Psychology, 53, 222-230]. Two representations exist for the distribution, both including infinite series. We derive upper bounds for the approximation error resulting from finite truncation of the series, and we determine the number of iterations required to limit the error below a pre-specified tolerance. For a given set of parameters, the representation can then be chosen which requires the least computational effort.

  15. Probability distributions with summary graph structure

    CERN Document Server

    Wermuth, Nanny

    2010-01-01

A set of independence statements may define the independence structure of interest in a family of joint probability distributions. This structure is often captured by a graph that consists of nodes representing the random variables and of edges that couple node pairs. One important class is multivariate regression chain graphs. They describe the independences of stepwise processes, in which at each step single or joint responses are generated given the relevant explanatory variables in their past. For joint densities that then result after possible marginalising or conditioning, we use summary graphs. These graphs reflect the independence structure implied by the generating process for the reduced set of variables and they preserve the implied independences after additional marginalising and conditioning. They can identify generating dependences which remain unchanged and alert to possibly severe distortions due to direct and indirect confounding. Operators for matrix representations of graphs are used to derive these results.

  16. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
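
A biased spinner is also easy to simulate: each side's probability is its arc length divided by the full circle. The arc lengths below are illustrative, not the article's example.

```python
import random
from collections import Counter

# Each side's probability equals its arc length over the full circle.
arcs = {"A": 180, "B": 90, "C": 60, "D": 30}   # degrees, illustrative
total = sum(arcs.values())

spins = Counter(random.choices(list(arcs), weights=list(arcs.values()), k=100_000))
for side, arc in arcs.items():
    print(side, arc / total, spins[side] / 100_000)   # theory vs simulation
```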

  17. The Interannual Stability of Cumulative Frequency Distributions for Convective System Size and Intensity

    Science.gov (United States)

    Mohr, Karen I.; Molinari, John; Thorncroft, Chris D.

    2010-01-01

The characteristics of convective system populations in West Africa and the western Pacific tropical cyclone basin were analyzed to investigate whether interannual variability in convective activity in tropical continental and oceanic environments is driven by variations in the number of events during the wet season or by favoring large and/or intense convective systems. Convective systems were defined from TRMM data as a cluster of pixels with an 85 GHz polarization-corrected brightness temperature below 255 K and with an area of at least 64 km². The study database consisted of convective systems in West Africa from May to September 1998-2007 and in the western Pacific from May to November 1998-2007. Annual cumulative frequency distributions for system minimum brightness temperature and system area were constructed for both regions. For both regions, there were no statistically significant differences among the annual curves for system minimum brightness temperature. There were two groups of system area curves, split by the TRMM altitude boost in 2001. Within each set, there was no statistically significant interannual variability. Sub-setting the database revealed some sensitivity in distribution shape to the size of the sampling area, length of sample period, and climate zone. From a regional perspective, the stability of the cumulative frequency distributions implied that the probability that a convective system would attain a particular size or intensity does not change interannually. Variability in the number of convective events appeared to be more important in determining whether a year is wetter or drier than normal.

  1. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

Risk averse agents will generally not report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have.

  2. Polarization Mode Dispersion Probability Distribution for Arbitrary Mode Coupling

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

The probability distribution of the differential group delay for arbitrary mode coupling is simulated with the Monte Carlo method. Fitting the simulation results, we obtain the probability distribution function for arbitrary mode coupling.

  3. Links between potential energy structures and quantum cumulative reaction probabilities of double proton transfer reactions

    Energy Technology Data Exchange (ETDEWEB)

    Horsten, H.F. von [Institut fuer Physikalische Chemie, Christian-Albrechts-Universitaet, Olshausenstrasse 40, 24098 Kiel (Germany); Hartke, B. [Institut fuer Physikalische Chemie, Christian-Albrechts-Universitaet, Olshausenstrasse 40, 24098 Kiel (Germany)], E-mail: hartke@phc.uni-kiel.de

    2007-09-25

    Double proton transfer reactions of pyrazole-guanidine species exhibit unusual energy profiles of a plateau form, different from the standard single and double barrier shapes. We have demonstrated earlier that this leads to a characteristically different quantum dynamical behavior of plateau reactions, when measured appropriately. Here we show that these differences also carry over to traditional measures of reaction probability.

  4. Matrix-exponential distributions in applied probability

    CERN Document Server

    Bladt, Mogens

    2017-01-01

This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution, which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distributions.

  5. Some New Approaches to Multivariate Probability Distributions.

    Science.gov (United States)

    1986-12-01

The report considers characterizations of multivariate probability distributions, such as the Marshall-Olkin bivariate distribution or Fréchet's multivariate distribution with continuous marginals, and gives a uniqueness theorem in the bivariate case under certain assumptions.

  6. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.

  7. TESTING FOR DIFFERENCES BETWEEN CUMULATIVE DISTRIBUTION FUNCTIONS FROM COMPLEX ENVIRONMENTAL SAMPLING SURVEYS

    Science.gov (United States)

    The U.S. Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP) employs the cumulative distribution function (cdf) to measure the status of quantitative variables for resources of interest. The ability to compare cdf's for a resource from, say,...

  8. CUMULATIVE DISTRIBUTION FUNCTIONS FOR THE TEMPERATURE THRESHOLD FOR THE ONSET OF CARBON STEEL CORROSION

    Energy Technology Data Exchange (ETDEWEB)

    K.G. Mon

    1998-05-15

The purpose of this calculation is to process the cumulative distribution functions (CDFs) characterizing the temperature threshold for the onset of corrosion provided by expert elicitation and to reduce the set of values to 200 points for use in WAPDEG.

  9. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment.

  10. Probability distribution fitting of schedule overruns in construction projects

    OpenAIRE

    Love, P.E.D.; Sing, C.-P.; Wang, X.; Edwards, D.J.; Odeyinka, H.

    2013-01-01

The probability of schedule overruns for construction and engineering projects can be ascertained using a 'best fit' probability distribution from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data; the Kolmogorov-Smirnov test was used to assess goodness of fit.

  11. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
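
A minimal example of the classical baseline the paper complements, using SciPy's Kolmogorov-Smirnov test (Kuiper's variant is not in SciPy and is omitted here; the paper's own low-probability-region tests are not reproduced in this record).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
draws = rng.normal(loc=0.0, scale=1.0, size=1_000)

# KS test of the draws against the hypothesized N(0, 1) CDF.
stat, p_value = stats.kstest(draws, "norm")
print(f"N(0,1):   stat={stat:.4f}, p={p_value:.3f}")   # large p expected

# Against a mis-specified N(0.3, 1): small p expected.
stat, p_value = stats.kstest(draws, "norm", args=(0.3, 1.0))
print(f"N(0.3,1): stat={stat:.4f}, p={p_value:.3f}")
```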

  12. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

Subjective beliefs are elicited routinely in economics experiments. However, such elicitation often suffers from two possible disadvantages. First, beliefs are recovered in the form of a summary statistic, usually the mean, of the underlying latent distribution. Second, recovered beliefs are biased.

  13. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

The article is devoted to the nonparametric point and interval estimation of characteristics of a probability distribution (the expectation, median, variance, standard deviation, and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function having the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogs of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and functions of them. Nonparametric asymptotic confidence intervals are obtained using a standard technique for deriving asymptotic relations in applied statistics. In the first step, this technique applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. The second step transforms the limiting multivariate normal vector into the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of mathematical-statistical reasoning; it is usually necessary to use necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Using methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions when the normality hypothesis fails. The practical recommendation is that for the analysis of real data one should use nonparametric confidence limits.
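
A sketch of the simplest case discussed, a nonparametric asymptotic confidence interval for the expectation built only from sample moments via the central limit theorem (illustrative code, not the article's).

```python
import numpy as np

def mean_ci(x: np.ndarray, z: float = 1.96):
    """Asymptotic ~95% CI for the expectation from sample moments (CLT);
    no normality of the data is assumed."""
    m, se = x.mean(), x.std(ddof=1) / np.sqrt(x.size)
    return m - z * se, m + z * se

rng = np.random.default_rng(8)
x = rng.exponential(scale=2.0, size=500)   # clearly non-normal data
print(mean_ci(x))                          # should usually cover the true mean 2.0
```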

  14. Kriging with cumulative distribution function of order statistics for delineation of heavy-metal contaminated soils

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.W.; Lee, D.Y.; Hsiao, C.K. [National Taiwan Univ., Taipei (Taiwan, Province of China)]

    1998-10-01

Accurate delineation of contaminated soils is essential for risk assessment and remediation. The probability of pollutant concentrations lower than a cutoff value is more important than the best estimate of pollutant concentrations for unsampled locations in delineating contaminated soils. In this study, a new method, kriging with the cumulative distribution function (CDF) of order statistics (CDF kriging), is introduced and compared with indicator kriging. It is used to predict the probability that extractable concentrations of Zn will be less than a cutoff value for soils to be declared hazardous. The 0.1 M HCl-extractable Zn concentrations in the topsoil of a paddy field of about 2000 ha located in Taiwan are used. A comparison of the CDF of order statistics and indicator function transformation shows that the variance and the coefficient of variation (CV) of the CDF of order statistics transformed data are smaller than those of the indicator function transformed data. This suggests that the CDF of order statistics transformation possesses less variability than does the indicator function transformation. In addition, based on cross-validation, CDF kriging is found to reduce the mean squared errors of estimation by about 30% and to reduce the mean kriging variances by about 26% compared with indicator kriging.

  15. Stable Probability Distributions and their Domains of Attraction

    NARCIS (Netherlands)

    J.L. Geluk (Jaap); L.F.M. de Haan (Laurens)

    1997-01-01

The theory of stable probability distributions and their domains of attraction is derived in a direct way (avoiding the usual route via infinitely divisible distributions) using Fourier transforms. Regularly varying functions play an important role in the exposition.

  16. Incorporating Skew into RMS Surface Roughness Probability Distribution

    Science.gov (United States)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
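
A small illustration of the paper's point with simulated data (the skew-normal is used here as one convenient asymmetric distribution; the paper's specific distribution choice and its measurement data are not given in this record): for right-skewed data, the fitted Gaussian's mode, which equals its mean, overshoots the true mode.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Simulated right-skewed "roughness" sample (skew-normal, parameters invented).
roughness = stats.skewnorm.rvs(a=4.0, loc=1.0, scale=0.5, size=5_000,
                               random_state=rng)

mu, sigma = stats.norm.fit(roughness)          # Gaussian fit: mode = mean = mu
a, loc, scale = stats.skewnorm.fit(roughness)  # asymmetric fit

grid = np.linspace(roughness.min(), roughness.max(), 10_000)
sn_mode = grid[np.argmax(stats.skewnorm.pdf(grid, a, loc, scale))]
print(f"Gaussian mode (= mean): {mu:.3f}, skew-normal mode: {sn_mode:.3f}")
```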

  17. Probability distributions in risk management operations

    CERN Document Server

    Artikis, Constantinos

    2015-01-01

This book is about the formulations, theoretical investigations, and practical applications of new stochastic models for fundamental concepts and operations of the discipline of risk management. It also examines how these models can be useful in the descriptions, measurements, evaluations, and treatments of risks threatening various modern organizations. Moreover, the book makes clear that such stochastic models constitute very strong analytical tools which substantially facilitate strategic thinking and strategic decision making in many significant areas of risk management. In particular, the incorporation of fundamental probabilistic concepts such as the sum, minimum, and maximum of a random number of continuous, positive, independent, and identically distributed random variables in the mathematical structure of stochastic models significantly supports the suitability of these models in the development, investigation, selection, and implementation of proactive and reactive risk management operations.

  18. Effects of site characteristics on cumulative frequency distribution of water table depth in peatlands

    Science.gov (United States)

    Bechtold, Michel; Tiemeyer, Bärbel; Frahm, Enrico; Roßkopf, Niko

    2013-04-01

Previous studies demonstrated a strong dependency of vegetation development and GHG emissions from peatlands on annual mean water table depth. It has also been proposed that the duration of ponding and low water level periods are important indicators for CH4 emissions and the presence of specific plant species. Better understanding of the annual water table dynamics and the influence of site characteristics helps to explain variability of vegetation and emissions at the plot scale. It also provides essential information for a nation-wide upscaling of local gas flux measurements and for estimating the impact of regional adaptation strategies. In this study, we analyze the influence of site characteristics on the cumulative frequency distribution of water table depth in a peatland. On the basis of data from about 100 sites, we evaluate how distribution functions, e.g. the beta distribution function, can serve as a tool for the systematic analysis of the site-specific frequency distribution of water table depth. Our analysis shows that it is possible to differentiate different shape types of frequency distributions, in particular left-skewed (bias towards the water table minimum), right-skewed (bias towards the water table maximum), and 'S'-shaped distributions (bias towards the midpoint of minimum and maximum). The shape is primarily dependent on the annual mean water table depth, but also shows dependencies on land use, peatland type, catchment size and soil properties. Forest soils, for example, are all characterized by an 'S'-shaped distribution. Preliminary results indicate that data sets that do not show a beta distribution are mostly from observation wells that are located close to drainage courses and/or are from sites characterized by strong water management (e.g. abruptly changing weir levels). The beta distribution might thus be a tool to identify sites with a 'non-natural' frequency distribution or erroneous data sets. The parameters of the beta distribution show a dependency on site characteristics.

  19. Some explicit expressions for the probability distribution of force magnitude

    Indian Academy of Sciences (India)

    Saralees Nadarajah

    2008-08-01

    Recently, empirical investigations have suggested that the components of contact forces follow the exponential distribution. However, explicit expressions for the probability distribution of the corresponding force magnitude have not been known and only approximations have been used in the literature. In this note, for the first time, I provide explicit expressions for the probability distribution of the force magnitude. Both two-dimensional and three-dimensional cases are considered.

  20. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  1. Cumulants of multiplicity distributions in most-central heavy-ion collisions

    Science.gov (United States)

    Xu, Hao-jie

    2016-11-01

    I investigate the volume corrections to cumulants of total charge distributions and net proton distributions. The required volume information is generated by an optical Glauber model. I find that the corrected statistical expectations of multiplicity distributions mimic the negative binomial distributions at noncentral collisions, and they tend to approach the Poisson ones at most-central collisions due to the "boundary effects," which suppress the volume corrections. However, net proton distributions and reference multiplicity distributions are sensitive to the external volume fluctuations at most-central collisions, which implies that one has to consider the details of volume distributions in event-by-event multiplicity fluctuation studies.
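
    For reference, the first four cumulants of an event-by-event multiplicity sample can be estimated with unbiased k-statistics; a sketch in which a Poisson sample stands in for real event data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = rng.poisson(30.0, size=50_000)  # stand-in event-by-event multiplicities

        c1, c2, c3, c4 = (stats.kstat(n, k) for k in (1, 2, 3, 4))
        print(c1, c2, c3, c4)
        # For a Poisson distribution all cumulants equal the mean, so ratios such
        # as c2/c1 and c3/c2 are unity; volume fluctuations push the measured
        # ratios away from this baseline.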

  2. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  3. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hamadameen, Abdulqader Othman [Optimization, Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia); Zainuddin, Zaitul Marlizawati [Department of Mathematical Sciences, Faculty of Science, UTM (Malaysia)

    2014-06-19

    This study deals with multiobjective fuzzy stochastic linear programming problems with uncertain probability distributions, which are defined as ambiguous fuzzy assertions by experts. The problem formulation is presented, and the two solution strategies are the fuzzy transformation via a ranking function and the stochastic transformation, in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  4. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  5. Application-dependent Probability Distributions for Offshore Wind Speeds

    Science.gov (United States)

    Morgan, E. C.; Lackner, M.; Vogel, R. M.; Baise, L. G.

    2010-12-01

    The higher wind speeds of the offshore environment make it an attractive setting for future wind farms. With sparser field measurements, the theoretical probability distribution of short-term wind speeds becomes more important in estimating values such as average power output and fatigue load. While previous studies typically compare the accuracy of probability distributions using R2, we show that validation based on this metric is not consistent with validation based on engineering parameters of interest, namely turbine power output and extreme wind speed. Thus, in order to make the most accurate estimates possible, the probability distribution that an engineer picks to characterize wind speeds should depend on the design parameter of interest. We introduce the Kappa and Wakeby probability distribution functions to wind speed modeling, and show that these two distributions, along with the Biweibull distribution, fit wind speed samples better than the more widely accepted Weibull and Rayleigh distributions based on R2. Additionally, out of the 14 probability distributions we examine, the Kappa and Wakeby give the most accurate and least biased estimates of turbine power output. The fact that the 2-parameter Lognormal distribution estimates extreme wind speeds (i.e. fits the upper tail of wind speed distributions) with least error indicates that no single distribution performs satisfactorily for all applications. Our use of a large dataset composed of 178 buoys (totaling ~72 million 10-minute wind speed observations) makes these findings highly significant, both in terms of large sample size and broad geographical distribution across various wind regimes. [Figure: boxplots of R2 from the fit of each of the 14 distributions to the 178 buoy wind speed samples, ranked from left to right by ascending median R2, with the Biweibull having the median closest to 1.]
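
    A sketch of the kind of application-dependent check described above: fit two candidate families and compare the third raw moment E[v^3], to which mean turbine power is proportional (synthetic data; the two candidate families here are common choices, not the paper's full set of 14):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        v = stats.weibull_min.rvs(2.0, scale=8.0, size=20_000, random_state=rng)

        k, _, c = stats.weibull_min.fit(v, floc=0)   # Weibull fit
        s, _, m = stats.lognorm.fit(v, floc=0)       # 2-parameter lognormal fit

        print("sample   E[v^3]:", np.mean(v ** 3))
        print("Weibull  E[v^3]:", stats.weibull_min.moment(3, k, loc=0, scale=c))
        print("lognorm  E[v^3]:", stats.lognorm.moment(3, s, loc=0, scale=m))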

  6. Most probable degree distribution at fixed structural entropy

    Indian Academy of Sciences (India)

    Ginestra Bianconi

    2008-06-01

    The structural entropy is the entropy of the ensemble of uncorrelated networks with a given degree sequence. Here we derive the most probable degree distribution emerging when we distribute stubs (or half-edges) randomly through the nodes of the network while keeping the structural entropy fixed. This degree distribution is found to decay as a Poisson distribution when the entropy is maximized and to have a power-law tail with an exponent γ → 2 when the entropy is minimized.

  7. PROBABILITY DISTRIBUTION FUNCTION OF NEAR-WALL TURBULENT VELOCITY FLUCTUATIONS

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    By large eddy simulation (LES), turbulent databases of channel flows at different Reynolds numbers were established. Then, the probability distribution functions of the streamwise and wall-normal velocity fluctuations were obtained and compared with the corresponding normal distributions. By hypothesis testing, the deviation from the normal distribution was analyzed quantitatively. The skewness and flatness factors were also calculated, and the variations of these two factors in the viscous sublayer, buffer layer and log-law layer were discussed. Also illustrated were the relations between the probability distribution functions and the burst events (sweeps of high-speed fluid and ejections of low-speed fluid) in the viscous sublayer, buffer layer and log-law layer. Finally, the variations of the probability distribution functions with Reynolds number were examined.
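
    The skewness and flatness factors mentioned above are the third and fourth standardized moments of the fluctuation samples; a minimal sketch (synthetic data stand in for the LES database):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        u = rng.standard_normal(100_000)        # stand-in velocity fluctuations

        S = stats.skew(u)                       # skewness factor, 0 for a Gaussian
        F = stats.kurtosis(u, fisher=False)     # flatness factor, 3 for a Gaussian
        print(f"S = {S:.3f}, F = {F:.3f}")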

  8. Evaluation of probability distributions for concentration fluctuations in a building array

    Science.gov (United States)

    Efthimiou, G. C.; Andronopoulos, S.; Bartzis, J. G.

    2017-10-01

    The wide range of values observed in a measured concentration time series after the release of a dispersing airborne pollutant from a point source in the atmospheric boundary layer, and the hazard level associated with the peak values, demonstrate the necessity of predicting the concentration probability distribution. For this, statistical models describing the probability of occurrence are preferably employed. In this paper a concentration database pertaining to a field experiment of dispersion in an urban-like area (MUST experiment) from a continuously emitting source is used to select the better-performing statistical model between the Gamma and the Beta distributions. The skewness, the kurtosis, as well as the inverses of the cumulative distribution function were compared between the two statistical models and the experiment. The evaluation is performed in the form of validation metrics such as the Fractional Bias (FB), the Normalized Mean Square Error (NMSE) and the factor-of-2 percentage. The Beta probability distribution agreed with the experimental results better than the Gamma probability distribution, except for the 25th percentile. Also, according to the significance tests performed with the BOOT software, the Beta model presented FB and NMSE values that are statistically different from those of the Gamma model, except for the 75th percentiles and the FB of the 99th percentiles. The effect of stability conditions and source heights on the performance of the statistical models is also examined. For both cases the performance of the Beta distribution was slightly better than that of the Gamma.
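
    The validation metrics named above are standard in dispersion-model evaluation; a sketch of their usual definitions (the percentile arrays are hypothetical placeholders):

        import numpy as np

        def fractional_bias(obs, pred):
            # FB = 2 (mean_obs - mean_pred) / (mean_obs + mean_pred); 0 is perfect.
            return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

        def nmse(obs, pred):
            # NMSE = mean((obs - pred)^2) / (mean_obs * mean_pred); 0 is perfect.
            return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

        def fac2(obs, pred):
            # Fraction of pairs within a factor of 2 of the observations.
            r = pred / obs
            return np.mean((r >= 0.5) & (r <= 2.0))

        obs = np.array([1.2, 0.8, 2.5, 3.1])    # hypothetical measured percentiles
        pred = np.array([1.0, 0.9, 2.0, 3.9])   # hypothetical modelled percentiles
        print(fractional_bias(obs, pred), nmse(obs, pred), fac2(obs, pred))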

  9. Generating Probability Distributions using Multivalued Stochastic Relay Circuits

    CERN Document Server

    Lee, David

    2011-01-01

    The problem of random number generation dates back to von Neumann's work in 1951. Since then, many algorithms have been developed for generating unbiased bits from complex correlated sources as well as for generating arbitrary distributions from unbiased bits. An equally interesting, but less studied aspect is the structural component of random number generation as opposed to the algorithmic aspect. That is, given a network structure imposed by nature or physical devices, how can we build networks that generate arbitrary probability distributions in an optimal way? In this paper, we study the generation of arbitrary probability distributions in multivalued relay circuits, a generalization in which relays can take on any of N states and the logical 'and' and 'or' are replaced with 'min' and 'max' respectively. Previous work was done on two-state relays. We generalize these results, describing a duality property and networks that generate arbitrary rational probability distributions. We prove that these network...

  10. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
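
    In generic notation (ours, not necessarily the paper's), a truncated exponential density for slip s with characteristic scale s_c and physical upper bound s_max is

        f(s) = \frac{\exp(-s/s_c)}{s_c \left[ 1 - \exp(-s_{\max}/s_c) \right]},
        \qquad 0 \le s \le s_{\max},

    which integrates to one on [0, s_max] and reduces to the ordinary exponential law as s_max → ∞.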

  11. NORMALLY DISTRIBUTED PROBABILITY MEASURE ON THE METRIC SPACE OF NORMS

    Institute of Scientific and Technical Information of China (English)

    Á.G. HORVÁTH

    2013-01-01

    In this paper we propose a method to construct probability measures on the space of convex bodies. For this purpose, we first introduce the notion of the thinness of a body. We then show the existence of a measure with the property that its pushforward by the thinness function is a truncated normal probability measure. Finally, we improve this method to find a measure satisfying some important properties in geometric measure theory.

  12. Probability distributions for Poisson processes with pile-up

    CERN Document Server

    Sevilla, Diego J R

    2013-01-01

    In this paper, two parametric probability distributions capable of describing the statistics of X-ray photon detection by a CCD are presented. They are formulated from simple models that account for the pile-up phenomenon, in which two or more photons are counted as one. These models are based on the Poisson process, but they have an extra parameter, which subsumes all the detailed mechanisms of the pile-up process and must be fitted to the data statistics simultaneously with the rate parameter. The new probability distributions, one for the number of counts per time bin (Poisson-like) and the other for waiting times (exponential-like), are tested by fitting them to statistics of real data, and against each other through numerical simulations; their results are analyzed and compared. The probability distributions presented here can be used as background statistical models to derive likelihood functions for statistical methods in signal analysis.
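
    The pile-up effect is easy to reproduce in a toy simulation: photons arrive Poisson-distributed per time bin, but any bin holding two or more photons registers only one count. A sketch under this extreme merging assumption (real pile-up mechanisms, which the extra parameter is meant to absorb, are more detailed):

        import numpy as np

        rng = np.random.default_rng(5)
        lam = 0.3                                  # mean photons per time bin
        photons = rng.poisson(lam, size=1_000_000)

        counts = np.minimum(photons, 1)            # pile-up: >= 1 photon -> 1 count
        print("true mean photons per bin:", photons.mean())
        print("counted mean per bin:     ", counts.mean())
        print("theory, 1 - exp(-lam):    ", 1 - np.exp(-lam))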

  13. Probability distribution functions in the finite density lattice QCD

    CERN Document Server

    Ejiri, S; Aoki, S; Kanaya, K; Saito, H; Hatsuda, T; Ohno, H; Umeda, T

    2012-01-01

    We study the phase structure of QCD at high temperature and density by lattice QCD simulations adopting a histogram method. We try to solve the problems which arise in the numerical study of the finite density QCD, focusing on the probability distribution function (histogram). As a first step, we investigate the quark mass dependence and the chemical potential dependence of the probability distribution function as a function of the Polyakov loop when all quark masses are sufficiently large, and study the properties of the distribution function. The effect from the complex phase of the quark determinant is estimated explicitly. The shape of the distribution function changes with the quark mass and the chemical potential. Through the shape of the distribution, the critical surface which separates the first order transition and crossover regions in the heavy quark region is determined for the 2+1-flavor case.

  14. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, the method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness-of-fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of Bayes' theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
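
    A sketch contrasting two of the fitting approaches listed above, the method of moments versus maximum likelihood, for a gamma model (synthetic data, not the performance-assessment data set):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        x = rng.gamma(shape=2.5, scale=1.8, size=500)   # synthetic "measurements"

        # Method of moments: match the gamma mean k*theta and variance k*theta^2.
        k_mom = x.mean() ** 2 / x.var()
        theta_mom = x.var() / x.mean()

        # Maximum likelihood via SciPy, with the location fixed at zero.
        k_mle, _, theta_mle = stats.gamma.fit(x, floc=0)

        print(f"MOM: shape={k_mom:.2f}, scale={theta_mom:.2f}")
        print(f"MLE: shape={k_mle:.2f}, scale={theta_mle:.2f}")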

  15. Probability distribution of arrival times in quantum mechanics

    CERN Document Server

    Delgado, V

    1998-01-01

    In a previous paper [Phys. Rev. A, in press] we introduced a self-adjoint operator $\hat{\mathcal{T}}(X)$ whose eigenstates can be used to define consistently a probability distribution of the time of arrival at a given spatial point. In the present work we show that the probability distribution previously proposed can be well understood on classical grounds in the sense that it is given by the expectation value of a certain positive definite operator $\hat{J}^{(+)}(X)$, which is nothing but a straightforward quantum version of the modulus of the classical current. For quantum states highly localized in momentum space about a certain momentum $p_0$

  16. Parametric Probability Distribution Functions for Axon Diameters of Corpus Callosum

    Directory of Open Access Journals (Sweden)

    Farshid Sepehrband

    2016-05-01

    Axon diameter is an important neuroanatomical characteristic of the nervous system that alters in the course of neurological disorders such as multiple sclerosis. Axon diameters vary, even within a fiber bundle, and are not normally distributed. An accurate distribution function is therefore beneficial, either to describe axon diameters that are obtained from a direct measurement technique (e.g., microscopy), or to infer them indirectly (e.g., using diffusion-weighted MRI). The gamma distribution is a common choice for this purpose (particularly for the inferential approach) because it resembles the distribution profile of measured axon diameters, which has consistently been shown to be non-negative and right-skewed. In this study we compared a wide range of parametric probability distribution functions against empirical data obtained from electron microscopy images. We observed that the gamma distribution fails to accurately describe the main characteristics of the axon diameter distribution, such as the location and scale of the mode and the profile of the distribution tails. We also found that the generalized extreme value distribution consistently fitted the measured distribution better than other distribution functions. This suggests that there may be distinct subpopulations of axons in the corpus callosum, each with their own distribution profiles. In addition, we observed that several other distributions outperformed the gamma distribution, yet had the same number of unknown parameters; these were the inverse Gaussian, log-normal, log-logistic and Birnbaum-Saunders distributions.

  17. Probability distributions of the electroencephalogram envelope of preterm infants.

    Science.gov (United States)

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
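
    A sketch of the envelope-extraction and fitting pipeline described above (a synthetic oscillation stands in for the EEG; montage, filtering and artifact handling are omitted):

        import numpy as np
        from scipy import stats
        from scipy.signal import hilbert

        rng = np.random.default_rng(7)
        fs = 250
        t = np.arange(0, 60, 1 / fs)
        eeg = np.sin(2 * np.pi * 8 * t) * (1 + 0.5 * rng.standard_normal(t.size))

        envelope = np.abs(hilbert(eeg))   # amplitude envelope via Hilbert transform

        # Compare candidate distributions for the envelope by log-likelihood.
        for dist in (stats.lognorm, stats.gamma):
            params = dist.fit(envelope, floc=0)
            print(dist.name, dist.logpdf(envelope, *params).sum())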

  18. Augmenting momentum resolution with well tuned probability distributions

    CERN Document Server

    Landi, Gregorio

    2016-01-01

    The realistic probability distributions of a previous article are applied to the reconstruction of tracks in a constant magnetic field. The complete forms and their schematic approximations produce excellent momentum estimations, drastically better than standard fits. A simplified derivation of one of our probability distributions is illustrated. The momentum reconstructions are compared with standard (least squares) fits using two different position algorithms: the eta-algorithm and the two-strip center of gravity. The quality of our results is expressed as the increase in magnetic field and signal-to-noise ratio required for the standard fit reconstructions to overlap with our best distributions. The data and the simulations are tuned on the tracker of a running experiment and its double-sided microstrip detectors; here each detector side is simulated to measure the magnetic bending. To overlap with our best distributions, the magnetic field must be increased by a factor 1.5 for the least squares based on the eta-a...

  19. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  20. Probability distribution of extreme share returns in Malaysia

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions to represent the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.

  2. Probability Measure of Navigation pattern prediction using Poisson Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Dr.V.Valli Mayil

    2012-06-01

    The World Wide Web has become one of the most important media to store, share and distribute information. The rapid expansion of the web has provided a great opportunity to study user and system behavior by exploring web access logs. Web usage mining is the application of data mining techniques to large web data repositories in order to extract usage patterns. Every web server keeps a log of all transactions between the server and the clients. The log data collected by web servers contain information about every click a user makes on the web documents of the site. The useful log information needs to be analyzed and interpreted in order to obtain knowledge about actual user preferences in accessing web pages. In recent years several methods have been proposed for mining web log data. This paper addresses the statistical method of Poisson distribution analysis to find the higher-probability session sequences, which are then used to test web application performance. The analysis of large volumes of clickstream data demands the employment of data mining methods, and conducting data mining on web server logs involves determining frequently occurring access sequences. The Poisson distribution gives the probability of a given number of occurrences of an event when the average rate of occurrence is known; it is a discrete distribution, used in this paper to find the probability that a particular page is visited by the user.
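
    For reference, the Poisson probability used in such an analysis is P(k; λ) = λ^k e^(−λ) / k!; a sketch with a hypothetical page-visit rate:

        from math import exp, factorial

        def poisson_pmf(k: int, lam: float) -> float:
            """Probability of exactly k visits when the average rate is lam."""
            return lam ** k * exp(-lam) / factorial(k)

        lam = 3.2   # hypothetical average visits to a page per session
        for k in range(6):
            print(k, round(poisson_pmf(k, lam), 4))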

  3. On Probability Distributions for Trees: Representations, Inference and Learning

    CERN Document Server

    Denis, François; Gilleron, Rémi; Tommasi, Marc; Gilbert, Édouard

    2008-01-01

    We study probability distributions over free algebras of trees. Probability distributions can be seen as particular (formal power) tree series [Berstel et al 82, Esik et al 03], i.e. mappings from trees to a semiring K. A widely studied class of tree series is the class of rational (or recognizable) tree series, which can be defined either in an algebraic way or by means of multiplicity tree automata. We argue that the algebraic representation is very convenient for modeling probability distributions over a free algebra of trees. First, as in the string case, the algebraic representation allows one to design learning algorithms for the whole class of probability distributions defined by rational tree series. Note that learning algorithms for rational tree series correspond to learning algorithms for weighted tree automata where both the structure and the weights are learned. Second, the algebraic representation can be easily extended to deal with unranked trees (like XML trees where a symbol may have an unbounded num...

  4. Probability distributions of continuous measurement results for conditioned quantum evolution

    Science.gov (United States)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by the postselection in the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in the average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states demonstrating anomalously big average outputs and sudden jump in time-integrated output. We present and discuss the numerical evaluation of the probability distribution aiming at extending the analytical results and describing a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  5. Convolutions Induced Discrete Probability Distributions and a New Fibonacci Constant

    CERN Document Server

    Rajan, Arulalan; Rao, Vittal; Rao, Ashok

    2010-01-01

    This paper proposes another constant that can be associated with the Fibonacci sequence. In this work, we look at the probability distributions generated by the linear convolution of the Fibonacci sequence with itself, and by the linear convolution of the symmetrized Fibonacci sequence with itself. We observe that for a distribution generated by the linear convolution of the standard Fibonacci sequence with itself, the variance converges to 8.4721359... . Also, for a distribution generated by the linear convolution of symmetrized Fibonacci sequences, the variance converges in an average sense to 17.1942..., which is approximately twice the value obtained with the common Fibonacci sequence.
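
    The computation is straightforward to reproduce: convolve the Fibonacci sequence with itself, normalize the result into a probability distribution over the index, and take the variance; under the paper's claim, the values should approach the constant quoted above as the sequence length grows.

        import numpy as np

        def fib(n: int) -> np.ndarray:
            f = np.ones(n)
            for i in range(2, n):
                f[i] = f[i - 1] + f[i - 2]
            return f

        for n in (20, 40, 80):
            c = np.convolve(fib(n), fib(n))   # linear self-convolution
            p = c / c.sum()                   # induced probability distribution
            k = np.arange(c.size)
            mean = (k * p).sum()
            print(n, ((k - mean) ** 2 * p).sum())   # variance of the index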

  6. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness ... infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time. ... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay

  7. Probability Distribution Function of Passive Scalars in Shell Models

    Institute of Scientific and Technical Information of China (English)

    LIU Chun-Ping; ZHANG Xiao-Qiang; LIU Yu-Rong; WANG Guang-Rui; HE Da-Ren; CHEN Shi-Gang; ZHU Lu-Jin

    2008-01-01

    A shell-model version of the passive scalar problem is introduced, inspired by the model of K. Ohkitani and M. Yakhot [K. Ohkitani and M. Yakhot, Phys. Rev. Lett. 60 (1988) 983; K. Ohkitani and M. Yakhot, Prog. Theor. Phys. 81 (1988) 329]. As in the original problem, the prescribed random velocity field is Gaussian and δ-correlated in time. The deterministic differential equations are regarded as a nonlinear Langevin equation. Then, the Fokker-Planck equations for the PDF of the passive scalars are obtained and solved numerically. In the energy input range (n < 5, where n is the shell number), the probability distribution function (PDF) of the passive scalars is close to the Gaussian distribution. In the inertial range (5 < n < 16) and the dissipation range (n ≥ 17), the PDF of the passive scalars shows obvious intermittency, and the scaling power of the passive scalar is anomalous. The results of the numerical simulations are compared with experimental measurements.

  8. Distribution probability of large-scale landslides in central Nepal

    Science.gov (United States)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) their role as sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation for the large-scale landslide distribution is also derived. The equation is validated by applying it to another area, for which the area under the receiver operating characteristic curve of the landslide distribution probability is 0.699, and the distribution probability could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.

  9. Polynomial probability distribution estimation using the method of moments.

    Science.gov (United States)

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth-degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal and Weibull, as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
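
    A sketch of the core linear-algebra step, under the assumption of a density supported on [0, 1]: if p(x) = Σ_j a_j x^j, its k-th moment is Σ_j a_j / (k + j + 1), so the coefficients solve a Hilbert-type linear system (our formulation for illustration, not necessarily the paper's exact algorithm):

        import numpy as np

        def poly_pdf_from_moments(m: np.ndarray) -> np.ndarray:
            """Coefficients a_0..a_N of a polynomial density on [0, 1] whose
            moments E[X^k] match m[k] for k = 0..N (m[0] must equal 1)."""
            n = m.size
            A = np.array([[1.0 / (k + j + 1) for j in range(n)] for k in range(n)])
            return np.linalg.solve(A, m)

        # Check on Beta(2, 2), whose density 6x(1 - x) is itself a polynomial;
        # its moments are E[X^k] = 6 / ((k + 2)(k + 3)).
        m = np.array([6.0 / ((k + 2) * (k + 3)) for k in range(3)])
        print(poly_pdf_from_moments(m))   # ~ [0, 6, -6], i.e. p(x) = 6x(1 - x)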

  10. Chest wall segmentation in automated 3D breast ultrasound using rib shadow enhancement and multi-plane cumulative probability enhanced map

    Science.gov (United States)

    Kim, Hyeonjin; Kim, Hannah; Hong, Helen

    2015-03-01

    We propose an automatic segmentation method for the chest wall in 3D ABUS images using rib shadow enhancement and a multi-planar cumulative probability enhanced map. For the identification of individual dark rib shadows, each rib shadow is enhanced using an intensity transfer function and 3D sheet-like enhancement filtering. Then, wrongly enhanced intercostal regions and small fatty tissues are removed using coronal and sagittal cumulative probability enhanced maps. The large fatty tissues with globular and sheet-like shapes at the top of the rib shadow are removed using shape and orientation analysis based on the moment matrix. Detected chest walls are connected with cubic B-spline interpolation. Experimental results show that the Dice similarity coefficient of the proposed method, compared with two manual outlining results, is over 90% on average.

  11. Unitary equilibrations: probability distribution of the Loschmidt echo

    CERN Document Server

    Venuti, Lorenzo Campos

    2009-01-01

    Closed quantum systems evolve unitarily and therefore cannot converge in a strong sense to an equilibrium state starting out from a generic pure state. Nevertheless for large system size one observes temporal typicality. Namely, for the overwhelming majority of the time instants, the statistics of observables is practically indistinguishable from an effective equilibrium one. In this paper we consider the Loschmidt echo (LE) to study this sort of unitary equilibration after a quench. We draw several conclusions on general grounds and on the basis of an exactly-solvable example of a quasi-free system. In particular we focus on the whole probability distribution of observing a given value of the LE after waiting a long time. Depending on the interplay between the initial state and the quench Hamiltonian, we find different regimes reflecting different equilibration dynamics. When the perturbation is small and the system is away from criticality the probability distribution is Gaussian. However close to criticali...

  12. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results are illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  13. Fibonacci Sequence, Recurrence Relations, Discrete Probability Distributions and Linear Convolution

    CERN Document Server

    Rajan, Arulalan; Rao, Ashok; Jamadagni, H S

    2012-01-01

    The classical Fibonacci sequence is known to exhibit many fascinating properties. In this paper, we explore the Fibonacci sequence and integer sequences generated by second-order linear recurrence relations with positive integer coefficients from the point of view of the probability distributions that they induce. We obtain generalizations of some of the known limiting properties of these probability distributions and present certain optimal properties of the classical Fibonacci sequence in this context. In addition, we also look at the self linear convolution of linear recurrence relations with positive integer coefficients. Analysis of self linear convolution is focused towards locating the maximum in the resulting sequence. This analysis also highlights the influence that the largest positive real root of the "characteristic equation" of the linear recurrence relations with positive integer coefficients has on the location of the maximum. In particular, when the largest positive real root is 2, the locatio...

  14. Outage probability of distributed beamforming with co-channel interference

    KAUST Repository

    Yang, Liang

    2012-03-01

    In this letter, we consider a distributed beamforming scheme (DBF) in the presence of equal-power co-channel interferers for both amplify-and-forward and decode-and-forward relaying protocols over Rayleigh fading channels. We first derive outage probability expressions for the DBF systems. We then present a performance analysis for a scheme relying on source selection. Numerical results are finally presented to verify our analysis. © 2011 IEEE.

  15. Cosmological constraints from the convergence 1-point probability distribution

    OpenAIRE

    Patton, Kenneth; Blazek, Jonathan; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J.; Suchyta, Eric

    2016-01-01

    We examine the cosmological information available from the 1-point probability distribution (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the $\Omega_m$-$\sigma_8$ plane from the convergence PDF with $188\,\mathrm{arcmin}^2$ pixels compared to the cosmic shear power spectrum with an equivalent number of modes ($\ell < 886$). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that...

  16. Testing for the maximum cell probabilities in multinomial distributions

    Institute of Scientific and Technical Information of China (English)

    XIONG Shifeng; LI Guoying

    2005-01-01

    This paper investigates one-sided hypothesis testing for p[1], the largest cell probability of a multinomial distribution. A small-sample test of Ethier (1982) is extended to the general case. Based on an estimator of p[1], a class of large-sample tests is proposed. The asymptotic power of the above tests under local alternatives is derived. An example is presented at the end of this paper.

  17. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  18. Steady-state distributions of probability fluxes on complex networks

    Science.gov (United States)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. An additional transition, called hereafter a gate and powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. Also, other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and a change in the network size, studied by means of computer simulations, are widely discussed in terms of the rigorous theoretical predictions.

  19. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    WANG Qingyuan (王清远); N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e., pit nucleation and growth, pit-crack transition, and short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.

  20. The Probability Distribution Model of Wind Speed over East Malaysia

    Directory of Open Access Journals (Sweden)

    Nurulkamal Masseran

    2013-07-01

    Many studies have found that wind speed is the most significant parameter of wind power. Thus, an accurate determination of the probability distribution of wind speed is an important step before estimating the wind energy potential over a particular region. Utilizing an accurate distribution will minimize the uncertainty in wind resource estimates and improve the site assessment phase of planning. In general, different regions have different wind regimes. Hence, it is reasonable that different wind distributions will be found for different regions. Because it is reasonable to consider that wind regimes vary according to the region of a particular country, nine different statistical distributions have been fitted to the mean hourly wind speed data from 20 wind stations in East Malaysia, for the period from 2000 to 2009. The values of the Kolmogorov-Smirnov statistic, Akaike's Information Criterion, the Bayesian Information Criterion and the R2 correlation coefficient were compared across the distributions to determine the best fit for describing the observed data. A good fit for most of the stations in East Malaysia was found using the Gamma and Burr distributions, though no clear pattern was observed for all regions in East Malaysia. However, the Gamma distribution was a clear fit to the data from all stations in southern Sabah.

  1. Galactic Subsystems on the Basis of Cumulative Distribution of Space Velocities

    Directory of Open Access Journals (Sweden)

    Vidojević, S.

    2008-12-01

    A sample containing 4,614 stars with available space velocities and high-quality kinematical data from the Arihip Catalogue is formed. For the purpose of distinguishing galactic subsystems, the cumulative distribution of space velocities is studied. The fractions of the three subsystems are found to be: thin disc 92%, thick disc 6% and halo 2%. These results are verified by analysing the elements of velocity ellipsoids and the shape and size of the galactocentric orbits of the sample stars, i.e. the planar and vertical eccentricities of the orbits.

  2. Research on probability distribution of port cargo throughput

    Institute of Scientific and Technical Information of China (English)

    SUN Liang; TAN De-rong

    2008-01-01

    In order to more accurately examine developing trends in gross cargo throughput, we have modeled the probability distribution of cargo throughput. Gross cargo throughput is determined by the time spent by cargo ships in the port and the operating efficiency of the handling equipment; it is the sum of the compound variables determining the cargo throughput of every cargo ship arriving at the port. The probability distribution was determined using the Wald equation. The results show that the variability of gross cargo throughput primarily depends on the different times required by different cargo ships arriving at the port. This model overcomes the shortcoming of previous models: the inability to accurately determine the probability of a specific value of future gross cargo throughput. Our proposed model of cargo throughput depends on the relationship between the time required by a cargo ship arriving at the port and the operational capacity of the handling equipment at the port. At the same time, key factors affecting gross cargo throughput are analyzed. In order to test the efficiency of the model, the cargo volume of a port in Shandong Province was used as an example; in the case study the actual results matched our theoretical analysis.
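
    The structure described, a sum over a random number of per-ship contributions, is a compound (random-sum) distribution, and the Wald equation gives E[S] = E[N] E[X] for its mean. A sketch with hypothetical arrival and tonnage parameters:

        import numpy as np

        rng = np.random.default_rng(8)
        days = 20_000

        # Hypothetical model: Poisson ship arrivals, gamma-distributed tonnage per ship.
        n_ships = rng.poisson(12.0, size=days)
        daily = np.array([rng.gamma(3.0, 500.0, size=k).sum() for k in n_ships])

        print("simulated mean daily throughput:", daily.mean())
        print("Wald equation E[N]*E[X]:        ", 12.0 * 3.0 * 500.0)
        print("99th percentile:                ", np.quantile(daily, 0.99))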

  3. Methods for fitting a parametric probability distribution to most probable number data.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2012-07-01

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two

  4. CDF-XL: computing cumulative distribution functions of reaction time data in Excel.

    Science.gov (United States)

    Houghton, George; Grange, James A

    2011-12-01

    In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
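
    The first of the two analyses, participant-level percentiles averaged by condition, is straightforward to express outside Excel as well; a minimal sketch with hypothetical RT data in the same Subject/Condition/RT layout:

        import numpy as np

        rng = np.random.default_rng(9)
        # Hypothetical raw data: (subject, condition, RT in ms).
        data = [(s, c, rng.gamma(4.0, 80.0 + 20.0 * c))
                for s in range(10) for c in (0, 1) for _ in range(100)]

        percentiles = [10, 30, 50, 70, 90]
        for cond in (0, 1):
            per_subject = [np.percentile([rt for s2, c2, rt in data
                                          if s2 == s and c2 == cond], percentiles)
                           for s in range(10)]
            # Grand-mean CDF points by condition, as in the "standard" CDF analysis.
            print(cond, np.mean(per_subject, axis=0).round(1))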

  5. Phase diagram of epidemic spreading - unimodal vs. bimodal probability distributions

    CERN Document Server

    Lancic, Alen; Sikic, Mile; Stefancic, Hrvoje

    2009-01-01

    Disease spreading on complex networks is studied within the SIR model. Simulations on empirical complex networks reveal two specific regimes of disease spreading: local containment and epidemic outbreak. The variables measuring the extent of disease spreading are in general characterized by a bimodal probability distribution. Phase diagrams of disease spreading for empirical complex networks are introduced. A theoretical model of disease spreading on an m-ary tree is investigated both analytically and in simulations. It is shown that the model reproduces qualitative features of the phase diagrams of disease spreading observed in empirical complex networks. The role of the tree-like structure of complex networks in disease spreading is discussed.

  6. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  7. Net-proton probability distribution in heavy ion collisions

    CERN Document Server

    Braun-Munzinger, P; Karsch, F; Redlich, K; Skokov, V

    2011-01-01

    We compute net-proton probability distributions in heavy ion collisions within the hadron resonance gas model. The model results are compared with data taken by the STAR Collaboration in Au-Au collisions at sqrt(s_{NN})= 200 GeV for different centralities. We show that in peripheral Au-Au collisions the measured distributions, and the resulting first four moments of net-proton fluctuations, are consistent with results obtained from the hadron resonance gas model. However, data taken in central Au-Au collisions differ from the predictions of the model. The observed deviations cannot be attributed to uncertainties in model parameters. We discuss possible interpretations of the observed deviations.

  8. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given $L_p$ norm (i.e., a given $p$th absolute moment when $p$ is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the $L_p$ norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight-line relationship between the maximum differential entropy and the logarithm of the $L_p$ norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of $p$. Such understanding is useful in evaluating the performance of data compression schemes.
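
    For the unconstrained continuous case, the maximizing density is the generalized normal $f(x) \propto \exp(-|x/\alpha|^p)$. The snippet below is a hypothetical numerical check of that statement with scipy (the values of $p$ and the target moment are arbitrary): a Gaussian matched to the same $p$th absolute moment never exceeds the generalized normal in differential entropy, and rescaling the norm shifts the maximum entropy by exactly the logarithm of the scale factor.

```python
import numpy as np
from scipy.stats import gennorm, norm
from scipy.special import gamma

p = 3.0          # which absolute moment (L_p norm) to constrain
m = 2.0          # target value of E|X|^p

# Maximum-entropy density for fixed E|X|^p is the generalized normal
# f(x) ~ exp(-|x/alpha|^p); with scale alpha, E|X|^p = alpha**p / p.
alpha = (p * m) ** (1.0 / p)
h_maxent = gennorm(beta=p, scale=alpha).entropy()

# A Gaussian matched to the same E|X|^p for comparison:
# for N(0, sigma^2), E|X|^p = sigma**p * 2**(p/2) * gamma((p+1)/2) / sqrt(pi).
sigma = (m * np.sqrt(np.pi) / (2 ** (p / 2) * gamma((p + 1) / 2))) ** (1.0 / p)
h_gauss = norm(scale=sigma).entropy()

print(f"maxent (generalized normal) entropy: {h_maxent:.4f}")
print(f"matched Gaussian entropy:            {h_gauss:.4f}  (must not exceed maxent)")

# Straight-line relation between max entropy and log L_p norm: doubling the
# scale adds exactly log(2) to the differential entropy.
print(gennorm(beta=p, scale=2 * alpha).entropy() - h_maxent, np.log(2))
```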

  9. Cumulative distribution functions associated with bubble-nucleation processes in cavitation

    KAUST Repository

    Watanabe, Hiroshi

    2010-11-15

    Bubble-nucleation processes of a Lennard-Jones liquid are studied by molecular dynamics simulations. Waiting time, which is the lifetime of a superheated liquid, is determined for several system sizes, and the apparent finite-size effect of the nucleation rate is observed. From the cumulative distribution function of the nucleation events, the bubble-nucleation process is found to be not a simple Poisson process but a Poisson process with an additional relaxation time. The parameters of the exponential distribution associated with the process are determined by taking the relaxation time into account, and the apparent finite-size effect is removed. These results imply that the use of the arithmetic mean of the waiting time until a bubble grows to the critical size leads to an incorrect estimation of the nucleation rate. © 2010 The American Physical Society.
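
    The shifted-exponential waiting-time law described above is easy to recover from data. This sketch (synthetic waiting times; the rate and relaxation time are invented) fits $F(t) = 1 - \exp(-(t - t_0)/\tau)$ to the empirical CDF and shows how the arithmetic mean overestimates the characteristic waiting time when the relaxation time $t_0$ is ignored.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Synthetic waiting times: a Poisson process (rate 1/tau) delayed by a
# relaxation time t0, mimicking the shifted-exponential form reported above.
tau_true, t0_true = 50.0, 10.0
waits = np.sort(t0_true + rng.exponential(tau_true, size=2000))
ecdf = np.arange(1, waits.size + 1) / waits.size

def shifted_exp_cdf(t, tau, t0):
    return np.where(t < t0, 0.0, 1.0 - np.exp(-(t - t0) / tau))

(tau_fit, t0_fit), _ = curve_fit(shifted_exp_cdf, waits, ecdf, p0=[waits.mean(), 0.0])

print(f"fitted tau = {tau_fit:.1f}, relaxation time t0 = {t0_fit:.1f}")
# The naive estimate ignores t0 and is biased upward by it:
print(f"arithmetic mean of waits = {waits.mean():.1f}")
```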

  10. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides triggered by rain, commonly known as "soil-slips", which are characterized by a shallow failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in suction when a humid front enters, as a consequence of the infiltration initiated by rain and governed by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdf for mean intensity and duration of the storms. The Philip infiltration model

  11. Probability Distribution and Projected Trends of Daily Precipitation in China

    Institute of Scientific and Technical Information of China (English)

    CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER

    2013-01-01

    Based on observed daily precipitation data of 540 stations and 3,839 gridded data from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the simulation ability of CCLM on daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of daily precipitation series and daily precipitation extremes are analyzed. Results show that except for the western Qinghai-Tibetan Plateau and South China, distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increase of maximum daily rainfall and longest non-precipitation period during the flood season in the aforementioned regions also shows increasing trends of droughts and floods in the next 40 years.

  12. EVALUATION OF THE PROBABILITY DISTRIBUTION OF PITTING CORROSION FATIGUE LIFE IN AIRCRAFT MATERIALS

    Institute of Scientific and Technical Information of China (English)

    王清远; N.KAWAGOISHI; Q.CHEN; R.M.PIDAPARTI

    2003-01-01

    Corrosion and fatigue properties of aircraft materials are known to have a considerable scatter due to the random nature of materials, loading, and environmental conditions. A probabilistic approach for predicting the pitting corrosion fatigue life has been investigated which captures the effect of the interaction of the cyclic load and corrosive environment and all stages of the corrosion fatigue process (i.e. the pit nucleation and growth, pit-crack transition, short- and long-crack propagation). The probabilistic model investigated considers the uncertainties in the initial pit size, corrosion pitting current, and material properties due to the scatter found in the experimental data. Monte Carlo simulations were performed to define the failure probability distribution. Predicted cumulative distribution functions of fatigue life agreed reasonably well with the existing experimental data.

  13. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    Science.gov (United States)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question, what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly

  14. Non-Gaussian probability distributions of solar wind fluctuations

    Directory of Open Access Journals (Sweden)

    E. Marsch

    Full Text Available The probability distributions of field differences ∆x(τ) = x(t+τ) − x(t), where the variable x(t) may denote any solar wind scalar field or vector field component at time t, have been calculated from time series of Helios data obtained in 1976 at heliocentric distances near 0.3 AU. It is found that for comparatively long time lags τ, ranging from a few hours to 1 day, the differences are normally distributed according to a Gaussian. For shorter time lags, of less than ten minutes, significant changes in shape are observed. The distributions are often spikier and narrower than the equivalent Gaussian distribution with the same standard deviation, and they are enhanced for large, reduced for intermediate and enhanced for very small values of ∆x. This result is in accordance with fluid observations and numerical simulations. Hence statistical properties are dominated at small scale τ by large fluctuation amplitudes that are sparsely distributed, which is direct evidence for spatial intermittency of the fluctuations. This is in agreement with results from earlier analyses of the structure functions of ∆x. The non-Gaussian features are differently developed for the various types of fluctuations. The relevance of these observations to the interpretation and understanding of the nature of solar wind magnetohydrodynamic (MHD) turbulence is pointed out, and contact is made with existing theoretical concepts of intermittency in fluid turbulence.

  15. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational procedure for hyperspectral imagery (HSI) is extremely complex, not only due to the high-dimensional information, but also due to the highly correlated data structure. The need for effective processing and analysis of HSI has therefore met many difficulties. Dimensionality reduction has been shown to be a powerful tool for high-dimensional data analysis, and local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low-dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  16. Some Useful Distributions and Probabilities for Cellular Networks

    CERN Document Server

    Yu, Seung Min

    2011-01-01

    The cellular network is one of the most useful networks for wireless communications and now universally used. There have been a lot of analytic results about the performance of the mobile user at a specific location such as the cell center or edge. On the other hand, there have been few analytic results about the performance of the mobile user at an arbitrary location. Moreover, to the best of our knowledge, there is no analytic result on the performance of the mobile user at an arbitrary location considering the mobile user density. In this paper, we use the stochastic geometry approach and derive useful distributions and probabilities for cellular networks. Using those, we analyze the performance of the mobile user, e.g., outage probability at an arbitrary location considering the mobile user density. Under some assumptions, those can be expressed by closed form formulas. Our analytic results will provide a fundamental framework for the performance analysis of cellular networks, which will significantly red...

  17. Characterizing the Lyman-$\alpha$ forest flux probability distribution function using Legendre polynomials

    CERN Document Server

    Cieplak, Agnieszka M

    2016-01-01

    The Lyman-$\\alpha$ forest is a highly non-linear field with a lot of information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polyonomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, $n$-th coefficient can be expressed as a linear combination of the first $n$ moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities.

  18. Cosmological constraints from the convergence 1-point probability distribution

    CERN Document Server

    Patton, Kenneth; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J; Suchyta, Eric

    2016-01-01

    We examine the cosmological information available from the 1-point probability distribution (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the $\\Omega_m$-$\\sigma_8$ plane from the convergence PDF with $188\\ arcmin^2$ pixels compared to the cosmic shear power spectrum with an equivalent number of modes ($\\ell < 886$). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of $2-3$, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  19. A probability distribution approach to synthetic turbulence time series

    Science.gov (United States)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach to generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
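
    The paper measures three-point conditional PDFs; as a loose one-point illustration of the same idea, this sketch (a surrogate AR(1) "velocity" series, with an invented grid and parameters) estimates $p(u_{t+1}\mid u_t)$ from a histogram and synthesizes a new series from it by rejection sampling.

```python
import numpy as np

rng = np.random.default_rng(3)

# Reference series standing in for hot-wire data (an AR(1) process here).
n = 200_000
u = np.zeros(n)
for i in range(1, n):
    u[i] = 0.95 * u[i - 1] + rng.normal(scale=0.3)

# Estimate the one-point conditional PDF p(u_next | u_prev) on a coarse grid.
bins = np.linspace(u.min(), u.max(), 41)
joint, _, _ = np.histogram2d(u[:-1], u[1:], bins=[bins, bins], density=True)
cond = joint / np.maximum(joint.sum(axis=1, keepdims=True), 1e-12)  # rows sum to ~1
centers = 0.5 * (bins[:-1] + bins[1:])
width = bins[1] - bins[0]

def draw_next(u_prev):
    """Rejection-sample u_next from the estimated conditional PDF."""
    row = cond[np.clip(np.digitize(u_prev, bins) - 1, 0, len(centers) - 1)]
    ceiling = row.max()
    while True:
        j = rng.integers(len(centers))
        if rng.uniform(0, ceiling) < row[j]:
            return centers[j] + rng.uniform(-0.5, 0.5) * width

synthetic = np.empty(10_000)
synthetic[0] = 0.0
for i in range(1, synthetic.size):
    synthetic[i] = draw_next(synthetic[i - 1])

print("reference variance:", u.var().round(3))
print("synthetic variance:", synthetic.var().round(3))
```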

  20. Probability distributions for one component equations with multiplicative noise

    CERN Document Server

    Deutsch, J M

    1993-01-01

    Abstract: Systems described by equations involving both multiplicative and additive noise are common in nature. Examples include convection of a passive scalar field, polymers in turbulent flow, and noise in dye lasers. In this paper the one component version of this problem is studied. The steady state probability distribution is classified into two different types of behavior. One class has power law tails and the other is of the form of an exponential to a power law. The value of the power law exponent is determined analytically for models having colored Gaussian noise. It is found to only depend on the power spectrum of the noise at zero frequency. When non-Gaussian noise is considered it is shown that stretched exponential tails are possible. An intuitive understanding of the results is found and makes use of the Lyapunov exponents for these systems.

  1. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    Science.gov (United States)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures serve as a non-verbal medium by which humans can communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified using a fuzzy technique. Experimental results show that features extracted from arm trajectories work effectively for the recognition of dynamic human gestures and give good performance in classifying various gesture patterns.

  2. Seismic pulse propagation with constant Q and stable probability distributions

    Directory of Open Access Journals (Sweden)

    M. Tomirotti

    1997-06-01

    Full Text Available The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with an index of stability determined by the order of the fractional time derivative in the evolution equation.

  3. Seismic pulse propagation with constant Q and stable probability distributions

    CERN Document Server

    Mainardi, Francesco

    2010-01-01

    The one-dimensional propagation of seismic waves with constant Q is shown to be governed by an evolution equation of fractional order in time, which interpolates the heat equation and the wave equation. The fundamental solutions for the Cauchy and Signalling problems are expressed in terms of entire functions (of Wright type) in the similarity variable and their behaviours turn out to be intermediate between those for the limiting cases of a perfectly viscous fluid and a perfectly elastic solid. In view of the small dissipation exhibited by the seismic pulses, the nearly elastic limit is considered. Furthermore, the fundamental solutions for the Cauchy and Signalling problems are shown to be related to stable probability distributions with index of stability determined by the order of the fractional time derivative in the evolution equation.

  4. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps"; it indicates a strong likelihood, usually an affirmative inference or judgment based on the present situation.

  5. Multifractality of stock markets based on cumulative distribution function and multiscale multifractal analysis

    Science.gov (United States)

    Lin, Aijing; Shang, Pengjian

    2016-04-01

    Considering the diverse application of multifractal techniques in natural scientific disciplines, this work underscores the versatility of the multiscale multifractal detrended fluctuation analysis (MMA) method to investigate artificial and real-world data sets. The modified MMA method based on the cumulative distribution function is proposed with the objective of quantifying the scaling exponent and multifractality of nonstationary time series. It is demonstrated that our approach can provide a more stable and faithful description of multifractal properties over a comprehensive range of scales, rather than at a fixed window length and slide length. Our analyses based on the CDF-MMA method reveal significant differences in the multifractal characteristics of the temporal dynamics between US and Chinese stock markets, suggesting that these two stock markets might be regulated by very different mechanisms. The CDF-MMA method is important for evidencing the stable and fine structure of multiscale and multifractal scaling behaviors and can be useful to deepen and broaden our understanding of scaling exponents and multifractal characteristics.

  6. Insights from probability distribution functions of intensity maps

    CERN Document Server

    Breysse, Patrick C; Behroozi, Peter S; Dai, Liang; Kamionkowski, Marc

    2016-01-01

    In the next few years, intensity-mapping surveys that target lines such as CO, Ly$\\alpha$, and CII stand to provide powerful probes of high-redshift astrophysics. However, these line emissions are highly non-Gaussian, and so the typical power-spectrum methods used to study these maps will leave out a significant amount of information. We propose a new statistic, the probability distribution of voxel intensities, which can access this extra information. Using a model of a CO intensity map at $z\\sim3$ as an example, we demonstrate that this voxel intensity distribution (VID) provides substantial constraining power beyond what is obtainable from the power spectrum alone. We find that a future survey similar to the planned COMAP Full experiment could constrain the CO luminosity function to order $\\sim10\\%$. We also explore the effects of contamination from continuum emission, interloper lines, and gravitational lensing on our constraints and find that the VID statistic retains significant constraining power even ...

  7. A Monte Carlo procedure for the construction of complementary cumulative distribution functions for comparison with the EPA release limits for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Shiver, A.W.

    1994-10-01

    A Monte Carlo procedure for the construction of complementary cumulative distribution functions (CCDFs) for comparison with the US Environmental Protection Agency (EPA) release limits for radioactive waste disposal (40 CFR 191, Subpart B) is described and illustrated with results from a recent performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The Monte Carlo procedure produces CCDF estimates similar to those obtained with stratified sampling in several recent PAs for the WIPP. The advantages of the Monte Carlo procedure over stratified sampling include increased resolution in the calculation of probabilities for complex scenarios involving drilling intrusions and better use of the necessarily limited number of mechanistic calculations that underlie CCDF construction.
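
    A stripped-down version of the CCDF construction (with a lognormal stand-in for the mechanistically calculated normalized releases $R$; the two-point check corresponds to the 40 CFR 191 containment requirements):

```python
import numpy as np

rng = np.random.default_rng(4)

# Monte Carlo sketch: EPA-normalized cumulative releases R from sampled futures.
# A lognormal stand-in replaces the output of the mechanistic calculations.
R = rng.lognormal(mean=-4.0, sigma=2.0, size=100_000)

# Complementary cumulative distribution function P(release > r), on a grid
# suitable for the usual log-log CCDF plot.
r_grid = np.logspace(-6, 3, 200)
ccdf = (R[:, None] > r_grid).mean(axis=0)

# 40 CFR 191 compliance is checked against two points on the CCDF:
# P(R > 1) < 0.1 and P(R > 10) < 0.001.
print(f"P(R > 1)  = {(R > 1).mean():.4f}  (limit 0.1)")
print(f"P(R > 10) = {(R > 10).mean():.5f} (limit 0.001)")
```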

  8. Simulations of the Hadamard Variance: Probability Distributions and Confidence Intervals.

    Science.gov (United States)

    Ashby, Neil; Patla, Bijunath

    2016-04-01

    Power-law noise in clocks and oscillators can be simulated by Fourier transforming a modified spectrum of white phase noise. This approach has been applied successfully to simulation of the Allan variance and the modified Allan variance in both overlapping and nonoverlapping forms. When significant frequency drift is present in an oscillator, at large sampling times the Allan variance overestimates the intrinsic noise, while the Hadamard variance is insensitive to frequency drift. The simulation method is extended in this paper to predict the Hadamard variance for the common types of power-law noise. Symmetric real matrices are introduced whose traces, the sums of their eigenvalues, are equal to the Hadamard variances, in overlapping or nonoverlapping forms, as well as for the corresponding forms of the modified Hadamard variance. We show that the standard relations between spectral densities and Hadamard variance are obtained with this method. The matrix eigenvalues determine probability distributions for observing a variance at an arbitrary value of the sampling interval τ, and hence for estimating confidence in the measurements.
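
    A hypothetical rendition of the simulation pipeline described above, with assumed parameters: power-law noise generated by spectrally shaping white noise, followed by the overlapping Hadamard variance computed from phase data. The quadratic-phase term added at the end illustrates the insensitivity to linear frequency drift (the third difference of phase cancels a quadratic exactly).

```python
import numpy as np

rng = np.random.default_rng(5)

def power_law_noise(n, alpha, tau0=1.0):
    """Phase noise x(t) with S_y(f) ~ f**alpha, via spectral shaping of
    white noise (the Fourier-filtering approach the abstract describes)."""
    f = np.fft.rfftfreq(n, d=tau0)
    f[0] = f[1]  # avoid division by zero at DC
    # S_x(f) ~ S_y(f) / f^2  =>  amplitude ~ f**((alpha - 2) / 2)
    spectrum = (rng.normal(size=f.size) + 1j * rng.normal(size=f.size)) \
               * f ** ((alpha - 2) / 2)
    return np.fft.irfft(spectrum, n)

def hadamard_variance(x, m, tau0=1.0):
    """Overlapping Hadamard variance at tau = m*tau0 from phase data x."""
    d = x[3 * m:] - 3 * x[2 * m:-m] + 3 * x[m:-2 * m] - x[:-3 * m]
    return (d ** 2).mean() / (6 * (m * tau0) ** 2)

x = power_law_noise(2 ** 16, alpha=-2)       # random-walk frequency noise
t = np.arange(x.size)
x_drift = x + 0.5e-6 * t ** 2                # add a linear frequency drift

for m in (1, 4, 16, 64):
    print(m, hadamard_variance(x, m), hadamard_variance(x_drift, m))
```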

  9. The cumulative overlap distribution function in spin glasses: mean field vs. three dimensions

    Science.gov (United States)

    Yllanes, David; Billoire, Alain; Maiorano, Andrea; Marinari, Enzo; Martin-Mayor, Victor

    2015-03-01

    We use a sample-dependent analysis, based on medians and quantiles, to analyze the behavior of the overlap probability distribution in spin glasses. Using analytical and numerical mean-field results for the Sherrington-Kirkpatrick model, as well as data from toy models, we show that this approach is an effective tool to distinguish the low-temperature behavior of replica symmetry breaking systems from that expected in the droplet picture. An application of the method to the three-dimensional Edwards-Anderson models shows agreement with the replica symmetry breaking predictions. Supported by ERC Grant No. 247328 and from MINECO (Spain), Contract No. FIS2012-35719-C02.

  10. Energy probability distribution zeros: A route to study phase transitions

    Science.gov (United States)

    Costa, B. V.; Mól, L. A. S.; Rocha, J. C. S.

    2017-07-01

    In the study of phase transitions, very few models are accessible to exact solution. In most cases analytical simplifications have to be made, or numerical techniques have to be used, to gain insight into their critical properties. Numerically, the most common approaches are those based on Monte Carlo simulations together with finite-size scaling analysis. The use of Monte Carlo techniques requires the estimation of quantities like the specific heat or susceptibilities in a wide range of temperatures, or the construction of the density of states in large intervals of energy. Although many of these techniques are well developed, they may be very time consuming when the system size becomes large enough. It would be desirable to have a method that can overcome these difficulties. In this work we present an iterative method to study the critical behavior of a system based on partial knowledge of the set of complex Fisher zeros of the partition function. The method is general, with advantages over most conventional techniques, since it does not need to identify any order parameter a priori. The critical temperature and exponents can be obtained with great precision even in the most unamenable cases, like the two-dimensional XY model. To test the method and to show how it works, we applied it to some selected models where the transitions are well known: the 2D Ising, Potts and XY models, and a homopolymer system. Our choices cover systems with first-order, continuous and Berezinskii-Kosterlitz-Thouless transitions, as well as the homopolymer, which has two pseudo-transitions. The strategy can easily be adapted to any model, classical or quantum, once we are able to build the corresponding energy probability distribution.

  11. Cumulative luminosity distributions of Supergiant Fast X-ray Transients in hard X-rays

    CERN Document Server

    Paizis, A

    2014-01-01

    We have analyzed in a systematic way about nine years of INTEGRAL data (17-100 keV), focusing on Supergiant Fast X-ray Transients (SFXTs) and three classical High Mass X-ray Binaries (HMXBs). Our approach has been twofold: an image-based analysis, sampled over a ~ks time frame to investigate the long-term properties of the sources, and a lightcurve-based analysis, sampled over a 100s time frame to capture the fast variability of each source during its ~ks activity. We find that while the prototypical SFXTs (IGR J17544-2619, XTE J1739-302 and SAX J1818.6-1703) are among the sources with the lowest ~ks based duty cycle ($<$1% activity over nine years of data), when studied at the 100s level they are the ones with the highest detection percentage, meaning that, when active, they tend to have many bright short-term flares with respect to the other SFXTs. To investigate in a coherent and self-consistent way all the available results within a physical scenario, we have extracted cumulative luminosity distributions for ...

  12. Performance Probability Distributions for Sediment Control Best Management Practices

    Science.gov (United States)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site performance expectations are not always achievable in the field. Based on experimental results, product performance is shown to be highly variable. Experiments using the same installation procedures show inconsistent sediment removal performances ranging from >85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no-treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with

  13. Size effect on strength and lifetime probability distributions of quasibrittle structures

    Indian Academy of Sciences (India)

    Zdeněk P Bažant; Jia-Liang Le

    2012-02-01

    Engineering structures such as aircraft, bridges, dams, nuclear containments and ships, as well as computer circuits, chips and MEMS, should be designed for failure probability < $10^{-6}-10^{-7}$ per lifetime. The safety factors required to ensure it are still determined empirically, even though they represent much larger and much more uncertain corrections to deterministic calculations than do the typical errors of modern computer analysis of structures. The empirical approach is sufficient for perfectly brittle and perfectly ductile structures since the cumulative distribution function (cdf) of random strength is known, making it possible to extrapolate to the tail from the mean and variance. However, the empirical approach does not apply to structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared to structure size. This paper presents a refined theory on the strength distribution of quasibrittle structures, which is based on the fracture mechanics of nanocracks propagating by activation energy controlled small jumps through the atomic lattice and an analytical model for the multi-scale transition of strength statistics. Based on the power law for creep crack growth rate and the cdf of material strength, the lifetime distribution of quasibrittle structures under constant load is derived. Both the strength and lifetime cdfs are shown to be size- and geometry-dependent. The theory predicts intricate size effects on both the mean structural strength and lifetime, the latter being much stronger. The theory is shown to match the experimentally observed systematic deviations of strength and lifetime histograms of industrial ceramics from the Weibull distribution.

  14. The Probability Distribution of Inter-car Spacings

    Science.gov (United States)

    Xian, Jin Guo; Han, Dong

    In this paper, the cellular automaton model with the Fukui-Ishibashi-type acceleration rule is used to study the inter-car spacing distribution for traffic flow. The method used in complex network analysis is applied to study the spacing distribution. By theoretical analysis, we obtain the result that the distribution of inter-car spacings follows a power law when vehicle density is low and spacing is not large, while, when the vehicle density is high or the spacing is large, the distribution can be described by an exponential distribution. Moreover, the numerical simulations support the theoretical result.

  15. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    Science.gov (United States)

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  17. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Currently, three methods to calculate the complexity of a station exist: 1. complexity of a station based on the track layout; 2. complexity of a station based on the probability of a conflict using a plan of operation; 3. complexity of a station based on the plan of operation and the minimum headway times. However, none of the above methods take a given timetable into account when the complexity of the station is calculated. E.g., if two timetable candidates are given following the same plan of operation in a station, one will be more vulnerable to delays (less robust) while the other will be less vulnerable (more robust), but this cannot be measured by the above methods. In the light of this, the article describes a new method where the complexity of a given station with a given timetable can be calculated based on a probability distribution, making it possible to reduce delays caused by interdependencies and result in a more robust operation.

  18. Improving environmental exposure analysis using cumulative distribution functions and individual geocoding

    Directory of Open Access Journals (Sweden)

    Chakraborty Jayajit

    2006-05-01

    Full Text Available Abstract Background Assessments of environmental exposure and health risks that utilize Geographic Information Systems (GIS) often make simplifying assumptions when using: (a) one or more discrete buffer distances to define the spatial extent of impacted regions, and (b) aggregated demographic data at the level of census enumeration units to derive the characteristics of the potentially exposed population. A case study of school children in Orange County, Florida, is used to demonstrate how these limitations can be overcome by the application of cumulative distribution functions (CDFs) and individual geocoded locations. Exposure potential for 159,923 school children was determined at the children's home residences and at school locations by determining the distance to the nearest gasoline station, stationary air pollution source, and industrial facility listed in the Toxic Release Inventory (TRI). Errors and biases introduced by the use of discrete buffer distances and data aggregation were examined. Results The use of discrete buffer distances in proximity-based exposure analysis introduced substantial bias in terms of determining the potentially exposed population, and the results are strongly dependent on the choice of buffer distance(s). Comparisons of exposure potential between home and school locations indicated that different buffer distances yield different results and contradictory conclusions. The use of a CDF provided a much more meaningful representation and is not based on the a-priori assumption that any particular distance is more relevant than another. The use of individual geocoded locations also provided a more accurate characterization of the exposed population and allowed for more reliable comparisons among sub-groups. In the comparison of children's home residences and school locations, the use of data aggregated at the census block group and tract level introduced variability as well as bias, leading to incorrect conclusions as

  19. Interacting discrete Markov processes with power-law probability distributions

    Science.gov (United States)

    Ridley, Kevin D.; Jakeman, Eric

    2017-09-01

    During recent years there has been growing interest in the occurrence of long-tailed distributions, also known as heavy-tailed or fat-tailed distributions, which can exhibit power-law behaviour and often characterise physical systems that undergo very large fluctuations. In this paper we show that the interaction between two discrete Markov processes naturally generates a time-series characterised by such a distribution. This possibility is first demonstrated by numerical simulation and then confirmed by a mathematical analysis that enables the parameter range over which the power-law occurs to be quantified. The results are supported by comparison of numerical results with theoretical predictions and general conclusions are drawn regarding mechanisms that can cause this behaviour.

  20. Martingale Couplings and Bounds on the Tails of Probability Distributions

    CERN Document Server

    Luh, Kyle J

    2011-01-01

    Hoeffding has shown that tail bounds on the distribution for sampling from a finite population with replacement also apply to the corresponding cases of sampling without replacement. (A special case of this result is that binomial tail bounds apply to the corresponding hypergeometric tails.) We give a new proof of Hoeffding's result by constructing a martingale coupling between the sampling distributions. This construction is given by an explicit combinatorial procedure involving balls and urns. We then apply this construction to create martingale couplings between other pairs of sampling distributions, both without replacement and with "surreplacement" (that is, sampling in which not only is the sampled individual replaced, but some number of "copies" of that individual are added to the population).
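
    The special case noted in parentheses is easy to inspect numerically with scipy (the population parameters below are arbitrary); for these settings the hypergeometric upper tails sit below their binomial counterparts, consistent with Hoeffding's result.

```python
from scipy.stats import binom, hypergeom

# Sampling n items from a population of N containing K "successes":
# without replacement -> hypergeometric; with replacement -> binomial.
N, K, n = 1000, 300, 50
p = K / N

for k in (20, 25, 30):
    tail_hg = hypergeom(N, K, n).sf(k - 1)   # P(X >= k), without replacement
    tail_bn = binom(n, p).sf(k - 1)          # P(X >= k), with replacement
    print(f"k={k}: hypergeom {tail_hg:.3e}  vs  binomial {tail_bn:.3e}")
```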

  1. Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review

    Science.gov (United States)

    2007-01-01

    DeGroot, Morris H., Optimal Statistical Decisions, New York: McGraw-Hill, 1970. Dewar, James A., Assumption-Based Planning: A Tool for Reducing...formal decision-analysis point of view. See DeGroot (1970) for a clear exposition of utility in decision analysis. 2 For the triangle distribution, the

  2. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by assembling DNA from fragments (reads), locating a gene in this sequence and translating the gene to a protein. Sampling using this program generates random instances of the puzzle, but it is possible to constrain the difficulty and to customize the secret protein word. Because of these constraints and the randomness of the generation process, sampling may fail to generate a satisfactory puzzle. To avoid failure we employ a strategy using adaptive probabilities which change in response to previous steps of the generative process, thus minimizing the risk of failure.

  3. Extreme Points of the Convex Set of Joint Probability Distributions with Fixed Marginals

    Indian Academy of Sciences (India)

    K R Parthasarathy

    2007-11-01

    By using a quantum probabilistic approach we obtain a description of the extreme points of the convex set of all joint probability distributions on the product of two standard Borel spaces with fixed marginal distributions.

  4. Probability distribution analysis of observational extreme events and model evaluation

    Science.gov (United States)

    Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.

    2016-12-01

    Earth's surface temperatures in 2015 were the warmest since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016 and brought the first snowfall to Guangzhou, the capital city of Guangdong province, in 67 years. To understand the changes in extreme weather events as well as to project their future scenarios, this study uses statistical models to analyze multiple climate datasets. We first use the Granger-causality test to identify the attribution of the global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated for global climate observational, reanalysis (1961-2010) and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration can provide information on the time series of mean and extreme temperature, but not vice versa. Based on our new tail index, we find that, beyond the mean and variance, skewness is an important indicator that should be considered in estimating extreme temperature changes and in model evaluation. Among the 12 climate models we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index we introduce, which indicates that the model has a substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.

  5. Batch Mode Active Sampling based on Marginal Probability Distribution Matching.

    Science.gov (United States)

    Chattopadhyay, Rita; Wang, Zheng; Fan, Wei; Davidson, Ian; Panchanathan, Sethuraman; Ye, Jieping

    2012-01-01

    Active Learning is a machine learning and data mining technique that selects the most informative samples for labeling and uses them as training data; it is especially useful when there are large amounts of unlabeled data and labeling them is expensive. Recently, batch-mode active learning, where a set of samples are selected concurrently for labeling based on their collective merit, has attracted a lot of attention. The objective of batch-mode active learning is to select a set of informative samples so that a classifier learned on these samples has good generalization performance on the unlabeled data. Most of the existing batch-mode active learning methodologies try to achieve this by selecting samples based on varied criteria. In this paper we propose a novel criterion which achieves good generalization performance of a classifier by specifically selecting a set of query samples that minimizes the difference in distribution between the labeled and the unlabeled data, after annotation. We explicitly measure this difference based on all candidate subsets of the unlabeled data and select the best subset. The proposed objective is an NP-hard integer programming optimization problem. We provide two optimization techniques to solve this problem. In the first one, the problem is transformed into a convex quadratic programming problem and in the second method the problem is transformed into a linear programming problem. Our empirical studies using publicly available UCI datasets and a biomedical image dataset demonstrate the effectiveness of the proposed approach in comparison with the state-of-the-art batch-mode active learning methods. We also present two extensions of the proposed approach, which incorporate uncertainty of the predicted labels of the unlabeled data and transfer learning in the proposed formulation. Our empirical studies on UCI datasets show that incorporation of uncertainty information improves performance at later iterations while our studies on 20
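
    The paper's criterion is solved exactly as a QP or LP; the sketch below is only a simplified greedy variant of the same marginal-matching idea, scoring candidate batches by the squared maximum mean discrepancy (MMD) with an RBF kernel. All data and parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(6)

def rbf(A, B, gamma=0.5):
    """RBF kernel matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_mmd_batch(X_lab, X_unlab, batch_size):
    """Greedily pick unlabeled points so that labeled+selected matches the
    marginal distribution of the unlabeled pool (squared-MMD criterion)."""
    k_uu = rbf(X_unlab, X_unlab).mean()          # constant term, precomputed
    selected = []
    for _ in range(batch_size):
        best, best_mmd = None, np.inf
        for i in range(len(X_unlab)):
            if i in selected:
                continue
            S = np.vstack([X_lab, X_unlab[selected + [i]]])
            # squared MMD between candidate set S and the full unlabeled pool
            mmd2 = rbf(S, S).mean() - 2 * rbf(S, X_unlab).mean() + k_uu
            if mmd2 < best_mmd:
                best, best_mmd = i, mmd2
        selected.append(best)
    return selected

# Two-cluster toy pool; the few labeled points come from only one cluster,
# so distribution matching should favor queries from the other cluster.
X_unlab = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
X_lab = rng.normal(0, 1, (5, 2))
print("queried indices:", greedy_mmd_batch(X_lab, X_unlab, batch_size=4))
```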

  6. Exact probability distribution for the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise

    Science.gov (United States)

    Calisto, H.; Bologna, M.

    2007-05-01

    We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. Also we show that the mean value of x(t) in the latter model always approaches asymptotically the value 1.
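
    An exploratory sketch of such a model (a naive Euler discretization with arbitrary parameter values and Ornstein-Uhlenbeck colored noise, not the paper's exact construction), useful for inspecting whether the stationary histogram is unimodal or bimodal at a given parameter setting:

```python
import numpy as np

rng = np.random.default_rng(7)

def verhulst_colored(mu=1.0, D=1.0, tau_c=2.0, dt=1e-3, n_steps=500_000):
    """Euler scheme for dx/dt = x*(mu - x) + x*eta(t), where eta is
    Ornstein-Uhlenbeck (colored Gaussian) noise with correlation time tau_c."""
    x, eta = 0.5, 0.0
    kick = np.sqrt(2.0 * D * dt) / tau_c
    samples = []
    for i in range(n_steps):
        eta += -eta / tau_c * dt + kick * rng.normal()
        x += (x * (mu - x) + x * eta) * dt
        x = max(x, 1e-12)              # keep the trajectory on x > 0
        if i % 100 == 0:
            samples.append(x)
    return np.array(samples)

xs = verhulst_colored()
hist, edges = np.histogram(xs, bins=30, range=(0, 3), density=True)
peaks = (hist[1:-1] > hist[:-2]) & (hist[1:-1] > hist[2:])
print("stationary mean:", xs.mean().round(3))
print("local maxima in the histogram (1 = unimodal, 2 = bimodal):", peaks.sum())
```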

  7. The equilibrium probability distribution of a conductive sphere's floating charge in a collisionless, drifting Maxwellian plasma

    CERN Document Server

    Thomas, Drew M

    2013-01-01

    A dust grain in a plasma has a fluctuating electric charge, and past work concludes that spherical grains in a stationary, collisionless plasma have an essentially Gaussian charge probability distribution. This paper extends that work to flowing plasmas and arbitrarily large spheres, deriving analytic charge probability distributions up to normalizing constants. We find that these distributions also have good Gaussian approximations, with analytic expressions for their mean and variance.

  8. The probability distribution of the predicted CFM-induced ozone depletion. [Chlorofluoromethane

    Science.gov (United States)

    Ehhalt, D. H.; Chang, J. S.; Bulter, D. M.

    1979-01-01

    It is argued from the central limit theorem that the uncertainty in model-predicted changes of the ozone column density is best represented by a normal probability density distribution. This conclusion is validated by comparison with a probability distribution generated by a Monte Carlo technique. In the case of the CFM-induced ozone depletion, and based on the estimated uncertainties in the reaction rate coefficients alone, the relative mean standard deviation of this normal distribution is estimated to be 0.29.

  9. Constructing the probability distribution function for the total capacity of a power system

    Energy Technology Data Exchange (ETDEWEB)

    Vasin, V.P.; Prokhorenko, V.I.

    1980-01-01

    The difficulties involved in constructing the probability distribution function for the total capacity of a power system consisting of numerous power plants are discussed. A method is considered for the approximate determination of such a function by a Monte Carlo method and by exact calculation based on special recursion formulas on a particular grid of argument values. It is shown that there may be significant deviations between the true probability distribution and a normal distribution.
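
    When unit capacities lie on a common grid, the exact distribution follows from recursive convolution over the plants, which is presumably the spirit of the recursion formulas mentioned above. A minimal sketch with invented plant data:

```python
import numpy as np

# Exact construction of the total-capacity PMF by recursive convolution:
# each plant is available (capacity c_i, probability p_i) or down (0, 1 - p_i).
plants = [(100, 0.95), (100, 0.95), (150, 0.90), (200, 0.85)]  # (MW, availability)

step = 50                                    # capacity grid resolution in MW
total = sum(c for c, _ in plants)
pmf = np.zeros(total // step + 1)
pmf[0] = 1.0                                 # start: zero capacity with prob 1

for c, p in plants:
    new = pmf * (1 - p)                      # plant down: capacity unchanged
    new[c // step:] += pmf[: pmf.size - c // step] * p   # plant up: shift by c
    pmf = new

capacities = np.arange(pmf.size) * step
cdf = pmf.cumsum()
for cap, f in zip(capacities, cdf):
    print(f"P(total capacity <= {cap:4d} MW) = {f:.4f}")
```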

  10. A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures

    Institute of Scientific and Technical Information of China (English)

    李典庆; 张圣坤; 唐文勇

    2003-01-01

    There exists model uncertainty in the probability of detection (POD) when inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing POD models, a new POD model is proposed for the updating of the crack size distribution. Furthermore, the theoretical derivation shows that most existing POD models are special cases of the new model. The least squares method is adopted for determining the values of the parameters in the new POD model. This new model is also compared with other existing POD models; the results indicate that the new model can fit the inspection data better. The new POD model is then applied to the analysis of the problem of crack size updating for offshore structures. The Bayesian updating method is used to analyze the effect of POD models on the posterior distribution of crack size. The results show that different POD models generate different posterior distributions of crack size for offshore structures.

  11. Some possible q-exponential type probability distribution in the non-extensive statistical physics

    Science.gov (United States)

    Chung, Won Sang

    2016-08-01

    In this paper, we present two exponential type probability distributions which are different from Tsallis's case (which we call Type I): one given by $p_i = \frac{1}{Z_q}[e_q(E_i)]^{-\beta}$ (Type IIA) and another given by $p_i = \frac{1}{Z_q}[e_q(-\beta)]^{E_i}$ (Type IIIA). Starting with the Boltzmann-Gibbs entropy, we obtain the different probability distributions by using the Kolmogorov-Nagumo average for the microstate energies. We present the first-order differential equations related to Types I, II and III. For the three types of probability distributions, we discuss the quantum harmonic oscillator, the two-level problem and the spin-$\frac{1}{2}$ paramagnet.

  12. Probability distributions for directed polymers in random media with correlated noise

    Science.gov (United States)

    Chu, Sherry; Kardar, Mehran

    2016-07-01

    The probability distribution for the free energy of directed polymers in random media (DPRM) with uncorrelated noise in d =1 +1 dimensions satisfies the Tracy-Widom distribution. We inquire if and how this universal distribution is modified in the presence of spatially correlated noise. The width of the distribution scales as the DPRM length to an exponent β , in good (but not full) agreement with previous renormalization group and numerical results. The scaled probability is well described by the Tracy-Widom form for uncorrelated noise, but becomes symmetric with increasing correlation exponent. We thus find a class of distributions that continuously interpolates between Tracy-Widom and Gaussian forms.

  13. Collective motions of globally coupled oscillators and some probability distributions on circle

    Energy Technology Data Exchange (ETDEWEB)

    Jaćimović, Vladimir [Faculty of Natural Sciences and Mathematics, University of Montenegro, Cetinjski put, bb., 81000 Podgorica (Montenegro); Crnkić, Aladin, E-mail: aladin.crnkic@hotmail.com [Faculty of Technical Engineering, University of Bihać, Ljubijankićeva, bb., 77000 Bihać, Bosnia and Herzegovina (Bosnia and Herzegovina)

    2017-06-28

    In 2010, Kato and Jones described a new family of probability distributions on the circle, obtained as Möbius transformations of the von Mises distribution. We present a model demonstrating that these distributions appear naturally in the study of populations of coupled oscillators. We use this opportunity to point out certain relations between Directional Statistics and the collective motion of coupled oscillators. - Highlights: • We specify probability distributions on the circle that arise in the Kuramoto model. • We study how the mean-field coupling affects the shape of the distribution of phases. • We discuss potential applications in some experiments on the cell cycle. • We apply Directional Statistics to study the collective dynamics of coupled oscillators.

  14. Effects of volume corrections and resonance decays on cumulants of net-charge distributions in a Monte Carlo hadron resonance gas model

    Science.gov (United States)

    Xu, Hao-jie

    2017-02-01

    The effects of volume corrections and resonance decays (the resulting correlations between positive charges and negative charges) on cumulants of net-proton distributions and net-charge distributions are investigated by using a Monte Carlo hadron resonance gas (MCHRG) model. The required volume distributions are generated by a Monte Carlo Glauber (MC-Glb) model. Except the variances of net-charge distributions, the MCHRG model with more realistic simulations of volume corrections, resonance decays and acceptance cuts can reasonably explain the data of cumulants of net-proton distributions and net-charge distributions reported by the STAR collaboration. The MCHRG calculations indicate that both the volume corrections and resonance decays make the cumulant products of net-charge distributions deviate from the Skellam expectations: the deviations of Sσ and κσ2 are dominated by the former effect while the deviations of ω are dominated by the latter one.
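
    The Skellam baseline referred to above is simple to reproduce: for independent Poisson counts of positive and negative charges, all odd cumulants of the net charge equal $\mu_1 - \mu_2$ and all even ones $\mu_1 + \mu_2$. A quick Monte Carlo check (the means are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(8)

# Net charge as the difference of two independent Poisson counts is Skellam.
mu1, mu2 = 8.0, 6.0
net = rng.poisson(mu1, size=1_000_000) - rng.poisson(mu2, size=1_000_000)

c = net - net.mean()
k2 = (c ** 2).mean()                         # variance (2nd cumulant)
k3 = (c ** 3).mean()                         # 3rd cumulant
k4 = (c ** 4).mean() - 3 * k2 ** 2           # 4th cumulant

print(f"S*sigma       = {k3 / k2:.3f}  "
      f"(Skellam expectation {(mu1 - mu2) / (mu1 + mu2):.3f})")
print(f"kappa*sigma^2 = {k4 / k2:.3f}  (Skellam expectation 1.0)")
```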

  15. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Rényi and Hölder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained to quantify the mutual divergence among two or more probability distributions.

  16. The probability distribution of fatigue damage and the statistical moment of fatigue life

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 高镇同

    1997-01-01

    The randomization of the deterministic fatigue damage equation leads to a stochastic differential equation and a Fokker-Planck equation affected by random fluctuation. By means of the solution of the equation, the probability distribution of fatigue damage as it changes with time is obtained. Then the statistical moment of fatigue life, in consideration of the stationary random fluctuation, is derived. Finally, the damage probability distributions during fatigue crack initiation and fatigue crack growth are given.
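
    One way to make the randomization concrete (a minimal sketch, not the paper's exact Fokker-Planck treatment): add white noise to a linear damage-growth law and read off the fatigue life as the first passage of the damage variable through 1. The drift, noise strength and step size below are illustrative.

        import numpy as np

        rng = np.random.default_rng(7)
        c, sigma, dt = 1.0e-4, 2.0e-3, 10.0   # drift per cycle, noise, step (cycles)
        n_paths, n_steps = 20_000, 2_000

        D = np.zeros(n_paths)                 # damage variable, failure at D = 1
        life = np.full(n_paths, np.nan)
        for step in range(1, n_steps + 1):
            D += c * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
            hit = np.isnan(life) & (D >= 1.0)
            life[hit] = step * dt             # first-passage time = fatigue life

        life = life[~np.isnan(life)]
        print(f"mean life ≈ {life.mean():.0f} cycles, sd ≈ {life.std():.0f} cycles")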

  17. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting...... uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge, and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straightforward application of fuzzy arithmetic in general results in overestimation of interval

  18. Frequency, magnitude, and distribution of head impacts in Pop Warner football: the cumulative burden.

    Science.gov (United States)

    Wong, Ricky H; Wong, Andrew K; Bailes, Julian E

    2014-03-01

    A growing body of research suggests that subconcussive head impacts or repetitive mild traumatic brain injury (mTBI) can have cumulative and deleterious effects. Several studies have investigated head impacts in football at the professional, collegiate, and high school levels in an attempt to elucidate the biomechanics of head impacts among football players. Youth football players, generally from 7 to 14 years of age, constitute 70% of all football players, yet the burden of, and susceptibility to, head injury in this population is not well known. A novel impact sensor utilizing binary force switches (Shockbox(®)) was used to follow an entire Pop Warner football team consisting of twenty-two players for six games and five practices. The impact sensor was designed to record impacts with linear accelerations over 30g. In addition, video recordings of games and practices were used to further characterize the head impacts by type of position (skilled versus unskilled), field location of impact (open field versus line of scrimmage), type of hit (tackling, tackled, or hold/push), and whether the impact was a head-to-head impact or not. We recorded a total of 480 head impacts. An average of 21.8 head impacts occurred per practice, while 61.8 occurred per game. Players had an average of 3.7 head impacts per game and 1.5 impacts per practice. The number of high-magnitude impacts (>80g) was 11. Two concussions were diagnosed over the course of the season; however, due to technical reasons, the biomechanics of those hits resulting in concussions were not captured. Despite smaller players and slower play when compared to high school, collegiate or professional players, those involved in youth football sustain a moderate number of head impacts per season, with several high-magnitude impacts. Our results suggest that players involved in open-field, tackling plays that have head-to-head contact sustain impacts with the highest linear accelerations. Our data supports previously published data that suggests changes to the

  19. A simple derivation and classification of common probability distributions based on information symmetry and measurement scale

    CERN Document Server

    Frank, Steven A

    2010-01-01

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated common distributions without any underlying conceptual basis for why common distributions have particular forms and what explains the familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude to information in different ways, an association we describe in terms of the relation between information invariance and measurement scale....

  20. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
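
    The single-integral form described above can be sketched directly: slip occurs when the required friction exceeds the available friction, so P(slip) = ∫ f_avail(u) P(required > u) du, evaluated here by the trapezoidal rule. The normal and lognormal choices below are illustrative stand-ins only, in the spirit of the paper's point that neither distribution can be assumed automatically.

        import numpy as np
        from scipy import stats

        avail = stats.norm(loc=0.50, scale=0.08)    # available friction (illustrative)
        req = stats.lognorm(s=0.25, scale=0.20)     # required friction (illustrative)

        u = np.linspace(0.0, 1.2, 2001)
        integrand = avail.pdf(u) * req.sf(u)        # f_avail(u) * P(required > u)
        # Trapezoidal rule, the numerical scheme proposed in the abstract
        p_slip = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(u))
        print(f"P(slip) ≈ {p_slip:.2e}")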

  1. WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Smagin

    2016-12-01

    Full Text Available The Wiener-Hopf solver with smooth probability distributions of its components is presented. The method is based on hyper-delta approximations of the initial distributions. The use of the Fourier series transformation and the characteristic function allows working with random variables concentrated on the abscissa axis.

  2. Parametrizing Impulsive X-ray Heating with a Cumulative Initial-Temperature Distribution

    CERN Document Server

    Gayley, K G

    2014-01-01

    In collisional ionization equilibrium (CIE), the X-ray spectrum from a plasma depends on the distribution of emission measure over temperature (DEM). Due to the well-known ill-conditioning problem, no precisely resolved DEM can be inverted directly from the spectrum, so often only a gross parametrization of the DEM is used to approximate the data, in hopes that the parametrization can provide useful model-independent constraints on the heating process. However, ill conditioning also introduces ambiguity into the various different parametrizations that could approximate the data, which may spoil the perceived advantages of model independence. Thus, this paper instead suggests a single parametrization for both the heating mechanism and the X-ray sources, based on a model of impulsive heating followed by complete cooling. This approach is similar to a "cooling flow" approach, but allows injection at multiple initial temperatures, and applies even when the steady state is a distribution of different shock strengths...

  3. Probability Models for the Distribution of Copepods in Different Coastal Ecosystems Along the Straits of Malacca

    Science.gov (United States)

    Matias-Peralta, Hazel Monica; Ghodsi, Alireza; Shitan, Mahendran; Yusoff, Fatimah Md.

    Copepods are the most abundant microcrustaceans in the marine waters and are the major food resource for many commercial fish species. In addition, changes in the distribution and population composition of copepods may also serve as an indicator of global climate changes. Therefore, it is important to model the copepod distribution in different ecosystems. Copepod samples were collected from three different ecosystems (seagrass area, cage aquaculture area and coastal waters off shrimp aquaculture farm) along the coastal waters of the Malacca Straits over a one year period. In this study the major statistical analysis consisted of fitting different probability models. This paper highlights the fitting of probability distributions and discusses the adequateness of the fitted models. The usefulness of these fitted models would enable one to make probability statements about the distribution of copepods in three different ecosystems.

  4. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging.

    Science.gov (United States)

    Juang, Kai-Wei; Lee, Dar-Yuan; Teng, Yun-Lung

    2005-11-01

    Correctly classifying "contaminated" areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the "contaminated" areas.

  5. A Class of Chaotic Sequences with Gauss Probability Distribution for Radar Mask Jamming

    Institute of Scientific and Technical Information of China (English)

    Ni-Ni Rao; Yu-Chuan Huang; Bin Liu

    2007-01-01

    A simple generation approach for chaotic sequences with Gauss probability distribution is proposed. Theoretical analysis and simulation based on Logistic chaotic model show that the approach is feasible and effective. The distribution characteristics of the novel chaotic sequence are comparable to that of the standard normal distribution. Its mean and variance can be changed to the desired values. The novel sequences have also good randomness. The applications for radar mask jamming are analyzed.
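
    One standard route to such a sequence, assumed here since the abstract does not give the exact transform: iterate the fully chaotic logistic map, push each iterate through the map's invariant CDF, (2/π)arcsin(√x), to obtain approximately uniform values, then apply the normal quantile function and rescale to the desired mean and variance.

        import numpy as np
        from scipy.stats import norm

        def logistic_gauss(n, x0=0.3141, mean=0.0, std=1.0):
            # Iterate the fully chaotic logistic map x -> 4x(1 - x)
            x = np.empty(n)
            xi = x0
            for i in range(n):
                xi = 4.0 * xi * (1.0 - xi)
                x[i] = xi
            u = (2.0 / np.pi) * np.arcsin(np.sqrt(x))   # invariant-CDF transform -> ~U(0, 1)
            return mean + std * norm.ppf(u)             # probit -> ~N(mean, std^2)

        z = logistic_gauss(100_000)
        print(f"mean = {z.mean():.3f}, std = {z.std():.3f}")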

  6. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    Science.gov (United States)

    Marshman, Emily; Singh, Chandralekha

    2017-03-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations.

  7. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    Science.gov (United States)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via the knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^(-160), for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignment differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  8. A Cumulative Damage Reliability Model on the Basis of Contact Fatigue of the Rolling Bearing

    Institute of Scientific and Technical Information of China (English)

    HUANG Li

    2006-01-01

    A cumulative damage reliability model of contact fatigue of the rolling bearing is more consistent with actual conditions. It is put forward on the basis of the contact fatigue life probability distribution of the rolling bearing, which obeys a Weibull distribution, and rests on the Miner cumulative damage theory. Finally, a case is given to predict the reliability of a bearing roller by using these models.
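
    A minimal sketch of the kind of model described, assuming Weibull-distributed contact-fatigue lives per load level combined through Miner's linear damage rule, with reliability estimated by Monte Carlo; the block-loading numbers and Weibull parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        # Block loading: (cycles applied, Weibull shape, Weibull scale of life)
        blocks = [(2.0e5, 1.8, 1.2e6),
                  (5.0e4, 1.8, 3.0e5)]

        trials = 200_000
        damage = np.zeros(trials)
        for n_cycles, shape, scale in blocks:
            life = scale * rng.weibull(shape, trials)   # fatigue life at this load level
            damage += n_cycles / life                   # Miner's linear damage fraction

        reliability = np.mean(damage < 1.0)             # survive while cumulative damage < 1
        print(f"estimated reliability ≈ {reliability:.4f}")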

  9. Probability collectives a distributed multi-agent system approach for optimization

    CERN Document Server

    Kulkarni, Anand Jayant; Abraham, Ajith

    2015-01-01

    This book provides an emerging computational intelligence tool in the framework of collective intelligence for modeling and controlling distributed multi-agent systems, referred to as Probability Collectives. In the modified Probability Collectives methodology a number of constraint handling techniques are incorporated, which also reduce the computational complexity and improve the convergence and efficiency. Numerous examples and real world problems are used for illustration, which may also allow the reader to gain further insight into the associated concepts.

  10. The Exit Distribution for Smart Kinetic Walk with Symmetric and Asymmetric Transition Probability

    Science.gov (United States)

    Dai, Yan

    2017-03-01

    It has been proved that the distribution of the point where the smart kinetic walk (SKW) exits a domain converges in distribution to harmonic measure on the hexagonal lattice. For other lattices, it is believed that this result still holds, and there is good numerical evidence to support this conjecture. Here we examine the effect of the symmetry and asymmetry of the transition probability on each step of the SKW on the square lattice and test whether the exit distribution converges in distribution to harmonic measure as well. From our simulations, the limiting exit distribution of the SKW with a non-uniform but symmetric transition probability as the lattice spacing goes to zero is the harmonic measure. This result does not hold for asymmetric transition probabilities. We are also interested in the difference between the exit distribution of the SKW with symmetric transition probability and harmonic measure. Our simulations provide strong support for an explicit conjecture about this first-order difference. The explicit formula for the conjecture will be given below.

  11. Ruin Probability and Joint Distributions of Some Actuarial Random Vectors in the Compound Pascal Model

    Institute of Scientific and Technical Information of China (English)

    Xian-min Geng; Shu-chen Wan

    2011-01-01

    The compound negative binomial model, introduced in this paper, is a discrete-time version of the compound Pascal model. We discuss the Markov properties of the surplus process, and study the ruin probability and the joint distributions of actuarial random vectors in this model. By the strong Markov property and the mass function of a defective renewal sequence, we obtain explicit expressions for the ruin probability, the finite-horizon ruin probability, the joint distributions of T, U(T-1), |U(T)| and inf_{0≤n<T_1} U(n) (i.e., the time of ruin, the surplus immediately before ruin, the deficit at ruin and the maximal deficit from ruin to recovery) and the distributions of some actuarial random vectors.

  12. Net-charge probability distributions in heavy ion collisions at chemical freeze-out

    CERN Document Server

    Braun-Munzinger, P; Karsch, F; Redlich, K; Skokov, V

    2011-01-01

    We explore net charge probability distributions in heavy ion collisions within the hadron resonance gas model. The distributions for strangeness, electric charge and baryon number are derived. We show that, within this model, net charge probability distributions and the resulting fluctuations can be computed directly from the measured yields of charged and multi-charged hadrons. The influence of multi-charged particles and quantum statistics on the shape of the distribution is examined. We discuss the properties of the net proton distribution along the chemical freeze-out line. The model results presented here can be compared with data at RHIC energies and at the LHC to possibly search for the relation between chemical freeze-out and QCD cross-over lines in heavy ion collisions.

  13. LAGRANGE MULTIPLIERS IN THE PROBABILITY DISTRIBUTIONS ELICITATION PROBLEM: AN APPLICATION TO THE 2013 FIFA CONFEDERATIONS CUP

    Directory of Open Access Journals (Sweden)

    Diogo de Carvalho Bezerra

    2015-12-01

    Full Text Available Contributions from the sensitivity analysis of the parameters of the linear programming model for the elicitation of experts' beliefs are presented. The process allows for the calibration of the family of probability distributions obtained in the elicitation process. An experiment to obtain the probability distribution of a future event (the Brazil vs. Spain soccer game in the 2013 FIFA Confederations Cup final) was conducted. The proposed sensitivity analysis step may help to reduce the vagueness of the information given by the expert.

  14. THE LEBESGUE-STIELTJES INTEGRAL AS APPLIED IN PROBABILITY DISTRIBUTION THEORY

    Science.gov (United States)

    bounded variation and Borel measurable functions are set forth in the introduction. Chapter 2 is concerned with establishing a one-to-one correspondence between Lebesgue-Stieltjes measures and certain equivalence classes of functions which are monotone non-decreasing and continuous on the right. In Chapter 3 the Lebesgue-Stieltjes integral is defined and some of its properties are demonstrated. In Chapter 4 the probability distribution function is defined, and the notions in Chapters 2 and 3 are used to show that the Lebesgue-Stieltjes integral of any probability distribution

  15. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    Science.gov (United States)

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programmed theories. If programmed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine the real state of affairs, a special type of binomial model based on the generalized Pareto distribution has been applied to data on Japanese centenarians. From the results, it is concluded that the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years.
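
    The logic of such an estimate can be sketched as follows: fit a generalized Pareto distribution to lifetime excesses over a high threshold; a fitted shape parameter below zero implies a finite upper endpoint at threshold + scale/|shape| (the programmed-theory regime), while a non-negative shape implies no finite limit. The data below are synthetic, generated so the endpoint sits near the paper's 123-year estimate; nothing here uses the actual centenarian records.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(4)
        u = 100.0                                   # threshold age (illustrative)
        true_c, true_scale = -0.15, 0.15 * 23.0     # endpoint = u + scale/|c| = 123
        excess = genpareto.rvs(true_c, scale=true_scale, size=3000, random_state=rng)

        c_hat, _, scale_hat = genpareto.fit(excess, floc=0.0)
        if c_hat < 0:
            endpoint = u - scale_hat / c_hat        # finite upper limit of lifetime
            print(f"estimated upper limit ≈ {endpoint:.1f} years")
        else:
            print("fitted shape ≥ 0: no finite upper limit (damage-theory regime)")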

  16. Comparative assessment of surface fluxes from different sources using probability density distributions

    Science.gov (United States)

    Gulev, Sergey; Tilinina, Natalia; Belyaev, Konstantin

    2015-04-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using the framework of probability distributions for sensible and latent heat fluxes. For approximation of probability distributions and estimation of extreme flux values the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g. the 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of probability distributions of surface fluxes and extreme surface flux values between different reanalyses are found in the western boundary current extension regions and high latitudes, while the highest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions.

  17. Ventilation/perfusion lung scan probability category distributions in university and community hospitals.

    Science.gov (United States)

    Lowe, V J; Bullard, A G; Coleman, R E

    1995-12-01

    The criteria used in the Prospective Investigation of Pulmonary Embolism Diagnosis (PIOPED) study for the interpretation of ventilation/perfusion scans are widely used, and the probability of pulmonary embolism is determined from these criteria. The prevalence of pulmonary embolism in the PIOPED study was 33%. To investigate the similarity of patient populations who have ventilation/perfusion scans at one of the medical centers that participated in the PIOPED study and at a small community hospital, the authors evaluated the probability category distributions of lung scans at the two institutions. They retrospectively interpreted 54 and 49 ventilation/perfusion lung scans selected from January 1991 to June 1992 at Duke University Medical Center and at Central Carolina Hospital, respectively. Studies were interpreted according to the PIOPED criteria. The percentages of studies assigned to each category at Duke University Medical Center and Central Carolina Hospital were 17% and 27% normal or very low probability, 31% and 59% low probability, 39% and 10% intermediate probability, and 13% and 4% high probability, respectively. The different distribution of probability categories between the university and community hospitals suggests that the prevalence of disease may also be different. The post-test probability of pulmonary embolism is related to the prevalence of disease and the sensitivity and specificity of the ventilation/perfusion scan. Because these variables may differ in community hospital settings, the post-test probability of pulmonary embolism as determined by data from the PIOPED study should only be used in institutions with similar populations. Clinical management based upon the results of the PIOPED study may not be applicable to patients who have ventilation/perfusion scans performed in a community hospital.

  18. Radionuclide and colloid transport in the Culebra Dolomite and associated complementary cumulative distribution functions in the 1996 performance assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    RAMSEY, JAMES L.; BLAINE,R.; GARNER,J.W.; HELTON,JON CRAIG; JOHNSON,J.D.; SMITH,L.N.; WALLACE,M.

    2000-05-22

    The following topics related to radionuclide and colloid transport in the Culebra Dolomite in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (1) mathematical description of models, (2) uncertainty and sensitivity analysis results arising from subjective (i.e., epistemic) uncertainty for individual releases, and (3) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e., aleatory) uncertainty. The presented results indicate that radionuclide and colloid transport in the Culebra Dolomite does not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, no radionuclide transport to the boundary with the accessible environment was observed; thus the associated CCDFs for comparison with the boundary line specified in the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194) are degenerate in the sense of having a probability of zero of exceeding a release of zero.

  19. Importance measures for imprecise probability distributions and their sparse grid solutions

    Institute of Scientific and Technical Information of China (English)

    WANG; Pan; LU; ZhenZhou; CHENG; Lei

    2013-01-01

    For the imprecise probability distribution of a structural system, the variance-based importance measures (IMs) of the inputs are investigated, and three IMs are defined for the cases of random distribution parameters, interval distribution parameters, and a mixture of those two types of distribution parameters. The defined IMs can reflect the influence of the inputs on the output of the structural system with imprecise distribution parameters. Due to the large computational cost of the variance-based IMs, the sparse grid method is employed in this work to compute the variance-based IMs at each reference point of the distribution parameters. For the three imprecise distribution parameter cases, the sparse grid method and the combination of the sparse grid method with a genetic algorithm are used to compute the defined IMs. Numerical and engineering examples are employed to demonstrate the rationality of the defined IMs and the efficiency of the applied methods.

  20. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    Science.gov (United States)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on the statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fit distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of the air pollutant data in Kuala Lumpur.
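
    A hedged sketch of the selection step: fit each candidate family and score the fits. The paper combines five goodness-of-fit criteria by a weight-of-ranks method; the sketch below substitutes a Kolmogorov-Smirnov test and AIC as stand-ins, and the 'API' sample is synthetic rather than the Kuala Lumpur data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        api = stats.gamma.rvs(a=4.0, scale=15.0, size=365, random_state=rng)  # synthetic API

        candidates = {
            "log-normal":  stats.lognorm,
            "exponential": stats.expon,
            "Gamma":       stats.gamma,
            "Weibull":     stats.weibull_min,
        }
        for name, dist in candidates.items():
            params = dist.fit(api)
            ks = stats.kstest(api, dist.cdf, args=params)                  # goodness of fit
            aic = 2 * len(params) - 2 * np.sum(dist.logpdf(api, *params))  # information criterion
            print(f"{name:12s} KS p = {ks.pvalue:.3f}  AIC = {aic:.1f}")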

  1. Statistical analysis of the Lognormal-Pareto distribution using Probability Weighted Moments and Maximum Likelihood

    OpenAIRE

    Marco Bee

    2012-01-01

    This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the first distribution, but not to the latter. Thus, in the lognormal- Generalized Pareto case, we work ou...

  2. Calculation of Radar Probability of Detection in K-Distributed Sea Clutter and Noise

    Science.gov (United States)

    2011-04-01

    There is no closed-form solution for the probability of detection in K-distributed clutter, so numerical methods are required. The K distribution is a compound model [...] the integration, with the nodes and weights calculated using matrix methods, so that a general-purpose numerical integration routine is not required.

  3. Analysis of low probability of intercept (LPI) radar signals using the Wigner Distribution

    OpenAIRE

    Gau, Jen-Yu

    2002-01-01

    Approved for public release, distribution is unlimited. The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code,...

  4. Neighbor-dependent Ramachandran probability distributions of amino acids developed from a hierarchical Dirichlet process model.

    Directory of Open Access Journals (Sweden)

    Daniel Ting

    2010-04-01

    Full Text Available Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.

  5. Providing probability distributions for the causal pathogen of clinical mastitis using naive Bayesian networks

    NARCIS (Netherlands)

    Steeneveld, W.; Gaag, van der L.C.; Barkema, H.W.; Hogeveen, H.

    2009-01-01

    Clinical mastitis (CM) can be caused by a wide variety of pathogens and farmers must start treatment before the actual causal pathogen is known. By providing a probability distribution for the causal pathogen, naive Bayesian networks (NBN) can serve as a management tool for farmers to decide which treatment to start.

  6. Using Preferred Outcome Distributions to estimate Value and Probability Weighting Functions in Decisions under Risk

    NARCIS (Netherlands)

    A.C.D. Donkers (Bas); T. Lourenco (Tania); B.G.C. Dellaert (Benedict); D.G. Goldstein (Daniel G.)

    2013-01-01

    In this paper we propose the use of preferred outcome distributions as a new method to elicit individuals' value and probability weighting functions in decisions under risk. Extant approaches for the elicitation of these two key ingredients of individuals' risk attitude typically rely

  7. Establishment and optimization of project investment risk income models on the basis of probability χ distribution

    Institute of Scientific and Technical Information of China (English)

    吕渭济; 崔巍

    2001-01-01

    In this paper, two kinds of models are presented and optimized for project investment risk income on the basis of probability X distribution. One kind of model being proved has only a maximal value and another kind being proved has no extreme values.

  9. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data in unreliable communication channels and devices. Unreliable channels and devices are error-prone objects. Accordingly, error detection codes allow detecting such errors. There are two classes of error detecting codes - classical codes and security-oriented codes. The classical codes have a high percentage of detected errors; however, they have a high probability of missing an error under algebraic manipulation. In turn, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection in the encoding device. In turn, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with less computational complexity and a low probability of masking are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown in the paper that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and average value of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking probability.

  10. The Probability Density Functions to Diameter Distributions for Scots Pine Oriental Beech and Mixed Stands

    Directory of Open Access Journals (Sweden)

    Aydın Kahriman

    2011-11-01

    Full Text Available Determining the diameter distribution of a stand and its relations with stand age, site index, density and mixture percentage is very important both biologically and economically. The Weibull with two parameters, Weibull with three parameters, Gamma with two parameters, Gamma with three parameters, Beta, Lognormal with two parameters, Lognormal with three parameters, Normal and Johnson SB probability density functions were used for the determination of diameter distributions. This study aimed to compare the performance of these functions in describing different diameter distributions and to identify the most successful one. The data were obtained from 162 temporary sample plots measured in Scots pine and Oriental beech mixed stands in the Black Sea Region. The results show that the four-parameter Johnson SB function is the most successful in describing the diameter distributions of both Scots pine and Oriental beech, based on error index values calculated from the differences between observed and predicted diameter distributions.

  11. Generalized quantum Fokker-Planck, diffusion and Smoluchowski equations with true probability distribution functions

    CERN Document Server

    Banik, S K; Ray, D S; Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar

    2002-01-01

    Traditionally, the quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasi-probability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to the non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent-state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c-numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion and Smoluchowski equations are the exact quantum analogues of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension...

  12. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.-W. [Department of Post-Modern Agriculture, MingDao University, Pitou, Changhua, Taiwan (China); Lee, D.-Y. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)]. E-mail: dylee@ccms.ntu.edu.tw; Teng, Y.-L. [Graduate Institute of Agricultural Chemistry, National Taiwan University, Taipei, Taiwan (China)

    2005-11-15

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging.

  13. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    Science.gov (United States)

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of a pest in orchards can provide important information that can be used to design monitoring schemes and establish better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated by using probability kriging. Adults of B. minax were captured in two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of the adult occurrence, and the population of adults aggregated with high probability within a less-than-100-m-wide band on both sides of the boundary between the orchard and the woods. The sequential probability-kriged maps showed that the adults were estimated in the marginal zone with higher probability, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots and their distance to the forest edge should be considered to enhance control of B. minax in small-scale orchards.

  14. About the probability distribution of a quantity with given mean and variance

    CERN Document Server

    Olivares, Stefano

    2012-01-01

    Supplement 1 to GUM (GUM-S1) recommends the use of maximum entropy principle (MaxEnt) in determining the probability distribution of a quantity having specified properties, e.g., specified central moments. When we only know the mean value and the variance of a variable, GUM-S1 prescribes a Gaussian probability distribution for that variable. When further information is available, in the form of a finite interval in which the variable is known to lie, we indicate how the distribution for the variable in this case can be obtained. A Gaussian distribution should only be used in this case when the standard deviation is small compared to the range of variation (the length of the interval). In general, when the interval is finite, the parameters of the distribution should be evaluated numerically, as suggested by I. Lira, Metrologia, 46 L27 (2009). Here we note that the knowledge of the range of variation is equivalent to a bias of the distribution toward a flat distribution in that range, and the principle of mini...
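
    A numerical sketch of the finite-interval case mentioned above: the MaxEnt density with fixed mean and variance on [a, b] has the truncated-Gaussian form p(x) ∝ exp(l1*x + l2*x^2), and the two multipliers can be found by matching moments. All numbers below are illustrative.

        import numpy as np
        from scipy.optimize import fsolve
        from scipy.integrate import trapezoid

        def maxent_interval(a, b, mean, var, n=4001):
            # Solve for l1, l2 so that p(x) ∝ exp(l1*x + l2*x**2) on [a, b]
            # matches the required mean and variance.
            x = np.linspace(a, b, n)

            def residuals(lam):
                w = np.exp(lam[0] * x + lam[1] * x**2)
                w /= trapezoid(w, x)
                m1 = trapezoid(x * w, x)
                m2 = trapezoid(x**2 * w, x)
                return [m1 - mean, (m2 - m1**2) - var]

            lam = fsolve(residuals, x0=[0.0, -1.0])
            p = np.exp(lam[0] * x + lam[1] * x**2)
            return x, p / trapezoid(p, x)

        # Mean 0.3, sd 0.2 on [0, 1]: a truncated Gaussian, not the unbounded
        # Gaussian that would follow from the mean and variance alone.
        x, p = maxent_interval(0.0, 1.0, mean=0.3, var=0.2**2)
        print("check mean:", trapezoid(x * p, x))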

  15. Earthquake probabilities and magnitude distribution (M≥6.7) along the Haiyuan fault, northwestern China

    Institute of Scientific and Technical Information of China (English)

    冉洪流

    2004-01-01

    In recent years, some researchers have studied the paleoearthquakes along the Haiyuan fault and revealed numerous paleoearthquake events. All available information allows a more reliable analysis of earthquake recurrence intervals and earthquake rupture patterns along the Haiyuan fault. Based on this paleoseismological information, the recurrence probability and magnitude distribution for M≥6.7 earthquakes in the next 100 years along the Haiyuan fault can be obtained through weighted computation, using the Poisson and Brownian passage time models and considering different rupture patterns. The result shows that the recurrence probability of MS≥6.7 earthquakes is about 0.035 in the next 100 years along the Haiyuan fault.

  16. Some Characterization Results on Dynamic Cumulative Residual Tsallis Entropy

    Directory of Open Access Journals (Sweden)

    Madan Mohan Sati

    2015-01-01

    Full Text Available We propose a generalized cumulative residual information measure based on Tsallis entropy and its dynamic version. We study the characterizations of the proposed information measure and define new classes of life distributions based on this measure. Some applications are provided in relation to weighted and equilibrium probability models. Finally the empirical cumulative Tsallis entropy is proposed to estimate the new information measure.

  17. Probability distribution of the free energy of a directed polymer in a random medium

    Science.gov (United States)

    Brunet, Éric; Derrida, Bernard

    2000-06-01

    We calculate exactly the first cumulants of the free energy of a directed polymer in a random medium for the geometry of a cylinder. By using the fact that the nth moment of the partition function is given by the ground-state energy of a quantum problem of n interacting particles on a ring of length L, we write an integral equation allowing us to expand these moments in powers of the strength of the disorder γ or in powers of n. For n small and n ~ (Lγ)^(-1/2), the moments take a scaling form which allows us to describe all the fluctuations of order 1/L of the free energy per unit length of the directed polymer. The distribution of these fluctuations is the same as the one found recently in the asymmetric exclusion process, indicating that it is characteristic of all systems described by the Kardar-Parisi-Zhang equation in 1+1 dimensions.

  18. Comparative assessment of surface fluxes from different sources: a framework based on probability distributions

    Science.gov (United States)

    Gulev, S.

    2015-12-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using the framework of probability distributions for sensible and latent heat fluxes. For approximation of probability distributions and estimation of extreme flux values the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g. the 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of probability distributions of surface fluxes and extreme surface flux values between different reanalyses are found in the western boundary current extension regions and high latitudes, while the highest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions. Our assessment also discriminated different reanalyses and satellite products with respect to their ability to quantify the role of extreme surface turbulent fluxes in forming ocean heat release in different regions.

  19. Probability distribution of surface wind speed induced by convective adjustment on Venus

    Science.gov (United States)

    Yamamoto, Masaru

    2017-03-01

    The influence of convective adjustment on the spatial structure of Venusian surface wind and the probability distribution of its wind speed is investigated using an idealized weather research and forecasting model. When the initially uniform wind is much weaker than the convective wind, patches of both prograde and retrograde winds with scales of a few kilometers are formed during active convective adjustment. After the active convective adjustment, because the small-scale convective cells and their related vertical momentum fluxes dissipate quickly, the large-scale (>4 km) prograde and retrograde wind patches remain on the surface and in the longitude-height cross-section. This suggests the coexistence of local prograde and retrograde flows, which may correspond to those observed by Pioneer Venus below 10 km altitude. The probability distributions of surface wind speed V during the convective adjustment have a similar form in different simulations, with a sharp peak around ∼0.1 m s-1 and a bulge developing on the flank of the probability distribution. This flank bulge is associated with the most active convection, which has a probability distribution peaking at a wind speed 1.5 times greater than the Weibull fitting parameter c during the convective adjustment. The Weibull distribution P(>V) = exp[-(V/c)^k] with the best-estimate coefficients of Lorenz (2016) is reproduced during convective adjustments induced by a potential energy of ∼7 × 10^7 J m-2, which is calculated from the difference in total potential energy between the initially unstable and neutral states. The maximum vertical convective heat flux magnitude is proportional to the potential energy of the convective adjustment in the experiments in which the initial unstable-layer thickness is altered. The present work suggests that convective adjustment is a promising process for producing the wind structure, occasionally generating surface winds of ∼1 m s-1 and retrograde wind patches.
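
    The quoted Weibull exceedance law is directly computable; the helper below evaluates P(>V) = exp[-(V/c)^k]. The values of c and k used here are placeholders, not the best-estimate coefficients of Lorenz (2016).

        import math

        def weibull_exceedance(v, c, k):
            # P(>V) = exp[-(V/c)^k], the fitted form quoted in the abstract
            return math.exp(-((v / c) ** k))

        c, k = 0.5, 1.6   # placeholder parameters
        for v in (0.1, 0.5, 1.0):
            print(f"P(V > {v:3.1f} m/s) = {weibull_exceedance(v, c, k):.3f}")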

  20. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-A F^2), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere potentials at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.

  1. Explicit Expressions for the Ruin Probabilities of Erlang Risk Processes with Pareto Individual Claim Distributions

    Institute of Scientific and Technical Information of China (English)

    Li Wei; Hai-liang Yang

    2004-01-01

    In this paper we first consider a risk process in which claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve of u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line, and the integrand is not of an oscillating kind. Then we show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.
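
    Alongside the analytic solution, a finite-horizon Monte Carlo estimate is a useful sanity check: Erlang(2) inter-claim times are Gamma(2, 1) variables, and classical Pareto claims can be built from numpy's Lomax generator. The premium rate, Pareto shape and horizon below are illustrative, and the finite horizon only lower-bounds the ultimate ruin probability.

        import numpy as np

        rng = np.random.default_rng(6)

        def ruin_prob(u, premium_rate, n_paths=5_000, n_claims=500):
            # Risk process with Erlang(2) inter-claim times and Pareto claims,
            # checked for ruin over the first n_claims claims.
            w = rng.gamma(2.0, 1.0, size=(n_paths, n_claims))      # Erlang(2) = Gamma(2, 1)
            x = rng.pareto(2.5, size=(n_paths, n_claims)) + 1.0    # Pareto, shape 2.5, x_m = 1
            surplus = u + np.cumsum(premium_rate * w - x, axis=1)  # reserve after each claim
            return np.mean(surplus.min(axis=1) < 0.0)              # ever below zero -> ruin

        for u0 in (0.0, 5.0, 20.0):
            print(f"u = {u0:5.1f}  psi(u) ≈ {ruin_prob(u0, premium_rate=1.0):.3f}")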

  2. Estimating the probability distribution of von Mises stress for structures undergoing random excitation. Part 1: Derivation

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, D.; Reese, G.

    1998-09-01

    The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. For deterministic loads, both static and dynamic, the calculation of von Mises stress is straightforward, as is the resulting calculation of reliability. For loads modeled as random processes, the task is different; the response to such loads is itself a random process and its properties must be determined in terms of those of both the loads and the system. This has been done in the past by Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the problem. Here, the authors present a method that provides analytic expressions for the probability distributions of von Mises stress which can be evaluated efficiently and with good precision numerically. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution.

  3. Conditional probability distribution (CPD) method in temperature based death time estimation: Error propagation analysis.

    Science.gov (United States)

    Hubig, Michael; Muggenthaler, Holger; Mall, Gita

    2014-05-01

    Bayesian estimation applied to temperature-based death time estimation was recently introduced as the conditional probability distribution or CPD-method by Biermann and Potente. The CPD-method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g. no-alibi intervals of suspects) within the large true death time interval. In the light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of the input error. Moreover, we observed the paradox that in cases in which the small no-alibi time interval is located at the boundary of the true death time interval, adjacent to the erroneous death time estimate, the CPD-computed probabilities for that small no-alibi interval will increase with increasing input deviation; otherwise the CPD-computed probabilities will decrease. We therefore advise not to use CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95%-confidence intervals of the estimate still overlap the true death time interval.

  4. Cumulants of Net-Proton, Net-Kaon and Net-Charge Multiplicity Distributions in Au+Au Collisions at RHIC BES Energies from UrQMD Model

    CERN Document Server

    Xu, Ji; Liu, Feng; Luo, Xiaofeng

    2016-01-01

    Fluctuations of conserved quantities are sensitive observables to probe the signature of the QCD phase transition and critical point in heavy-ion collisions. With the UrQMD model, we have studied the centrality and energy dependence of various order cumulants and cumulant ratios (up to fourth order) of net-proton, net-charge and net-kaon multiplicity distributions in Au+Au collisions at $\sqrt{s_{NN}}$= 7.7, 11.5, 19.6, 27, 39, 62.4, 200 GeV. The model results show that the production mechanisms of particles and anti-particles have significant impacts on the cumulants of net-particle multiplicity distributions and show strong energy dependence. We also made comparisons between model calculations and experimental data measured in the first phase of the beam energy scan (BES) program by the STAR experiment at RHIC. The comparisons indicate that the baryon conservation effect strongly suppresses the cumulants of net-proton distributions at low energies and the non-monotonic energy dependence for the net-proton ...
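
    For reference, the cumulants studied here follow from central moments of the event-by-event net-particle number. The sketch below computes C1-C4 from a toy sample in which proton and antiproton yields are independent Poisson variables (so the net number is Skellam distributed, with C1 = C3 and C2 = C4); the yields are invented, not UrQMD output.

    ```python
    import numpy as np

    def net_particle_cumulants(n_plus, n_minus):
        """First four cumulants of a net-particle multiplicity sample:
        C1 = <N>, C2 = <dN^2>, C3 = <dN^3>, C4 = <dN^4> - 3<dN^2>^2."""
        net = np.asarray(n_plus) - np.asarray(n_minus)
        d = net - net.mean()
        c2 = np.mean(d**2)
        return net.mean(), c2, np.mean(d**3), np.mean(d**4) - 3.0 * c2**2

    rng = np.random.default_rng(2)
    protons = rng.poisson(5.0, 1_000_000)
    antiprotons = rng.poisson(2.0, 1_000_000)
    print(net_particle_cumulants(protons, antiprotons))  # ~ (3, 7, 3, 7)
    ```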

  5. An Undersea Mining Microseism Source Location Algorithm Considering Wave Velocity Probability Distribution

    OpenAIRE

    2014-01-01

    The traditional mine microseism locating methods are mainly based on the assumption that the wave velocity is uniform through the space, which leads to some errors because the assumption goes against the laws of nature. In this paper, the wave velocity is regarded as a random variable, and the probability distribution information of the wave velocity is fused into the traditional locating method. This paper puts forward the microseism source location method for the undersea mining on condition o...

  6. Utilizing Probability Distribution Functions and Ensembles to Forecast Ionospheric and Thermospheric Space Weather

    Science.gov (United States)

    2016-04-26

    [Only fragments of this record's abstract survive extraction.] A new model of the solar wind velocity is created using probability distribution functions and performs as well as or better than other modern models of the solar wind velocity. Associated publications cover the temporal and spatial variations of the thermospheric mass density (J. Geophys. Res. Space Physics, 120: 7987-8001, doi: 10.1002/2014JA020962) and a theoretical study of zonal differences of electron density at midlatitudes with GITM simulation (2015, J. Geophys. Res. Space Physics, 120, 2951).

  7. Pauling resonant structures in real space through electron number probability distributions.

    Science.gov (United States)

    Pendas, A Martín; Francisco, E; Blanco, M A

    2007-02-15

    A general hierarchy of the coarse-grained electron probability distributions induced by exhaustive partitions of the physical space is presented. It is argued that when the space is partitioned into atomic regions the consideration of these distributions may provide a first step toward an orbital invariant treatment of resonant structures. We also show that, in this case, the total molecular energy and its components may be partitioned into structure contributions, providing a fruitful extension of the recently developed interacting quantum atoms approach (J. Chem. Theory Comput. 2005, 1, 1096). The above ideas are explored in the hydrogen molecule, where a complete statistical and energetic decomposition into covalent and ionic terms is presented.

  8. Evolving Molecular Cloud Structure and the Column Density Probability Distribution Function

    CERN Document Server

    Ward, Rachel L; Sills, Alison

    2014-01-01

    The structure of molecular clouds can be characterized with the probability distribution function (PDF) of the mass surface density. In particular, the properties of the distribution can reveal the nature of the turbulence and star formation present inside the molecular cloud. In this paper, we explore how these structural characteristics evolve with time and also how they relate to various cloud properties as measured from a sample of synthetic column density maps of molecular clouds. We find that, as a cloud evolves, the peak of its column density PDF will shift to surface densities below the observational threshold for detection, resulting in an underlying lognormal distribution which has been effectively lost at late times. Our results explain why certain observations of actively star-forming, dynamically older clouds, such as the Orion molecular cloud, do not appear to have any evidence of a lognormal distribution in their column density PDFs. We also study the evolution of the slope and deviation point ...

  9. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    Science.gov (United States)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form $p(S|L) \sim L^{-\psi(k)}$, where $k \equiv S/\ln[L/L_0]$, the large deviation function $\psi(k)$ is found explicitly, and $L_0$ is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model which we calculate for large L via a mapping to Majorana fermions.

  10. Pore size distribution, survival probability, and relaxation time in random and ordered arrays of fibers

    Science.gov (United States)

    Tomadakis, Manolis M.; Robertson, Teri J.

    2003-07-01

    We present a random walk based investigation of the pore size probability distribution and its moments, the survival probability and mean survival time, and the principal relaxation time, for random and ordered arrays of cylindrical fibers of various orientation distributions. The dimensionless mean survival time, principal relaxation time, mean pore size, and mean square pore size are found to increase with porosity, remain practically independent of the directionality of random fiber beds, and attain lower values for ordered arrays. Wide pore size distributions are obtained for random fiber structures and relatively narrow for ordered square arrays, all in very good agreement with theoretically predicted limiting values. Analytical results derived for the pore size probability and its lower moments for square arrays of fibers practically coincide with the corresponding simulation results. Earlier variational bounds on the mean survival time and principal relaxation time are obeyed by our numerical results in all cases, and are found to be quite sharp up to very high porosities. Dimensionless groups representing the deviation of such bounds from our simulation results vary in practically the same range as the corresponding values reported earlier for beds of spherical particles. A universal scaling expression of the literature relating the mean survival time to the mean pore size [S. Torquato and C. L. Y. Yeong, J. Chem. Phys. 106, 8814 (1997)] agrees very well with our results for all types of fiber structures, thus validated for the first time for anisotropic porous media.

  11. Optimal design of unit hydrographs using probability distribution and genetic algorithms

    Indian Academy of Sciences (India)

    Rajib Kumar Bhattacharjya

    2004-10-01

    A nonlinear optimization model is developed to transmute a unit hydrograph into a probability distribution function (PDF). The objective function is to minimize the sum of the square of the deviation between predicted and actual direct runoff hydrograph of a watershed. The predicted runoff hydrograph is estimated by using a PDF. In a unit hydrograph, the depth of rainfall excess must be unity and the ordinates must be positive. Incorporation of a PDF ensures that the depth of rainfall excess for the unit hydrograph is unity, and the ordinates are also positive. Unit hydrograph ordinates are in terms of intensity of rainfall excess on a discharge per unit catchment area basis, the unit area thus representing the unit rainfall excess. The proposed method does not have any constraint. The nonlinear optimization formulation is solved using binary-coded genetic algorithms. The number of variables to be estimated by optimization is the same as the number of probability distribution parameters; gamma and log-normal probability distributions are used. The existing nonlinear programming model for obtaining optimal unit hydrograph has also been solved using genetic algorithms, where the constrained nonlinear optimization problem is converted to an unconstrained problem using penalty parameter approach. The results obtained are compared with those obtained by the earlier LP model and are fairly similar.
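
    The paper fits the PDF parameters with binary-coded genetic algorithms; the sketch below uses a simple Nelder-Mead search instead, purely to illustrate the gamma-PDF parameterization of the unit hydrograph and the least-squares objective. The observed ordinates are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import gamma

    # Hypothetical observed unit-hydrograph ordinates (hourly, unit depth).
    t_obs = np.arange(1.0, 13.0)
    q_obs = np.array([0.02, 0.10, 0.19, 0.22, 0.18, 0.12,
                      0.08, 0.05, 0.02, 0.01, 0.005, 0.005])

    def sse(log_params):
        """Sum of squared deviations between observed ordinates and a
        gamma-PDF unit hydrograph; using a PDF guarantees unit rainfall
        excess depth and positive ordinates by construction."""
        shape, scale = np.exp(log_params)        # keep parameters positive
        return np.sum((gamma.pdf(t_obs, shape, scale=scale) - q_obs) ** 2)

    res = minimize(sse, x0=np.log([3.0, 1.5]), method="Nelder-Mead")
    shape, scale = np.exp(res.x)
    print(f"fitted gamma unit hydrograph: shape={shape:.2f}, scale={scale:.2f}")
    ```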

  12. Probability distribution of financial returns in a model of multiplicative Brownian motion with stochastic diffusion coefficient

    Science.gov (United States)

    Silva, Antonio

    2005-03-01

    It is well-known that the mathematical theory of Brownian motion was first developed in the Ph.D. thesis of Louis Bachelier for the French stock market before Einstein [1]. In Ref. [2] we studied the so-called Heston model, where the stock-price dynamics is governed by multiplicative Brownian motion with stochastic diffusion coefficient. We solved the corresponding Fokker-Planck equation exactly and found an analytic formula for the time-dependent probability distribution of stock price changes (returns). The formula interpolates between the exponential (tent-shaped) distribution for short time lags and the Gaussian (parabolic) distribution for long time lags. The theoretical formula agrees very well with the actual stock-market data ranging from the Dow-Jones index [2] to individual companies [3], such as Microsoft, Intel, etc. [1] Louis Bachelier, "Théorie de la spéculation," Annales Scientifiques de l'École Normale Supérieure, III-17:21-86 (1900). [2] A. A. Dragulescu and V. M. Yakovenko, "Probability distribution of returns in the Heston model with stochastic volatility," Quantitative Finance 2, 443-453 (2002); Erratum 3, C15 (2003). [cond-mat/0203046] [3] A. C. Silva, R. E. Prange, and V. M. Yakovenko, "Exponential distribution of financial returns at mesoscopic time lags: a new stylized fact," Physica A 344, 227-235 (2004). [cond-mat/0401225]

  13. Effect of Rain on Probability Distributions Fitted to Vehicle Time Headways

    Directory of Open Access Journals (Sweden)

    Hashim Mohammed Alhassan

    2012-01-01

    Time headway data generated under different rain conditions were fitted to probability distributions to see which ones best described the trends in headway behaviour in wet weather. Data were collected on the J5, a principal road in Johor Bahru, for two months, and the headways in the no-rain condition were analysed and compared to the rain-generated headway data. The results showed a decrease in headways between the no-rain and rain conditions, with further decreases as rainfall intensity increased. Between the no-rain and light rain conditions there was a 15.66% reduction in the mean headways; the mean headway reduction between no-rain and medium rain was 19.97%, and between no-rain and heavy rain 25.65%. This trend is already acknowledged in the literature. The Burr probability distribution ranked first among five others in describing the trends in headway behaviour during rainfall, passing the K-S, A2 and C-S goodness-of-fit tests at 95% and 99% respectively. The scale parameter of the Burr model and the P-value increased as the rain intensity increased, suggesting more vehicular clustering during rainfall, with the probability of clustering increasing with rain intensity. The coefficient of variation and skewness also pointed towards an increase in vehicle clustering. The Burr probability distribution can therefore be applied to model headways in rain and no-rain weather conditions, among others.
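
    Assuming the Burr distribution here is the Burr Type XII form available in SciPy, the fitting-and-ranking step can be sketched as follows; the headway sample is synthetic, standing in for the J5 field data.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic vehicle time headways in seconds (invented, for illustration).
    headways = stats.burr12.rvs(c=3.0, d=1.5, scale=2.5, size=2_000,
                                random_state=np.random.default_rng(3))

    # Fit a Burr XII distribution with location fixed at zero, then apply a
    # Kolmogorov-Smirnov goodness-of-fit test as in the paper's ranking.
    c, d, loc, scale = stats.burr12.fit(headways, floc=0.0)
    ks = stats.kstest(headways, "burr12", args=(c, d, loc, scale))
    print(f"c={c:.2f}, d={d:.2f}, scale={scale:.2f}, KS p={ks.pvalue:.3f}")
    ```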

  14. The Lognormal Probability Distribution Function of the Perseus Molecular Cloud: A Comparison of HI and Dust

    CERN Document Server

    Burkhart, Blakesley; Murray, Claire; Stanimirovic, Snezana

    2015-01-01

    The shape of the probability distribution function (PDF) of molecular clouds is an important ingredient for modern theories of star formation and turbulence. Recently, several studies have pointed out observational difficulties with constraining the low column density (i.e. Av <1) PDF using dust tracers. In order to constrain the shape and properties of the low column density probability distribution function, we investigate the PDF of multiphase atomic gas in the Perseus molecular cloud using opacity-corrected GALFA-HI data and compare the PDF shape and properties to the total gas PDF and the N(H2) PDF. We find that the shape of the PDF in the atomic medium of Perseus is well described by a lognormal distribution, and not by a power-law or bimodal distribution. The peak of the atomic gas PDF in and around Perseus lies at the HI-H2 transition column density for this cloud, past which the N(H2) PDF takes on a powerlaw form. We find that the PDF of the atomic gas is narrow and at column densities larger than...

  15. Criticality of the net-baryon number probability distribution at finite density

    Directory of Open Access Journals (Sweden)

    Kenji Morita

    2015-02-01

    We compute the probability distribution P(N) of the net-baryon number at finite temperature and quark-chemical potential, μ, at a physical value of the pion mass in the quark-meson model within the functional renormalization group scheme. For μ/T < 1, the model exhibits the chiral crossover transition which belongs to the universality class of the O(4) spin system in three dimensions. We explore the influence of the chiral crossover transition on the properties of the net-baryon number probability distribution, P(N). By considering ratios of P(N) to the Skellam function, with the same mean and variance, we unravel the characteristic features of the distribution that are related to O(4) criticality at the chiral crossover transition. We explore the corresponding ratios for data obtained at RHIC by the STAR Collaboration and discuss their implications. We also examine O(4) criticality in the context of binomial and negative-binomial distributions for the net proton number.
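
    The ratio construction used in the paper is straightforward to reproduce numerically. The sketch below builds the reference Skellam distribution with the same mean and variance as a toy net-baryon sample and forms P(N)/Skellam(N); for the Skellam-distributed toy data the ratio is flat at 1, and criticality would show up as a systematic departure from 1.

    ```python
    import numpy as np
    from scipy.stats import skellam

    def skellam_reference(net):
        """Skellam distribution matching the sample's mean and variance:
        mu1 - mu2 = mean(N), mu1 + mu2 = var(N)."""
        m, v = np.mean(net), np.var(net)
        return skellam((v + m) / 2.0, (v - m) / 2.0)

    rng = np.random.default_rng(4)
    net = rng.poisson(6.0, 500_000) - rng.poisson(4.0, 500_000)  # toy P(N)

    ref = skellam_reference(net)
    values, counts = np.unique(net, return_counts=True)
    ratio = counts / counts.sum() / ref.pmf(values)   # P(N) / Skellam(N)
    print(np.round(ratio[(values >= -2) & (values <= 2)], 3))    # ~1 here
    ```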

  16. Probability distribution functions of turbulence in seepage-affected alluvial channel

    Science.gov (United States)

    Sharma, Anurag; Kumar, Bimlesh

    2017-02-01

    The present experimental study is carried out on the probability distribution functions (PDFs) of turbulent flow characteristics within near-bed-surface and away-from-bed surfaces for both no-seepage and seepage flow. Laboratory experiments were conducted in the plane sand bed for the no seepage (NS), 10% seepage (10%S) and 15% seepage (15%S) cases. The experimental calculation of the PDFs of turbulent parameters such as Reynolds shear stress, velocity fluctuations, and bursting events is compared with the theoretical expression obtained from the Gram-Charlier (GC)-based exponential distribution. Experimental observations follow the computed PDF distributions for both no-seepage and seepage cases. The Jensen-Shannon divergence (JSD) method is used to measure the similarity between theoretical and experimental PDFs. The value of JSD for PDFs of velocity fluctuation lies between 0.0005 and 0.003, while the JSD value for PDFs of Reynolds shear stress varies between 0.001 and 0.006. Even with the application of seepage, the PDF distributions of bursting events, sweeps and ejections are well characterized by the exponential distribution of the GC series, except that a slight deflection of inward and outward interactions is observed, which may be due to weaker events. The value of JSD for outward and inward interactions ranges from 0.0013 to 0.032, while the JSD value for sweep and ejection events varies between 0.0001 and 0.0025. The theoretical expression for the PDF of turbulent intensity is developed in the present study, which agrees well with the experimental observations, with JSD between 0.007 and 0.015. The work presented is potentially applicable to the probability distribution of mobile-bed sediments in seepage-affected alluvial channels typically characterized by the various turbulent parameters. The purpose of PDF estimation from experimental data is that it provides a complete numerical description of the turbulent flow at a single point or a finite number of points.
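
    For readers unfamiliar with the similarity measure used above, a minimal sketch of a JSD computation between a measured histogram and a reference PDF follows; it uses a Gaussian reference rather than the paper's Gram-Charlier series, purely to show the mechanics, and the sample is synthetic.

    ```python
    import numpy as np
    from scipy.spatial.distance import jensenshannon

    rng = np.random.default_rng(5)
    u = rng.standard_normal(100_000)               # stand-in fluctuation data

    hist, edges = np.histogram(u, bins=60, range=(-5, 5), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    widths = np.diff(edges)
    gauss = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)   # reference PDF

    # SciPy returns the JS distance (square root of the divergence).
    jsd = jensenshannon(hist * widths, gauss * widths) ** 2
    print(f"JSD = {jsd:.5f}")
    ```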

  17. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    Directory of Open Access Journals (Sweden)

    Fonseca Rasmus

    2009-10-01

    Background: Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and helps reduce the search space significantly. However, random coil segments make up nearly 40% of proteins, and they do not have any apparent recurrent patterns, which complicates the overall prediction accuracy of protein structure prediction methods. Luckily, previous work has indicated that coil segments are in fact not completely random in structure, and flanking residues do seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been made previously, none have, to our knowledge, presented comparable results for the probability distribution of dihedral angles. Results: In this paper we develop an artificial neural network that uses an input window of amino acids to predict a dihedral angle probability distribution for the middle residue in the input window. The trained neural network shows a significant improvement (4-68%) in predicting the most probable bin (covering a 30° × 30° area of the dihedral angle space) for all amino acids in the data set compared to baseline statistics. An accuracy comparable to that of secondary structure prediction (≈ 80%) is achieved by observing the 20 bins with highest output values. Conclusion: Many different protein structure prediction methods exist and each uses different tools and auxiliary predictions to help determine the native structure. In this work the sequence is used to predict local context-dependent dihedral angle propensities in coil regions. This predicted distribution can potentially improve tertiary structure prediction

  18. Evolution Equation for a Joint Tomographic Probability Distribution of Spin-1 Particles

    Science.gov (United States)

    Korennoy, Ya. A.; Man'ko, V. I.

    2016-11-01

    The nine-component positive vector optical tomographic probability portrait of quantum state of spin-1 particles containing full spatial and spin information about the state without redundancy is constructed. Also the suggested approach is expanded to symplectic tomography representation and to representations with quasidistributions like Wigner function, Husimi Q-function, and Glauber-Sudarshan P-function. The evolution equations for constructed vector optical and symplectic tomograms and vector quasidistributions for arbitrary Hamiltonian are found. The evolution equations are also obtained in special case of the quantum system of charged spin-1 particle in arbitrary electro-magnetic field, which are analogs of non-relativistic Proca equation in appropriate representations. The generalization of proposed approach to the cases of arbitrary spin is discussed. The possibility of formulation of quantum mechanics of the systems with spins in terms of joint probability distributions without the use of wave functions or density matrices is explicitly demonstrated.

  19. Label Ranking with Abstention: Predicting Partial Orders by Thresholding Probability Distributions (Extended Abstract)

    CERN Document Server

    Cheng, Weiwei

    2011-01-01

    We consider an extension of the setting of label ranking, in which the learner is allowed to make predictions in the form of partial instead of total orders. Predictions of that kind are interpreted as a partial abstention: If the learner is not sufficiently certain regarding the relative order of two alternatives, it may abstain from this decision and instead declare these alternatives as being incomparable. We propose a new method for learning to predict partial orders that improves on an existing approach, both theoretically and empirically. Our method is based on the idea of thresholding the probabilities of pairwise preferences between labels as induced by a predicted (parameterized) probability distribution on the set of all rankings.
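
    A minimal sketch of the thresholding step, with invented pairwise probabilities; the paper induces these probabilities from a parameterized distribution over complete rankings and additionally guarantees consistency of the resulting partial order, which this fragment does not attempt.

    ```python
    def predict_partial_order(pair_probs, threshold=0.7):
        """Threshold pairwise preference probabilities into a partial order:
        predict a > b only if P(a > b) >= threshold, b > a only if
        P(b > a) >= threshold, and abstain (incomparable) otherwise."""
        order = []
        for (a, b), p in pair_probs.items():
            if p >= threshold:
                order.append((a, b))           # a ranked above b
            elif 1.0 - p >= threshold:
                order.append((b, a))           # b ranked above a
            # otherwise abstain on this pair
        return order

    probs = {("A", "B"): 0.92, ("A", "C"): 0.55, ("B", "C"): 0.81}
    print(predict_partial_order(probs))        # [('A', 'B'), ('B', 'C')]
    ```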

  20. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    Science.gov (United States)

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.

  1. The probability distribution functions of emission line flux measurements and their ratios

    CERN Document Server

    Wesson, R; Scicluna, P

    2016-01-01

    Many physical parameters in astrophysics are derived using the ratios of two observed quantities. If the relative uncertainties on measurements are small enough, uncertainties can be propagated analytically using simplifying assumptions, but for large normally distributed uncertainties, the probability distribution of the ratio becomes skewed, with a modal value offset from that expected in Gaussian uncertainty propagation. Furthermore, the most likely value of a ratio A/B is not equal to the reciprocal of the most likely value of B/A. The effect is most pronounced when the uncertainty on the denominator is larger than that on the numerator. We show that this effect is seen in an analysis of 12,126 spectra from the Sloan Digital Sky Survey. The intrinsically fixed ratio of the [O III] lines at 4959 and 5007 Å is conventionally expressed as the ratio of the stronger line to the weaker line. Thus, the uncertainty on the denominator is larger, and non-Gaussian probability distributions result. By taking thi...
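
    The skew and the mode asymmetry are easy to reproduce by simulation. In the sketch below the true ratio is 3 with a larger relative uncertainty on the denominator, loosely mimicking the [O III] 5007/4959 situation; the numbers are illustrative, not SDSS measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    A = rng.normal(3.0, 0.3, 1_000_000)   # stronger line, small relative error
    B = rng.normal(1.0, 0.3, 1_000_000)   # weaker line, large relative error

    def mode(x, lo, hi, bins=2_000):
        h, e = np.histogram(x, bins=bins, range=(lo, hi))
        i = np.argmax(h)
        return 0.5 * (e[i] + e[i + 1])

    # The modal value of A/B is offset from 3, and mode(A/B) differs from
    # the reciprocal of mode(B/A).
    print(mode(A / B, -2, 8), 1.0 / mode(B / A, -1, 2))
    ```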

  2. Comparison of Lauritzen-Spiegelhalter and successive restrictions algorithms for computing probability distributions in Bayesian networks

    Science.gov (United States)

    Smail, Linda

    2016-06-01

    The basic task of any probabilistic inference system in Bayesian networks is computing the posterior probability distribution for a subset or subsets of random variables, given values or evidence for some other variables from the same Bayesian network. Many methods and algorithms have been developed for exact and approximate inference in Bayesian networks. This work compares two exact inference methods in Bayesian networks, Lauritzen-Spiegelhalter and the successive restrictions algorithm, from the perspective of computational efficiency. The two methods were applied for comparison to a Chest Clinic Bayesian Network. Results indicate that the successive restrictions algorithm shows more computational efficiency than the Lauritzen-Spiegelhalter algorithm.

  3. Finite de Finetti theorem for conditional probability distributions describing physical theories

    Science.gov (United States)

    Christandl, Matthias; Toner, Ben

    2009-04-01

    We work in a general framework where the state of a physical system is defined by its behavior under measurement and the global state is constrained by no-signaling conditions. We show that the marginals of symmetric states in such theories can be approximated by convex combinations of independent and identical conditional probability distributions, generalizing the classical finite de Finetti theorem of Diaconis and Freedman. Our results apply to correlations obtained from quantum states even when there is no bound on the local dimension, so that known quantum de Finetti theorems cannot be used.

  4. Discrete coherent states and probability distributions in finite-dimensional spaces

    Energy Technology Data Exchange (ETDEWEB)

    Galetti, D.; Marchiolli, M.A.

    1995-06-01

    Operator bases are discussed in connection with the construction of phase space representatives of operators in finite-dimensional spaces and their properties are presented. It is also shown how these operator bases allow for the construction of a finite harmonic oscillator-like coherent state. Creation and annihilation operators for the Fock finite-dimensional space are discussed and their expressions in terms of the operator bases are explicitly written. The relevant finite-dimensional probability distributions are obtained, and their limiting behavior for an infinite-dimensional space is calculated and agrees with the well known results. (author). 20 refs, 2 figs.

  5. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham

    2017-04-07

    This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effects of the user activity factor (p) and the path-loss compensation factor on the uplink performance are analyzed. The results show that decreasing p and/or increasing the path-loss compensation factor reduces the CP variation around the spatially averaged value.

  6. Spectra and probability distributions of thermal flux in turbulent Rayleigh-Bénard convection

    CERN Document Server

    Pharasi, Hirdesh K; Kumar, Krishna; Bhattacharjee, Jayanta K

    2016-01-01

    The spectra of turbulent heat flux $\mathrm{H}(k)$ in Rayleigh-Bénard convection with and without uniform rotation are presented. The spectrum $\mathrm{H}(k)$ scales with wave number $k$ as $\sim k^{-2}$. The scaling exponent is almost independent of the Taylor number $\mathrm{Ta}$ and Prandtl number $\mathrm{Pr}$ for higher values of the reduced Rayleigh number $r$ ($> 10^3$). The exponent, however, depends on $\mathrm{Ta}$ and $\mathrm{Pr}$ for smaller values of $r$ ($<10^3$). The probability distribution functions of the local heat fluxes are non-Gaussian and have exponential tails.

  7. Wind speed analysis in La Ventosa, Mexico: a bimodal probability distribution case

    Energy Technology Data Exchange (ETDEWEB)

    Jaramillo, O.A.; Borja, M.A. [Energias No Convencionales, Morelos (Mexico). Instituto de Investigaciones Electricas

    2004-08-01

    The statistical characteristics of the wind speed in La Ventosa, Oaxaca, Mexico, have been analyzed by using wind speed data recorded by the Instituto de Investigaciones Electricas (IIE). By grouping the observations annually, seasonally and by wind direction, we show that the wind speed distribution, with calms included, is not represented by the typical two-parameter Weibull function. A mathematical formulation using a bimodal Weibull and Weibull probability distribution function (PDF) has been developed to analyse the wind speed frequency distribution in that region. The model developed here can be applied to similar regions where the wind speed distribution presents a bimodal PDF. The two-parameter Weibull wind speed distribution must not be generalised, since it is not accurate to represent some wind regimes, as in the case of La Ventosa, Mexico. The analysis of wind data shows that computing the capacity factor for wind power plants to be installed in La Ventosa must be carried out by means of a bimodal PDF instead of the typical Weibull PDF. Otherwise, the capacity factor will be underestimated. (author)
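
    A sketch of the bimodal (Weibull & Weibull) PDF used in such analyses; the weight and the two shape/scale pairs below are invented for illustration, not the fitted La Ventosa values.

    ```python
    import numpy as np

    def weibull_pdf(v, k, c):
        """Two-parameter Weibull PDF with shape k and scale c."""
        return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

    def bimodal_weibull_pdf(v, w, k1, c1, k2, c2):
        """Mixture of two Weibull PDFs: w weights the calm mode and 1 - w
        the strong-wind mode."""
        return w * weibull_pdf(v, k1, c1) + (1.0 - w) * weibull_pdf(v, k2, c2)

    v = np.linspace(0.01, 30.0, 600)       # wind speed grid, m/s
    pdf = bimodal_weibull_pdf(v, w=0.45, k1=1.8, c1=4.0, k2=3.2, c2=14.0)
    print("normalization check:", np.trapz(pdf, v))   # ~1
    ```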

  8. Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.

    2013-04-01

    Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions are then used to fit the long-term statistical distribution at different locations along raceways. The long-term distribution of the bearing raceway loads are estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.

  9. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Science.gov (United States)

    Cherniz, Analía S.; Bonell, Claudia E.; Tabernig, Carolina B.

    2007-11-01

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process, with a zero mean. It has been experimentally proved that this probability distribution can be adjusted with less error to a Laplacian type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions to the superior motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.

  10. Study of the SEMG probability distribution of the paretic tibialis anterior muscle

    Energy Technology Data Exchange (ETDEWEB)

Cherniz, Analía S; Bonell, Claudia E; Tabernig, Carolina B [Laboratorio de Ingenieria de Rehabilitacion e Investigaciones Neuromusculares y Sensoriales, Facultad de Ingenieria, UNER, Oro Verde (Argentina)

    2007-11-15

    The surface electromyographic signal is a stochastic signal that has been modeled as a Gaussian process, with a zero mean. It has been experimentally proved that this probability distribution can be adjusted with less error to a Laplacian type distribution. The selection of estimators for the detection of changes in the amplitude of the muscular signal depends, among other things, on the type of distribution. In the case of subjects with lesions to the superior motor neuron, the lack of central control affects the muscular tone, the force and the patterns of muscular movement involved in activities such as the gait cycle. In this work, the distribution types of the SEMG signal amplitudes of the tibialis anterior muscle are evaluated during gait, both in two healthy subjects and in two hemiparetic ones in order to select the estimators that best characterize them. It was observed that the Laplacian distribution function would be the one that best adjusts to the experimental data in the studied subjects, although this largely depends on the subject and on the data segment analyzed.

  11. Measurement of higher cumulants of net-charge multiplicity distributions in Au+Au collisions at √(s_NN) = 7.7-200 GeV

    Science.gov (United States)

    Adare, A.; Afanasiev, S.; Aidala, C.; Ajitanand, N. N.; Akiba, Y.; Akimoto, R.; Al-Bataineh, H.; Alexander, J.; Al-Ta'Ani, H.; Angerami, A.; Aoki, K.; Apadula, N.; Aramaki, Y.; Asano, H.; Aschenauer, E. C.; Atomssa, E. T.; Averbeck, R.; Awes, T. C.; Azmoun, B.; Babintsev, V.; Bai, M.; Baksay, G.; Baksay, L.; Bannier, B.; Barish, K. N.; Bassalleck, B.; Basye, A. T.; Bathe, S.; Baublis, V.; Baumann, C.; Baumgart, S.; Bazilevsky, A.; Belikov, S.; Belmont, R.; Bennett, R.; Berdnikov, A.; Berdnikov, Y.; Bickley, A. A.; Black, D.; Blau, D. S.; Bok, J. S.; Boyle, K.; Brooks, M. L.; Bryslawskyj, J.; Buesching, H.; Bumazhnov, V.; Bunce, G.; Butsyk, S.; Camacho, C. M.; Campbell, S.; Castera, P.; Chen, C.-H.; Chi, C. Y.; Chiu, M.; Choi, I. J.; Choi, J. B.; Choi, S.; Choudhury, R. K.; Christiansen, P.; Chujo, T.; Chung, P.; Chvala, O.; Cianciolo, V.; Citron, Z.; Cole, B. A.; Connors, M.; Constantin, P.; Cronin, N.; Crossette, N.; Csanád, M.; Csörgő, T.; Dahms, T.; Dairaku, S.; Danchev, I.; Das, K.; Datta, A.; Daugherity, M. S.; David, G.; Dehmelt, K.; Denisov, A.; Deshpande, A.; Desmond, E. J.; Dharmawardane, K. V.; Dietzsch, O.; Ding, L.; Dion, A.; Do, J. H.; Donadelli, M.; D'Orazio, L.; Drapier, O.; Drees, A.; Drees, K. A.; Durham, J. M.; Durum, A.; Dutta, D.; Edwards, S.; Efremenko, Y. V.; Ellinghaus, F.; Engelmore, T.; Enokizono, A.; En'yo, H.; Esumi, S.; Eyser, K. O.; Fadem, B.; Fields, D. E.; Finger, M.; Finger, M.; Fleuret, F.; Fokin, S. L.; Fraenkel, Z.; Frantz, J. E.; Franz, A.; Frawley, A. D.; Fujiwara, K.; Fukao, Y.; Fusayasu, T.; Gainey, K.; Gal, C.; Garg, P.; Garishvili, A.; Garishvili, I.; Giordano, F.; Glenn, A.; Gong, H.; Gong, X.; Gonin, M.; Goto, Y.; Granier de Cassagnac, R.; Grau, N.; Greene, S. V.; Grosse Perdekamp, M.; Gu, Y.; Gunji, T.; Guo, L.; Gustafsson, H.-Å.; Hachiya, T.; Haggerty, J. S.; Hahn, K. I.; Hamagaki, H.; Hamblen, J.; Han, R.; Hanks, J.; Hartouni, E. P.; Hashimoto, K.; Haslum, E.; Hayano, R.; Hayashi, S.; He, X.; Heffner, M.; Hemmick, T. K.; Hester, T.; Hill, J. C.; Hohlmann, M.; Hollis, R. S.; Holzmann, W.; Homma, K.; Hong, B.; Horaguchi, T.; Hori, Y.; Hornback, D.; Huang, S.; Ichihara, T.; Ichimiya, R.; Ide, J.; Iinuma, H.; Ikeda, Y.; Imai, K.; Imazu, Y.; Imrek, J.; Inaba, M.; Iordanova, A.; Isenhower, D.; Ishihara, M.; Isinhue, A.; Isobe, T.; Issah, M.; Isupov, A.; Ivanishchev, D.; Jacak, B. V.; Javani, M.; Jia, J.; Jiang, X.; Jin, J.; Johnson, B. M.; Joo, K. S.; Jouan, D.; Jumper, D. S.; Kajihara, F.; Kametani, S.; Kamihara, N.; Kamin, J.; Kaneti, S.; Kang, B. H.; Kang, J. H.; Kang, J. S.; Kapustinsky, J.; Karatsu, K.; Kasai, M.; Kawall, D.; Kawashima, M.; Kazantsev, A. V.; Kempel, T.; Key, J. A.; Khandai, P. K.; Khanzadeev, A.; Kijima, K. M.; Kim, B. I.; Kim, C.; Kim, D. H.; Kim, D. J.; Kim, E.; Kim, E.-J.; Kim, H. J.; Kim, K.-B.; Kim, S. H.; Kim, Y.-J.; Kim, Y. K.; Kinney, E.; Kiriluk, K.; Kiss, Á.; Kistenev, E.; Klatsky, J.; Kleinjan, D.; Kline, P.; Kochenda, L.; Komatsu, Y.; Komkov, B.; Konno, M.; Koster, J.; Kotchetkov, D.; Kotov, D.; Kozlov, A.; Král, A.; Kravitz, A.; Krizek, F.; Kunde, G. J.; Kurita, K.; Kurosawa, M.; Kwon, Y.; Kyle, G. S.; Lacey, R.; Lai, Y. S.; Lajoie, J. G.; Lebedev, A.; Lee, B.; Lee, D. M.; Lee, J.; Lee, K.; Lee, K. B.; Lee, K. S.; Lee, S. H.; Lee, S. R.; Leitch, M. J.; Leite, M. A. L.; Leitgab, M.; Leitner, E.; Lenzi, B.; Lewis, B.; Li, X.; Liebing, P.; Lim, S. H.; Linden Levy, L. A.; Liška, T.; Litvinenko, A.; Liu, H.; Liu, M. X.; Love, B.; Luechtenborg, R.; Lynch, D.; Maguire, C. F.; Makdisi, Y. 
I.; Makek, M.; Malakhov, A.; Malik, M. D.; Manion, A.; Manko, V. I.; Mannel, E.; Mao, Y.; Maruyama, T.; Masui, H.; Masumoto, S.; Matathias, F.; McCumber, M.; McGaughey, P. L.; McGlinchey, D.; McKinney, C.; Means, N.; Meles, A.; Mendoza, M.; Meredith, B.; Miake, Y.; Mibe, T.; Midori, J.; Mignerey, A. C.; Mikeš, P.; Miki, K.; Milov, A.; Mishra, D. K.; Mishra, M.; Mitchell, J. T.; Miyachi, Y.; Miyasaka, S.; Mohanty, A. K.; Mohapatra, S.; Moon, H. J.; Morino, Y.; Morreale, A.; Morrison, D. P.; Moskowitz, M.; Motschwiller, S.; Moukhanova, T. V.; Murakami, T.; Murata, J.; Mwai, A.; Nagae, T.; Nagamiya, S.; Nagle, J. L.; Naglis, M.; Nagy, M. I.; Nakagawa, I.; Nakamiya, Y.; Nakamura, K. R.; Nakamura, T.; Nakano, K.; Nattrass, C.; Nederlof, A.; Netrakanti, P. K.; Newby, J.; Nguyen, M.; Nihashi, M.; Niida, T.; Nouicer, R.; Novitzky, N.; Nukariya, A.; Nyanin, A. S.; Obayashi, H.; O'Brien, E.; Oda, S. X.; Ogilvie, C. A.; Oka, M.; Okada, K.; Onuki, Y.; Oskarsson, A.; Ouchida, M.; Ozawa, K.; Pak, R.; Pantuev, V.; Papavassiliou, V.; Park, B. H.; Park, I. H.; Park, J.; Park, S.; Park, S. K.; Park, W. J.; Pate, S. F.; Patel, L.; Pei, H.; Peng, J.-C.; Pereira, H.; Perepelitsa, D. V.; Peresedov, V.; Peressounko, D. Yu.; Petti, R.; Pinkenburg, C.; Pisani, R. P.; Proissl, M.; Purschke, M. L.; Purwar, A. K.; Qu, H.; Rak, J.; Rakotozafindrabe, A.; Ravinovich, I.; Read, K. F.; Reygers, K.; Reynolds, D.; Riabov, V.; Riabov, Y.; Richardson, E.; Riveli, N.; Roach, D.; Roche, G.; Rolnick, S. D.; Rosati, M.; Rosen, C. A.; Rosendahl, S. S. E.; Rosnet, P.; Rukoyatkin, P.; Ružička, P.; Ryu, M. S.; Sahlmueller, B.; Saito, N.; Sakaguchi, T.; Sakashita, K.; Sako, H.; Samsonov, V.; Sano, M.; Sano, S.; Sarsour, M.; Sato, S.; Sato, T.; Sawada, S.; Sedgwick, K.; Seele, J.; Seidl, R.; Semenov, A. Yu.; Sen, A.; Seto, R.; Sett, P.; Sharma, D.; Shein, I.; Shibata, T.-A.; Shigaki, K.; Shimomura, M.; Shoji, K.; Shukla, P.; Sickles, A.; Silva, C. L.; Silvermyr, D.; Silvestre, C.; Sim, K. S.; Singh, B. K.; Singh, C. P.; Singh, V.; Skolnik, M.; Slunečka, M.; Solano, S.; Soltz, R. A.; Sondheim, W. E.; Sorensen, S. P.; Sourikova, I. V.; Sparks, N. A.; Stankus, P. W.; Steinberg, P.; Stenlund, E.; Stepanov, M.; Ster, A.; Stoll, S. P.; Sugitate, T.; Sukhanov, A.; Sun, J.; Sziklai, J.; Takagui, E. M.; Takahara, A.; Taketani, A.; Tanabe, R.; Tanaka, Y.; Taneja, S.; Tanida, K.; Tannenbaum, M. J.; Tarafdar, S.; Taranenko, A.; Tarján, P.; Tennant, E.; Themann, H.; Thomas, T. L.; Todoroki, T.; Togawa, M.; Toia, A.; Tomášek, L.; Tomášek, M.; Torii, H.; Towell, R. S.; Tserruya, I.; Tsuchimoto, Y.; Tsuji, T.; Vale, C.; Valle, H.; van Hecke, H. W.; Vargyas, M.; Vazquez-Zambrano, E.; Veicht, A.; Velkovska, J.; Vértesi, R.; Vinogradov, A. A.; Virius, M.; Voas, B.; Vossen, A.; Vrba, V.; Vznuzdaev, E.; Wang, X. R.; Watanabe, D.; Watanabe, K.; Watanabe, Y.; Watanabe, Y. S.; Wei, F.; Wei, R.; Wessels, J.; Whitaker, S.; White, S. N.; Winter, D.; Wolin, S.; Wood, J. P.; Woody, C. L.; Wright, R. M.; Wysocki, M.; Xia, B.; Xie, W.; Yamaguchi, Y. L.; Yamaura, K.; Yang, R.; Yanovich, A.; Ying, J.; Yokkaichi, S.; You, Z.; Young, G. R.; Younus, I.; Yushmanov, I. E.; Zajc, W. A.; Zelenski, A.; Zhang, C.; Zhou, S.; Zolin, L.; Phenix Collaboration

    2016-01-01

    We report the measurement of cumulants (Cn, n = 1, ..., 4) of the net-charge distributions measured within pseudorapidity (|η| < 0.35) in Au+Au collisions at √(s_NN) = 7.7-200 GeV with the PHENIX experiment at the Relativistic Heavy Ion Collider. The ratios of cumulants (e.g., C1/C2, C3/C1) of the net-charge distributions, which can be related to volume-independent susceptibility ratios, are studied as a function of centrality and energy. These quantities are important to understand the quantum-chromodynamics phase diagram and the possible existence of a critical end point. The measured values are very well described by the expectation from negative binomial distributions. We do not observe any nonmonotonic behavior in the ratios of the cumulants as a function of collision energy. The measured values of C1/C2 and C3/C1 can be directly compared to lattice quantum-chromodynamics calculations and thus allow extraction of both the chemical freeze-out temperature and the baryon chemical potential at each center-of-mass energy. The extracted baryon chemical potentials are in excellent agreement with a thermal-statistical analysis model.

  12. Measurement of higher cumulants of net-charge multiplicity distributions in Au$+$Au collisions at $\\sqrt{s_{_{NN}}}=7.7-200$ GeV

    CERN Document Server

    Adare, A; Aidala, C; Ajitanand, N N; Akiba, Y; Akimoto, R; Al-Bataineh, H; Alexander, J; Al-Ta'ani, H; Angerami, A; Aoki, K; Apadula, N; Aramaki, Y; Asano, H; Aschenauer, E C; Atomssa, E T; Averbeck, R; Awes, T C; Azmoun, B; Babintsev, V; Bai, M; Baksay, G; Baksay, L; Bannier, B; Barish, K N; Bassalleck, B; Basye, A T; Bathe, S; Baublis, V; Baumann, C; Baumgart, S; Bazilevsky, A; Belikov, S; Belmont, R; Bennett, R; Berdnikov, A; Berdnikov, Y; Bickley, A A; Black, D; Blau, D S; Bok, J S; Boyle, K; Brooks, M L; Bryslawskyj, J; Buesching, H; Bumazhnov, V; Bunce, G; Butsyk, S; Camacho, C M; Campbell, S; Castera, P; Chen, C -H; Chi, C Y; Chiu, M; Choi, I J; Choi, J B; Choi, S; Choudhury, R K; Christiansen, P; Chujo, T; Chung, P; Chvala, O; Cianciolo, V; Citron, Z; Cole, B A; Connors, M; Constantin, P; Cronin, N; Crossette, N; Csanád, M; Csörgő, T; Dahms, T; Dairaku, S; Danchev, I; Das, K; Datta, A; Daugherity, M S; David, G; Dehmelt, K; Denisov, A; Deshpande, A; Desmond, E J; Dharmawardane, K V; Dietzsch, O; Ding, L; Dion, A; Do, J H; Donadelli, M; D'Orazio, L; Drapier, O; Drees, A; Drees, K A; Durham, J M; Durum, A; Dutta, D; Edwards, S; Efremenko, Y V; Ellinghaus, F; Engelmore, T; Enokizono, A; En'yo, H; Esumi, S; Eyser, K O; Fadem, B; Fields, D E; Finger, M; Jr., \\,; Fleuret, F; Fokin, S L; Fraenkel, Z; Frantz, J E; Franz, A; Frawley, A D; Fujiwara, K; Fukao, Y; Fusayasu, T; Gainey, K; Gal, C; Garg, P; Garishvili, A; Garishvili, I; Giordano, F; Glenn, A; Gong, H; Gong, X; Gonin, M; Goto, Y; de Cassagnac, R Granier; Grau, N; Greene, S V; Perdekamp, M Grosse; Gu, Y; Gunji, T; Guo, L; Gustafsson, H -Å; Hachiya, T; Haggerty, J S; Hahn, K I; Hamagaki, H; Hamblen, J; Han, R; Hanks, J; Hartouni, E P; Hashimoto, K; Haslum, E; Hayano, R; Hayashi, S; He, X; Heffner, M; Hemmick, T K; Hester, T; Hill, J C; Hohlmann, M; Hollis, R S; Holzmann, W; Homma, K; Hong, B; Horaguchi, T; Hori, Y; Hornback, D; Huang, S; Ichihara, T; Ichimiya, R; Ide, J; Iinuma, H; Ikeda, Y; Imai, K; Imazu, Y; Imrek, J; Inaba, M; Iordanova, A; Isenhower, D; Ishihara, M; Isinhue, A; Isobe, T; Issah, M; Isupov, A; Ivanishchev, D; Jacak, B V; Javani, M; Jia, J; Jiang, X; Jin, J; Johnson, B M; Joo, K S; Jouan, D; Jumper, D S; Kajihara, F; Kametani, S; Kamihara, N; Kamin, J; Kaneti, S; Kang, B H; Kang, J H; Kang, J S; Kapustinsky, J; Karatsu, K; Kasai, M; Kawall, D; Kawashima, M; Kazantsev, A V; Kempel, T; Key, J A; Khandai, P K; Khanzadeev, A; Kijima, K M; Kim, B I; Kim, C; Kim, D H; Kim, D J; Kim, E; Kim, E -J; Kim, H J; Kim, K -B; Kim, S H; Kim, Y -J; Kim, Y K; Kinney, E; Kiriluk, K; Kiss, Á; Kistenev, E; Klatsky, J; Kleinjan, D; Kline, P; Kochenda, L; Komatsu, Y; Komkov, B; Konno, M; Koster, J; Kotchetkov, D; Kotov, D; Kozlov, A; Král, A; Kravitz, A; Krizek, F; Kunde, G J; Kurita, K; Kurosawa, M; Kwon, Y; Kyle, G S; Lacey, R; Lai, Y S; Lajoie, J G; Lebedev, A; Lee, B; Lee, D M; Lee, J; Lee, K; Lee, K B; Lee, K S; Lee, S H; Lee, S R; Leitch, M J; Leite, M A L; Leitgab, M; Leitner, E; Lenzi, B; Lewis, B; Li, X; Liebing, P; Lim, S H; Levy, L A Linden; Liška, T; Litvinenko, A; Liu, H; Liu, M X; Love, B; Luechtenborg, R; Lynch, D; Maguire, C F; Makdisi, Y I; Makek, M; Malakhov, A; Malik, M D; Manion, A; Manko, V I; Mannel, E; Mao, Y; Maruyama, T; Masui, H; Masumoto, S; Matathias, F; McCumber, M; McGaughey, P L; McGlinchey, D; McKinney, C; Means, N; Meles, A; Mendoza, M; Meredith, B; Miake, Y; Mibe, T; Midori, J; Mignerey, A C; Mikeš, P; Miki, K; Milov, A; Mishra, D K; Mishra, M; Mitchell, J T; Miyachi, Y; Miyasaka, S; Mohanty, A K; 
Mohapatra, S; Moon, H J; Morino, Y; Morreale, A; Morrison, D P; Moskowitz, M; Motschwiller, S; Moukhanova, T V; Murakami, T; Murata, J; Mwai, A; Nagae, T; Nagamiya, S; Nagle, J L; Naglis, M; Nagy, M I; Nakagawa, I; Nakamiya, Y; Nakamura, K R; Nakamura, T; Nakano, K; Nattrass, C; Nederlof, A; Netrakanti, P K; Newby, J; Nguyen, M; Nihashi, M; Niida, T; Nouicer, R; Novitzky, N; Nukariya, A; Nyanin, A S; Obayashi, H; O'Brien, E; Oda, S X; Ogilvie, C A; Oka, M; Okada, K; Onuki, Y; Oskarsson, A; Ouchida, M; Ozawa, K; Pak, R; Pantuev, V; Papavassiliou, V; Park, B H; Park, I H; Park, J; Park, S; Park, S K; Park, W J; Pate, S F; Patel, L; Pei, H; Peng, J -C; Pereira, H; Perepelitsa, D V; Peresedov, V; Peressounko, D Yu; Petti, R; Pinkenburg, C; Pisani, R P; Proissl, M; Purschke, M L; Purwar, A K; Qu, H; Rak, J; Rakotozafindrabe, A; Ravinovich, I; Read, K F; Reygers, K; Reynolds, D; Riabov, V; Riabov, Y; Richardson, E; Riveli, N; Roach, D; Roche, G; Rolnick, S D; Rosati, M; Rosen, C A; Rosendahl, S S E; Rosnet, P; Rukoyatkin, P; Ružička, P; Ryu, M S; Sahlmueller, B; Saito, N; Sakaguchi, T; Sakashita, K; Sako, H; Samsonov, V; Sano, M; Sano, S; Sarsour, M; Sato, S; Sato, T; Sawada, S; Sedgwick, K; Seele, J; Seidl, R; Semenov, A Yu; Sen, A; Seto, R; Sett, P; Sharma, D; Shein, I; Shibata, T -A; Shigaki, K; Shimomura, M; Shoji, K; Shukla, P; Sickles, A

    2015-01-01

    We report the measurement of cumulants ($C_n, n=1\\ldots4$) of the net-charge distributions measured within pseudorapidity ($|\\eta|<0.35$) in Au$+$Au collisions at $\\sqrt{s_{_{NN}}}=7.7-200$ GeV with the PHENIX experiment at the Relativistic Heavy Ion Collider. The ratios of cumulants (e.g. $C_1/C_2$, $C_3/C_1$) of the net-charge distributions, which can be related to volume independent susceptibility ratios, are studied as a function of centrality and energy. These quantities are important to understand the quantum-chromodynamics phase diagram and possible existence of a critical end point. The measured values are very well described by expectation from negative binomial distributions. We do not observe any nonmonotonic behavior in the ratios of the cumulants as a function of collision energy. The measured values of $C_1/C_2 = \\mu/\\sigma^2$ and $C_3/C_1 = S\\sigma^3/\\mu$ can be directly compared to lattice quantum-chromodynamics calculations and thus allow extraction of both the chemical freeze-out temperat...

  13. Wave Packet Dynamics in the Infinite Square Well with the Wigner Quasi-probability Distribution

    Science.gov (United States)

    Belloni, Mario; Doncheski, Michael; Robinett, Richard

    2004-05-01

    Over the past few years a number of authors have been interested in the time evolution and revivals of Gaussian wave packets in one-dimensional infinite wells and in two-dimensional infinite wells of various geometries. In all of these circumstances, the wave function is guaranteed to revive at a time related to the inverse of the system's ground state energy, if not sooner. To better visualize these revivals we have calculated the time-dependent Wigner quasi-probability distribution for position and momentum, P_W(x; p), for Gaussian wave packet solutions of this system. The Wigner quasi-probability distribution clearly demonstrates the short-term semi-classical time dependence, as well as longer-term revival behavior and the structure during the collapsed state. This tool also provides an excellent way of demonstrating the patterns of highly-correlated Schrödinger-cat-like `mini-packets' which appear at fractional multiples of the exact revival time. This research is supported in part by a Research Corporation Cottrell College Science Award (CC5470) and the National Science Foundation under contracts DUE-0126439 and DUE-9950702.

  14. A Voting Based Approach to Detect Recursive Order Number of Photocopy Documents Using Probability Distributions

    Directory of Open Access Journals (Sweden)

    Rani K

    2014-08-01

    Photocopied documents are very common in our normal life. People are permitted to carry and present photocopied documents to avoid damage to the original documents. But this provision is misused for temporary benefits by fabricating fake photocopied documents. Fabrication of fake photocopied documents is possible only at the 2nd and higher recursive orders of photocopy. Whenever a photocopied document is submitted, it may be required to check its originality. When the document is a 1st order photocopy, chances of fabrication may be ignored. On the other hand, when the photocopy order is 2nd or above, the probability of fabrication may be suspected. Hence, when a photocopied document is presented, the recursive order number of the photocopy is to be estimated to ascertain its originality. This requirement demands investigating methods to estimate the order number of a photocopy. In this work, a voting based approach to detect the recursive order number of a photocopied document using exponential, extreme value and lognormal probability distributions is proposed. A detailed experimentation is performed on a generated data set and the method exhibits efficiency close to 89%.

  15. Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data

    Directory of Open Access Journals (Sweden)

Jayajit Das

    2015-07-01

    A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
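
    The straightforward n ≤ m direction of the abstract's worked example can be verified directly; the sketch below confirms by simulation that X = Y1 + Y2 has the triangular density on [0, 2]. The MaxEnt machinery for the harder n > m case is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    y = rng.uniform(0.0, 1.0, size=(2, 1_000_000))
    x = y.sum(axis=0)                       # X = Y1 + Y2

    hist, edges = np.histogram(x, bins=40, range=(0, 2), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])

    # Exact density: q(x) = x on [0, 1], q(x) = 2 - x on (1, 2].
    exact = np.where(centers <= 1.0, centers, 2.0 - centers)
    print("max |empirical - exact| =", np.abs(hist - exact).max())
    ```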

  16. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.

  17. An informative prior probability distribution of the gompertz parameters for bayesian approaches in paleodemography.

    Science.gov (United States)

    Sasaki, Tomohiko; Kondo, Osamu

    2016-03-01

    In paleodemography, the Bayesian approach has been suggested to provide an effective means by which mortality profiles of past populations can be adequately estimated, and thus avoid problems of "age-mimicry" inherent in conventional approaches. In this study, we propose an application of the Gompertz model using an "informative" prior probability distribution by revising a recent example of the Bayesian approach based on an "uninformative" distribution. Life-table data of 134 human populations including those of contemporary hunter-gatherers were used to determine the Gompertz parameters of each population. In each population, we used both raw life-table data and the Gompertz parameters to calculate some demographic values such as the mean life-span, to confirm representativeness of the model. Then, the correlation between the two Gompertz parameters (the Strehler-Mildvan correlation) was re-established. We incorporated the correlation into the Bayesian approach as an "informative" prior probability distribution, and tested its effectiveness using simulated data. Our analyses showed that the mean life-span (≥ age 15) and the proportion of living persons aging over 45 were well-reproduced by the Gompertz model. The simulation showed that using the correlation as an informative prior provides a narrower estimation range in the Bayesian approach than does the uninformative prior. The Gompertz model can be assumed to accurately estimate the mean life-span and/or the proportion of old people in a population. We suggest that the Strehler-Mildvan correlation can be used as a useful constraint in demographic reconstructions of past human populations. © 2015 Wiley Periodicals, Inc.
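
    For orientation, a sketch of the demographic quantities mentioned above under an assumed Gompertz hazard h(x) = a·exp(b·(x − x0)); the parameter values are invented, not estimates from the 134 populations.

    ```python
    import numpy as np

    def gompertz_survival(x, a, b, x0=15.0):
        """Survival from age x0 under the Gompertz hazard
        h(x) = a * exp(b * (x - x0)):
        S(x) = exp(-(a / b) * (exp(b * (x - x0)) - 1))."""
        return np.exp(-(a / b) * (np.exp(b * (x - x0)) - 1.0))

    a, b = 0.015, 0.045                     # illustrative Gompertz parameters
    ages = np.linspace(15.0, 110.0, 2_000)
    S = gompertz_survival(ages, a, b)

    mean_lifespan = 15.0 + np.trapz(S, ages)     # mean life-span (>= age 15)
    prop_over_45 = gompertz_survival(45.0, a, b) # proportion surviving to 45
    print(f"mean life-span: {mean_lifespan:.1f} yr, S(45) = {prop_over_45:.2f}")
    ```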

  18. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
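
    A minimal sketch of the two-stage recipe the abstract describes (spectral shaping of white Gaussian noise, then a pointwise probability integral transform to the target marginal); the power-law spectrum and gamma target below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 256

# Stage 1: white Gaussian field -> colored Gaussian field with a chosen spectrum.
white = rng.standard_normal((n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.hypot(kx, ky)
k[0, 0] = 1.0 / n                                 # avoid division by zero at DC
field = np.fft.ifft2(np.fft.fft2(white) * k ** -1.5).real
field = (field - field.mean()) / field.std()      # standardized colored Gaussian

# Stage 2: map the Gaussian marginal onto the desired (here: gamma) PDF.
u = stats.norm.cdf(field)                         # Gaussian -> uniform
sample = stats.gamma(a=2.0).ppf(u)                # uniform -> target marginal
```

    The pointwise map preserves the rank ordering but only approximately preserves the prescribed spectrum, which is presumably why the abstract calls this an engineering approach.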

  19. Measurement of the ratio of the sixth order to the second order cumulant of net-proton multiplicity distributions in relativistic heavy-ion collisions

    CERN Document Server

    Chen, Lizhu; Cui, Fenping; Wu, Yuanfang

    2016-01-01

    We investigate the measurement of the sixth order cumulant and its ratio to the second order cumulant ($C_6/C_2$) in relativistic heavy-ion collisions. The influence of statistics and different methods of centrality bin width correction on $C_6/C_2$ of net-proton multiplicity distributions is demonstrated. There is no satisfactory method to extract $C_6/C_2$ with the current statistics recorded at lower energies by STAR at RHIC. With statistics comparable to those expected at the planned future RHIC Beam Energy Scan II (BES II), no energy dependence of $C_6/C_2$ is observed in central collisions using the UrQMD model. We find that if the transition signal is as strong as predicted by the PQM model, it can hopefully be observed at the upcoming RHIC BES II.

  20. Measurement of the ratio of the sixth order to the second order cumulant of net-proton multiplicity distributions in relativistic heavy-ion collisions

    Science.gov (United States)

    Chen, Lizhu; Li, Zhiming; Cui, Fenping; Wu, Yuanfang

    2017-01-01

    We investigate the measurement of the sixth order cumulant and its ratio to the second order cumulant (C6/C2) in relativistic heavy-ion collisions. The influence of statistics and different methods of centrality bin width correction on C6/C2 of net-proton multiplicity distributions is demonstrated. There is no satisfactory method to extract C6/C2 with the current statistics recorded at lower energies by STAR at RHIC. With statistics comparable to those expected at the planned future RHIC Beam Energy Scan II (BES II), no energy dependence of C6/C2 is observed in central collisions using the UrQMD model. We find that if the transition signal is as strong as predicted by the PQM model, it can hopefully be observed at the upcoming RHIC BES II.
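
    For reference, a minimal sketch of estimating these cumulants from event-by-event net-proton numbers via the standard central-moment-to-cumulant relations (generic statistics, not the STAR analysis chain; centrality bin width corrections are omitted):

```python
import numpy as np

def cumulants_up_to_6(samples):
    """Cumulants C2..C6 of a 1-D sample from its central moments."""
    x = np.asarray(samples, dtype=float)
    mu = [np.mean((x - x.mean()) ** r) for r in range(7)]   # mu[r] = r-th central moment
    c2 = mu[2]
    c3 = mu[3]
    c4 = mu[4] - 3 * mu[2] ** 2
    c5 = mu[5] - 10 * mu[3] * mu[2]
    c6 = mu[6] - 15 * mu[4] * mu[2] - 10 * mu[3] ** 2 + 30 * mu[2] ** 3
    return c2, c3, c4, c5, c6

# For a Poisson sample all cumulants equal the mean, so C6/C2 = 1 in expectation.
# Even with 10^6 events the C6 estimate fluctuates strongly -- the statistics
# problem the abstract highlights.
rng = np.random.default_rng(2)
c2, *_, c6 = cumulants_up_to_6(rng.poisson(10, size=1_000_000))
print(c6 / c2)
```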

  1. An improved multilevel Monte Carlo method for estimating probability distribution functions in stochastic oil reservoir simulations

    Science.gov (United States)

    Lu, Dan; Zhang, Guannan; Webster, Clayton; Barbier, Charlotte

    2016-12-01

    In this work, we develop an improved multilevel Monte Carlo (MLMC) method for estimating cumulative distribution functions (CDFs) of a quantity of interest coming from numerical approximation of large-scale stochastic subsurface simulations. Compared with Monte Carlo (MC) methods, which require a very large number of high-fidelity model executions to achieve a prescribed accuracy when computing statistical expectations, MLMC methods were originally proposed to reduce the computational cost significantly through the use of multifidelity approximations. The improved performance of MLMC methods depends strongly on the decay of the variance of the integrand as the level increases. However, the main challenge in estimating CDFs is that the integrand is a discontinuous indicator function whose variance decays slowly. To address this difficulty, we approximate the integrand using a smoothing function that accelerates the decay of the variance. In addition, we design a novel a posteriori optimization strategy to calibrate the smoothing function, so as to balance the computational gain and the approximation error. The combined techniques are integrated into a very general and practical algorithm that can be applied to a wide range of subsurface problems for high-dimensional uncertainty quantification, such as the fine-grid oil reservoir model considered in this effort. The numerical results reveal that with the use of the calibrated smoothing function, the improved MLMC technique significantly reduces the computational complexity compared to the standard MC approach. Finally, we discuss several factors that affect the performance of the MLMC method and provide guidance for effective and efficient usage in practice.
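
    A minimal two-level sketch of the idea (toy coarse/fine models and a fixed sigmoid smoothing width; the paper's a posteriori calibration of the smoothing function is not reproduced): replace the indicator 1{Q <= q} with a smooth surrogate and telescope the estimate across fidelity levels.

```python
import numpy as np

rng = np.random.default_rng(3)

def smooth_indicator(q, Q, delta):
    """Sigmoid surrogate for the indicator 1{Q <= q}; delta sets the width."""
    return 1.0 / (1.0 + np.exp(-(q - Q) / delta))

# Toy stand-ins for a coarse and a fine model of the same random input.
def fine_model(xi):   return np.sin(xi) + 0.05 * xi ** 2
def coarse_model(xi): return np.sin(xi)               # cheap, biased

def mlmc_cdf(q, delta=0.05, n_coarse=200_000, n_fine=2_000):
    # Level 0: many cheap coarse samples.
    xi0 = rng.standard_normal(n_coarse)
    level0 = smooth_indicator(q, coarse_model(xi0), delta).mean()
    # Level 1: few coupled fine/coarse samples estimate the correction.
    xi1 = rng.standard_normal(n_fine)
    corr = (smooth_indicator(q, fine_model(xi1), delta)
            - smooth_indicator(q, coarse_model(xi1), delta)).mean()
    return level0 + corr

print(mlmc_cdf(q=0.5))   # estimate of P(Q <= 0.5)
```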

  2. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Background: The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results: Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (>15 km) outliers occurred among the 60%-matched geocoding errors; outliers also occurred for the other two types of geocoding errors but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that could not be fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion: Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.

  3. On the probability distribution of daily streamflow in the United States

    Science.gov (United States)

    Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.

    2017-01-01

    Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for period-of-record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
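
    A minimal sketch of fitting a four-parameter kappa distribution to a daily flow record and reading off an FDC with SciPy's kappa4 (the synthetic lognormal record is a stand-in for real gauge data; hydrologists often prefer L-moment fitting, and the generic maximum-likelihood fit below can be slow or sensitive to starting values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic stand-in for a daily flow record (heavy upper tail, like streamflow).
flows = rng.lognormal(mean=3.0, sigma=1.2, size=10_000)

# Fit the four-parameter kappa distribution (shapes h and k, location, scale).
h, k, loc, scale = stats.kappa4.fit(flows)
dist = stats.kappa4(h, k, loc=loc, scale=scale)

# Flow duration curve: the flow equaled or exceeded a given fraction of the time.
p_exceed = np.array([0.01, 0.05, 0.10, 0.50, 0.90, 0.99])
fdc_model = dist.ppf(1.0 - p_exceed)
fdc_empirical = np.quantile(flows, 1.0 - p_exceed)
print(np.c_[p_exceed, fdc_model, fdc_empirical])
```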

  4. Detection of two power-law tails in the probability distribution functions of massive GMCs

    CERN Document Server

    Schneider, N; Girichidis, P; Rayner, T; Motte, F; Andre, P; Russeil, D; Abergel, A; Anderson, L; Arzoumanian, D; Benedettini, M; Csengeri, T; Didelon, P; Francesco, J D; Griffin, M; Hill, T; Klessen, R S; Ossenkopf, V; Pezzuto, S; Rivera-Ingraham, A; Spinoglio, L; Tremblin, P; Zavagno, A

    2015-01-01

    We report the novel detection of complex high-column density tails in the probability distribution functions (PDFs) for three high-mass star-forming regions (CepOB3, MonR2, NGC6334), obtained from dust emission observed with Herschel. The low column density range can be fit with a lognormal distribution. A first power-law tail starts above an extinction (Av) of ~6-14. It has a slope of alpha=1.3-2 for the rho~r^-alpha profile for an equivalent density distribution (spherical or cylindrical geometry), and is thus consistent with free-fall gravitational collapse. Above Av~40, 60, and 140, we detect an excess that can be fitted by a flatter power-law tail with alpha>2. It correlates with the central regions of the cloud (ridges/hubs) of size ~1 pc and densities above 10^4 cm^-3. This excess may be caused by physical processes that slow down collapse and reduce the flow of mass towards higher densities. Possible processes are: (1) rotation, which introduces an angular momentum barrier, (2) increasing optical depth and weaker...

  5. Probability distribution of turbulence in curvilinear cross section mobile bed channel.

    Science.gov (United States)

    Sharma, Anurag; Kumar, Bimlesh

    2016-01-01

    The present study investigates the probability density functions (PDFs) of two-dimensional turbulent velocity fluctuations, Reynolds shear stress (RSS) and conditional RSSs in a threshold channel, obtained by using the Gram-Charlier (GC) series. The GC series expansion has been used up to moments of order four to include the skewness and kurtosis. Experiments were carried out in a curvilinear cross-section sand-bed channel at threshold condition with a uniform sand size of d50 = 0.418 mm. The results show that the PDFs of turbulent velocity fluctuations and RSS calculated theoretically from the GC series expansion agree with the PDFs obtained from the experimental data. The PDFs of conditional RSSs related to ejections and sweeps are well represented by the GC series exponential distribution, except for a slight departure in the inward and outward interactions, which may be due to weaker events. This paper offers some new insights into the probabilistic mechanism of sediment transport, which can be helpful in sediment management and design of curvilinear cross-section mobile bed channels.
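
    For reference, a minimal sketch of a fourth-order Gram-Charlier (Type A) density built from sample skewness and kurtosis (generic statistics, not the authors' code; the synthetic sample is an illustrative stand-in for measured velocity fluctuations):

```python
import numpy as np
from scipy import stats

def gram_charlier_pdf(x, sample):
    """Gram-Charlier Type A density up to fourth order:
    phi(z) * [1 + (S/6) He3(z) + ((K - 3)/24) He4(z)],
    with z standardized and He_n the probabilists' Hermite polynomials."""
    mu, sigma = np.mean(sample), np.std(sample)
    S = stats.skew(sample)
    K = stats.kurtosis(sample, fisher=False)     # Pearson kurtosis (Normal = 3)
    z = (x - mu) / sigma
    he3 = z ** 3 - 3 * z
    he4 = z ** 4 - 6 * z ** 2 + 3
    phi = np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)
    return phi * (1 + S / 6 * he3 + (K - 3) / 24 * he4) / sigma

rng = np.random.default_rng(5)
u_fluct = rng.gamma(4.0, size=50_000) - 4.0      # skewed stand-in for fluctuations
x = np.linspace(-6, 10, 400)
pdf = gram_charlier_pdf(x, u_fluct)              # may dip slightly below zero far in the tails
```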

  6. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance, and current scarcity, of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
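
    A minimal sketch of the joint ("OR") return period computation with a Gumbel-Hougaard copula: for events with mean interarrival time mu, T_or(x, y) = mu / (1 - C(F_X(x), F_Y(y))). The marginals and dependence parameter below are illustrative assumptions, not the paper's fitted values; mu is taken from the 79 events over 1990-2008 mentioned in the abstract, purely for illustration.

```python
import numpy as np
from scipy import stats

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1, theta = 1 is independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))

# Hypothetical marginals for maximum wind speed (m/s) and duration (h).
wind = stats.gumbel_r(loc=18.0, scale=4.0)
dur = stats.gamma(a=2.0, scale=3.0)

mu = 19.0 / 79.0    # mean interarrival time in years: 79 events over 19 years
theta = 2.5         # illustrative dependence strength

def joint_return_period_or(x, y):
    """Return period of the event {wind > x OR duration > y}."""
    return mu / (1.0 - gumbel_copula(wind.cdf(x), dur.cdf(y), theta))

print(joint_return_period_or(25.0, 12.0))   # years
```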

  7. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data involve considerable uncertainty owing to limitations in the measurement information, material parameters, loads, geometry, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. An improved cloud probability distribution density algorithm based on a backward cloud generator was then proposed. This was used to convert precise numerical data into concepts that can be described by appropriate qualitative linguistic values. Such qualitative descriptions were expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results showed that the proposed algorithm is feasible: it reveals the changing pattern of the piezometric tube's water level and can detect seepage damage within the dam body.
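
    A minimal sketch of the standard backward cloud generator (without certainty degrees) that estimates {Ex, En, He} from a sample; this is the textbook form of the algorithm, an assumption standing in for the paper's improved variant:

```python
import numpy as np

def backward_cloud_generator(x):
    """Estimate the cloud numerical characteristics {Ex, En, He} from data."""
    x = np.asarray(x, dtype=float)
    Ex = x.mean()
    En = np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - Ex))
    He2 = x.var(ddof=1) - En ** 2
    He = np.sqrt(max(He2, 0.0))            # guard against negative estimates
    return Ex, En, He

rng = np.random.default_rng(6)
# Forward cloud model: drops are N(Ex, En'^2) with En' ~ N(En, He^2).
En_prime = rng.normal(2.0, 0.3, size=5000)
drops = rng.normal(10.0, np.abs(En_prime))
print(backward_cloud_generator(drops))     # should recover ~ (10.0, 2.0, 0.3)
```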

  8. Spatial probability distribution of future volcanic eruptions at El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-05-01

    The 2011 submarine eruption that took place in the proximity of El Hierro Island (Canary Islands, Spain) has raised the need to identify the most likely future emission zones, even on volcanoes characterized by low-frequency activity. Here, we propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the probabilistic analysis of volcano-structural data of the Island collected through new fieldwork measurements, bathymetric information, as well as analysis of geological maps, orthophotos and aerial photographs. These data have been divided into different datasets and converted into separate and weighted probability density functions, which were included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The most likely area to host new eruptions in El Hierro is in the south-western part of the West rift. High-probability locations are also found in the Northeast and South rifts, and along the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency measures and civil defense actions.

  9. Predicting dihedral angle probability distributions for protein coil residues from primary sequence using neural networks

    DEFF Research Database (Denmark)

    Helles, Glennie; Fonseca, Rasmus

    2009-01-01

    Predicting the three-dimensional structure of a protein from its amino acid sequence is currently one of the most challenging problems in bioinformatics. The internal structure of helices and sheets is highly recurrent and helps reduce the search space significantly. However, random coil segments make up nearly 40% of proteins, and they do not have any apparent recurrent patterns, which complicates the overall accuracy of protein structure prediction methods. Luckily, previous work has indicated that coil segments are in fact not completely random in structure, and flanking residues do seem to have a significant influence on the dihedral angles adopted by the individual amino acids in coil segments. In this work we attempt to predict a probability distribution of these dihedral angles based on the flanking residues. While attempts to predict dihedral angles of coil segments have been...

  10. Light Scattering of Rough Orthogonal Anisotropic Surfaces with Secondary Most Probable Slope Distributions

    Institute of Scientific and Technical Information of China (English)

    LI Hai-Xia; CHENG Chuan-Fu

    2011-01-01

    We study the light scattering of an orthogonal anisotropic rough surface with a secondary most-probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented in a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct the system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.

  11. Lower Bound Bayesian Networks - An Efficient Inference of Lower Bounds on Probability Distributions in Bayesian Networks

    CERN Document Server

    Andrade, Daniel

    2012-01-01

    We present a new method to propagate lower bounds on conditional probability distributions in conventional Bayesian networks. Our method guarantees to provide outer approximations of the exact lower bounds. A key advantage is that we can use any available algorithms and tools for Bayesian networks in order to represent and infer lower bounds. This new method yields results that are provably exact for trees with binary variables, and results which are competitive with existing approximations in credal networks for all other network structures. Our method is not limited to a specific kind of network structure. Basically, it is also not restricted to a specific kind of inference, but we restrict our analysis to prognostic inference in this article. The computational complexity is superior to that of other existing approaches.

  12. Sampling the probability distribution of Type Ia Supernova lightcurve parameters in cosmological analysis

    Science.gov (United States)

    Dai, Mi; Wang, Yun

    2016-06-01

    In order to obtain robust cosmological constraints from Type Ia supernova (SN Ia) data, we have applied Markov Chain Monte Carlo (MCMC) to SN Ia lightcurve fitting. We develop a method for sampling the resultant probability density distributions (PDFs) of the SN Ia lightcurve parameters in the MCMC likelihood analysis to constrain cosmological parameters, and validate it using simulated data sets. Applying this method to the `joint lightcurve analysis (JLA)' data set of SNe Ia, we find that sampling the SN Ia lightcurve parameter PDFs leads to cosmological parameters closer to those of a flat Universe with a cosmological constant, compared to the usual practice of using only the best-fitting values of the SN Ia lightcurve parameters. Our method will be useful in the use of SN Ia data for precision cosmology.

  13. Probability Distributions of Random Electromagnetic Fields in the Presence of a Semi-Infinite Isotropic Medium

    CERN Document Server

    Arnaut, L R

    2006-01-01

    Using a TE/TM decomposition for an angular plane-wave spectrum of free random electromagnetic waves and matched boundary conditions, we derive the probability density function for the energy density of the vector electric field in the presence of a semi-infinite isotropic medium. The theoretical analysis is illustrated with calculations and results for good electric conductors and for a lossless dielectric half-space. The influence of the permittivity and conductivity on the intensity, random polarization, statistical distribution and standard deviation of the field is investigated, both for incident plus reflected fields and for refracted fields. External refraction is found to result in compression of the fluctuations of the random field.

  14. Probability distribution function for inclinations of merging compact binaries detected by gravitational wave interferometers

    CERN Document Server

    Seto, Naoki

    2014-01-01

    We analytically discuss the probability distribution function (PDF) for inclinations of merging compact binaries whose gravitational waves are coherently detected by a network of ground-based interferometers. The PDF would be useful for studying prospects of (1) simultaneously detecting electromagnetic signals (such as gamma-ray bursts) associated with binary mergers and (2) statistically constraining the related theoretical models from the actual observational data of multi-messenger astronomy. Our approach is similar to Schutz (2011), but we explicitly include the dependence on the polarization angles of the binaries, based on the concise formulation given in Cutler and Flanagan (1994). We find that the overall profiles of the PDFs are similar for any network composed of the second generation detectors (Advanced-LIGO, Advanced-Virgo, KAGRA, LIGO-India). For example, 5.1% of detected binaries would have inclination angles less than 10 degrees, with at most 0.1% differences between the potential networks. A perturb...

  15. On the reliability of observational measurements of column density probability distribution functions

    CERN Document Server

    Ossenkopf, Volker; Schneider, Nicola; Federrath, Christoph; Klessen, Ralf S

    2016-01-01

    Probability distribution functions (PDFs) of column densities are an established tool to characterize the evolutionary state of interstellar clouds. Using simulations, we show to what degree their determination is affected by noise, line-of-sight contamination, field selection, and the incomplete sampling in interferometric measurements. We solve the integrals that describe the convolution of a cloud PDF with contaminating sources and study the impact of missing information on the measured column density PDF. The effect of observational noise can be easily estimated and corrected for if the root mean square (rms) of the noise is known. For $\sigma_{noise}$ values below 40% of the typical cloud column density, $N_{peak}$, this involves almost no degradation of the accuracy of the PDF parameters. For higher noise levels and narrow cloud PDFs the width of the PDF becomes increasingly uncertain. A contamination by turbulent foreground or background clouds can be removed as a constant shield if the PDF of the c...

  16. GENERALIZED FATIGUE CONSTANT LIFE CURVE AND TWO-DIMENSIONAL PROBABILITY DISTRIBUTION OF FATIGUE LIMIT

    Institute of Scientific and Technical Information of China (English)

    熊峻江; 武哲; 高镇同

    2002-01-01

    According to the traditional fatigue constant life curve, the concept and the universal expression of the generalized fatigue constant life curve were proposed. Then, on the basis of the optimization method of the correlation coefficient, the parameter estimation formulas were derived and the generalized fatigue constant life curve with the reliability level p was given. From the P-Sa-Sm curve, the two-dimensional probability distribution of the fatigue limit was derived. Then, three sets of tests of LY11 CZ, corresponding to different mean stresses, were carried out using the two-dimensional up-down method. Finally, the methods were used to analyze the test results, and it is found that results of high precision may be obtained.

  17. The HI Probability Distribution Function and the Atomic-to-Molecular Transition in Molecular Clouds

    CERN Document Server

    Imara, Nia

    2016-01-01

    We characterize the column density probability distribution functions (PDFs) of the atomic hydrogen gas, HI, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic HI Survey to derive column density maps and PDFs. We find that the peaks of the HI PDFs occur at column densities ranging from ~1-2$\times 10^{21}$ cm$^{-2}$ (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of $\sigma_{HI}\approx 10^{20}$ cm$^{-2}$ (~0.1 mag). We also investigate the HI-to-H$_2$ transition towards the cloud complexes and estimate HI surface densities ranging from 7-16 $M_\odot$ pc$^{-2}$ at the transition. We propose that the HI PDF is a fitting tool for identifying the HI-to-H$_2$ transition column in Galactic MCs.

  18. Probability distribution function and multiscaling properties in the Korean stock market

    Science.gov (United States)

    Lee, Kyoung Eun; Lee, Jae Woo

    2007-09-01

    We consider the probability distribution function (pdf) and the multiscaling properties of the index and the traded volume in the Korean stock market. We observed the power law of the pdf at the fat-tail region for the return, volatility, traded volume, and changes of the traded volume. We also investigate the multifractality in the Korean stock market, using multifractal detrended fluctuation analysis (MFDFA). We observed multiscaling behaviors for the index, return, traded volume, and changes of the traded volume. We apply the MFDFA method to randomly shuffled time series to observe the effects of autocorrelations. The multifractality originates strongly from the long-time correlations of the time series.

  19. Analysis of Low Probability of Intercept (LPI) Radar Signals Using the Wigner Distribution

    Science.gov (United States)

    Gau, Jen-Yu

    2002-09-01

    The parameters of Low Probability of Intercept (LPI) radar signals are hard to identify using traditional periodogram signal processing techniques. Using the Wigner Distribution (WD), this thesis examines eight types of LPI radar signals. Signal-to-noise ratios of 0 dB and -6 dB are also investigated. The eight types of LPI radar signals examined include Frequency Modulation Continuous Wave (FMCW), Frank code, P1 code, P2 code, P3 code, P4 code, COSTAS frequency hopping and Phase Shift Keying/Frequency Shift Keying (PSK/FSK) signals. Binary Phase Shift Keying (BPSK) signals, although not used in modern LPI radars, are also examined to further illustrate the principal characteristics of the WD.

  20. Binomial moments of the distance distribution and the probability of undetected error

    Energy Technology Data Exchange (ETDEWEB)

    Barg, A. [Lucent Technologies, Murray Hill, NJ (United States). Bell Labs.]; Ashikhmin, A. [Los Alamos National Lab., NM (United States)]

    1998-09-01

    In [1] K.A.S. Abdel-Ghaffar derives a lower bound on the probability of undetected error for unrestricted codes. The proof relies implicitly on the binomial moments of the distance distribution of the code. The authors use the fact that these moments count the size of subcodes of the code to give a very simple proof of the bound in [1] by showing that it is essentially equivalent to the Singleton bound. They discuss some combinatorial connections revealed by this proof. They also discuss some improvements of this bound. Finally, they analyze asymptotics. They show that an upper bound on the undetected error exponent that corresponds to the bound of [1] improves known bounds on this function.
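
    For context, the quantity being bounded is easy to state: a code with weight distribution A_1, ..., A_n used for error detection on a binary symmetric channel with crossover probability p fails to detect exactly the error patterns that equal nonzero codewords, so P_ue(p) = sum_i A_i p^i (1-p)^(n-i). A minimal sketch (the bound of [1] itself is not reproduced):

```python
def prob_undetected_error(weights, p):
    """P_ue(p) = sum_i A_i * p**i * (1-p)**(n-i), weights = [A_1, ..., A_n]."""
    n = len(weights)
    return sum(A * p**i * (1 - p)**(n - i)
               for i, A in enumerate(weights, start=1))

# Example: the [7,4] Hamming code has weight distribution A_3 = A_4 = 7, A_7 = 1.
A = [0, 0, 7, 7, 0, 0, 1]
print(prob_undetected_error(A, p=0.01))   # ~ 6.8e-6
```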

  1. Mean square delay dependent-probability-distribution stability analysis of neutral type stochastic neural networks.

    Science.gov (United States)

    Muralisankar, S; Manivannan, A; Balasubramaniam, P

    2015-09-01

    The aim of this manuscript is to investigate the mean square delay dependent-probability-distribution stability analysis of neutral type stochastic neural networks with time-delays. The time-delays are assumed to be interval time-varying and randomly occurring. Based on the new Lyapunov-Krasovskii functional and stochastic analysis approach, a novel sufficient condition is obtained in the form of linear matrix inequality such that the delayed stochastic neural networks are globally robustly asymptotically stable in the mean-square sense for all admissible uncertainties. Finally, the derived theoretical results are validated through numerical examples in which maximum allowable upper bounds are calculated for different lower bounds of time-delay.

  2. The H I Probability Distribution Function and the Atomic-to-molecular Transition in Molecular Clouds

    Science.gov (United States)

    Imara, Nia; Burkhart, Blakesley

    2016-10-01

    We characterize the column-density probability distribution functions (PDFs) of the atomic hydrogen gas, H i, associated with seven Galactic molecular clouds (MCs). We use 21 cm observations from the Leiden/Argentine/Bonn Galactic H i Survey to derive column-density maps and PDFs. We find that the peaks of the H i PDFs occur at column densities in the range ~1-2 × 10^21 cm^-2 (equivalently, ~0.5-1 mag). The PDFs are uniformly narrow, with a mean dispersion of σ_HI ≈ 10^20 cm^-2 (~0.1 mag). We also investigate the H i-to-H2 transition toward the cloud complexes and estimate H i surface densities ranging from 7 to 16 M_⊙ pc^-2 at the transition. We propose that the H i PDF is a fitting tool for identifying the H i-to-H2 transition column in Galactic MCs.

  3. Random numbers from the tails of probability distributions using the transformation method

    CERN Document Server

    Fulger, Daniel; Germano, Guido

    2009-01-01

    The speed of many one-line transformation methods for the production of, for example, Levy alpha-stable random numbers, which generalize Gaussian ones, and Mittag-Leffler random numbers, which generalize exponential ones, is very high and satisfactory for most purposes. However, for the class of decreasing probability densities fast rejection implementations like the Ziggurat by Marsaglia and Tsang promise a significant speed-up if it is possible to complement them with a method that samples the tails of the infinite support. This requires the fast generation of random numbers greater or smaller than a certain value. We present a method to achieve this, and also to generate random numbers within any arbitrary interval. We demonstrate the method showing the properties of the transform maps of the above mentioned distributions as examples of stable and geometric stable random numbers used for the stochastic solution of the space-time fractional diffusion equation.
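
    The key step is drawing a uniform variate restricted to the image of the target interval under the CDF and mapping it back through the quantile function; a minimal sketch for an exponential distribution (a generic illustration, not the paper's stable or Mittag-Leffler transforms):

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_interval(ppf, cdf, a, b, size):
    """Draw from a distribution conditioned on (a, b):
    u ~ Uniform(F(a), F(b)), then x = F^{-1}(u)."""
    u = rng.uniform(cdf(a), cdf(b), size)
    return ppf(u)

# Exponential(1): F(x) = 1 - exp(-x), F^{-1}(u) = -log(1 - u).
cdf = lambda x: -np.expm1(-x)
ppf = lambda u: -np.log1p(-u)

tail = sample_interval(ppf, cdf, a=5.0, b=np.inf, size=100_000)
print(tail.min(), tail.mean())   # all samples exceed 5; mean ~ 6 by memorylessness
```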

  4. Communication in a Poisson Field of Interferers -- Part I: Interference Distribution and Error Probability

    CERN Document Server

    Pinto, Pedro C

    2010-01-01

    We present a mathematical model for communication subject to both network interference and noise. We introduce a framework where the interferers are scattered according to a spatial Poisson process, and are operating asynchronously in a wireless environment subject to path loss, shadowing, and multipath fading. We consider both cases of slow- and fast-varying interferer positions. The paper comprises two separate parts. In Part I, we determine the distribution of the aggregate network interference at the output of a linear receiver. We characterize the error performance of the link, in terms of average and outage probabilities. The proposed model is valid for any linear modulation scheme (e.g., M-ary phase shift keying or M-ary quadrature amplitude modulation), and captures all the essential physical parameters that affect network interference. Our work generalizes the conventional analysis of communication in the presence of additive white Gaussian noise and fast fading, allowing the traditional results...

  5. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient r package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities

  6. Obtaining magnitude-cumulative frequency curves from rockfall scar size distribution using cosmogenic chlorine-36 in the Montsec area (Eastern Pyrenees, Spain)

    Science.gov (United States)

    Domènech, Guillem; Mavrouli, Olga; Corominas, Jordi; Abellán, Antonio; Merchel, Silke; Pavetich, Stefan; Rugel, Georg

    2015-04-01

    Magnitude-cumulative frequency (MCF) relations are commonly used components for assessing rockfall hazard using databases of recorded events. However, in some cases, data are lacking or incomplete. To overcome this restriction, the volume distribution of the rockfall scars has been used instead. The latter may yield the temporal probability of occurrence if the time span required to generate the scars is known. The Montsec range, located in the Eastern Pyrenees, Spain, was chosen as a pilot study area for investigating MCF distributions. This cliff, which is composed of limestones of Upper Cretaceous age, shows distinct evidence of rockfall activity, including large recent rockfall scars. These areas are identifiable by their orange colour, which contrasts with the greyish old stable (reference) surface of the cliff face. We present a procedure to obtain the MCF of the rockfall scars by dating an old reference cliff surface and measuring the total volume released since then. The reference cliff surface was dated using the terrestrial cosmogenic nuclide (TCN) chlorine-36 (Merchel et al., 2013). We used the Rockfall Scar Size Distribution (RSSD) obtained in Domènech et al. (2014), which considers several rockfall pattern scenarios. Scenario 1 allows for, mostly, large rockfall scar volumes, scenario 2 considers smaller occurrences, and scenario 3 suggests that rockfall scars can be the result of one or several rockfall events, thus contemplating a wider range of scar volumes. The main steps of the methodology are: a) obtaining the RSSD, b) calculation of the volume of material lost, c) calculation of the time (T0) elapsed for the cliff to retreat (age of the old reference surface), and d) generation of the MCF curve from the RSSD. A total lost volume of 78,900 m3 was obtained, as well as an elapsed time of 15,350 years. The MCF curves for different rockfall scenarios are found to be well fitted by a power law with exponents -1.7, -1.1 and -1

  7. Development of a Medical-text Parsing Algorithm Based on Character Adjacent Probability Distribution for Japanese Radiology Reports

    National Research Council Canada - National Science Library

    N. Nishimoto; S. Terae; M. Uesugi; K. Ogasawara; T. Sakurai

    2008-01-01

    Objectives: The objectives of this study were to investigate the transitional probability distribution of medical term boundaries between characters and to develop a parsing algorithm specifically for medical texts. Methods...

  8. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
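
    A minimal sketch of the one-parameter pair as it is commonly defined in this context (the parameter name lam is an assumption; the paper's notation may differ): ln_lam(x) = (x^lam - 1)/lam with inverse exp_lam(y) = (1 + lam*y)^(1/lam), both reducing to the ordinary log/exp as lam -> 0.

```python
import numpy as np

def gen_log(x, lam):
    """Generalized logarithm: (x**lam - 1)/lam; ordinary log as lam -> 0."""
    if abs(lam) < 1e-12:
        return np.log(x)
    return (x ** lam - 1.0) / lam

def gen_exp(y, lam):
    """Inverse of gen_log: (1 + lam*y)**(1/lam); ordinary exp as lam -> 0."""
    if abs(lam) < 1e-12:
        return np.exp(y)
    return (1.0 + lam * y) ** (1.0 / lam)

x = np.linspace(0.5, 3.0, 5)
for lam in (-0.5, 1e-13, 0.5):
    assert np.allclose(gen_exp(gen_log(x, lam), lam), x)   # round trip
print(gen_log(2.0, 0.5), np.log(2.0))
```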

  9. Probability distribution of biofilm thickness and effect of biofilm on the permeability of porous media

    Science.gov (United States)

    Ye, S.; Sleep, B. E.; Chien, C.

    2010-12-01

    Probability distribution of biofilm thickness and the effect of biofilm on the permeability of saturated porous media were investigated in a two-dimensional sand-filled cell (55 cm wide x 45 cm high x 1.28 cm thick) under nutrient-rich conditions. Inoculation of the lower portion of the cell with a methanogenic culture and addition of methanol to the bottom of the cell led to biomass growth. Biomass distributions in the water and on the sand in the cell were measured by protein analysis. The biofilm distribution on the sand was observed by confocal laser scanning microscopy (CLSM). Permeability was measured by laboratory hydraulic tests. The biomass levels measured in water and on the sand increased with time, and were highest at the bottom of the cell, where the biofilm on the sand was thicker. The biomass distribution on the sand grains was not uniform. Statistical analysis of the CLSM images showed that biofilm thickness was a random variable with a normal distribution. The results of the hydraulic tests demonstrated that the permeability after biofilm growth was on average 12% of the initial value. To investigate the spatial distribution of permeability in the two-dimensional cell, three models (Taylor, Seki, and Clement) were used to calculate the permeability of porous media with biofilm growth. Taylor's model (Taylor et al., 1990) showed reductions in permeability of 2-5 orders of magnitude. Clement's model (Clement et al., 1996) predicted 3%-98% of the initial value. Seki's model (Seki and Miyazaki, 2001) could not be applied in this study. In conclusion, biofilm growth clearly decreased the permeability of two-dimensional saturated porous media; however, the reduction was much less than that estimated under one-dimensional conditions. Additionally, under conditions of two-dimensional saturated porous media with rich nutrition, Seki's model could not be applied, Taylor's model predicted bigger reductions, and the results of

  10. The probability distribution for non-Gaussianity estimators constructed from the CMB trispectrum

    CERN Document Server

    Smith, Tristan L

    2012-01-01

    Considerable recent attention has focussed on the prospects to use the cosmic microwave background (CMB) trispectrum to probe the physics of the early universe. Here we evaluate the probability distribution function (PDF) for the standard estimator of the amplitude tau_nl of the CMB trispectrum, both for the null hypothesis (i.e., for Gaussian maps with tau_nl = 0) and for maps with a non-vanishing trispectrum (|tau_nl|>0). We find these PDFs to be highly non-Gaussian in both cases. We also evaluate the variance with which the trispectrum amplitude can be measured, as a function of its underlying value, tau_nl. We find a strong dependence of this variance on tau_nl. We also find that the variance does not, given the highly non-Gaussian nature of the PDF, effectively characterize the distribution. Detailed knowledge of these PDFs will therefore be imperative in order to properly interpret the implications of any given trispectrum measurement. For example, if a CMB experiment with a maximum multipole ...

  11. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The subject of the PDF (probability density function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distributions) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. It is therefore considered that the new model exactly reflects the Born perturbation theory. Simulated results prove the accuracy of this new model.

  12. Probability distribution functions for ELM bursts in a series of JET tokamak discharges

    Energy Technology Data Exchange (ETDEWEB)

    Greenhough, J [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Chapman, S C [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Dendy, R O [Space and Astrophysics Group, Department of Physics, Warwick University, Coventry CV4 7AL (United Kingdom); Ward, D J [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire OX14 3DB (United Kingdom)

    2003-05-01

    A novel statistical treatment of the full raw edge localized mode (ELM) signal from a series of previously studied JET plasmas is tested. The approach involves constructing probability distribution functions (PDFs) for ELM amplitudes and time separations, and quantifying the fit between the measured PDFs and model distributions (Gaussian, inverse exponential) and Poisson processes. Uncertainties inherent in the discreteness of the raw signal require the application of statistically rigorous techniques to distinguish ELM data points from background, and to extrapolate peak amplitudes. The accuracy of PDF construction is further constrained by the relatively small number of ELM bursts (several hundred) in each sample. In consequence the statistical technique is found to be difficult to apply to low frequency (typically Type I) ELMs, so the focus is narrowed to four JET plasmas with high frequency (typically Type III) ELMs. The results suggest that there may be several fundamentally different kinds of Type III ELMing process at work. It is concluded that this novel statistical treatment can be made to work, may have wider applications to ELM data, and has immediate practical value as an additional quantitative discriminant between classes of ELMing behaviour.

  13. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    Science.gov (United States)

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and housing price index below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course), student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of the bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
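
    The height-and-weight question reduces to a conditional-normal computation: if (H, W) is bivariate normal, then W | H = h is normal with mean mu_W + rho*sigma_W*(h - mu_H)/sigma_H and standard deviation sigma_W*sqrt(1 - rho^2). A minimal sketch with illustrative parameter values (not the dataset used in the paper):

```python
from scipy import stats

# Illustrative bivariate-normal parameters for height (in) and weight (lb).
mu_h, sigma_h = 64.0, 3.5
mu_w, sigma_w = 125.0, 15.0
rho = 0.6

def weight_given_height(h):
    """Conditional distribution W | H = h for a bivariate normal."""
    mean = mu_w + rho * sigma_w * (h - mu_h) / sigma_h
    sd = sigma_w * (1.0 - rho ** 2) ** 0.5
    return stats.norm(mean, sd)

# P(120 <= W <= 140 | H = average height)
cond = weight_given_height(mu_h)
print(cond.cdf(140) - cond.cdf(120))
```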

  14. The Lognormal Probability Distribution Function of the Perseus Molecular Cloud: A Comparison of HI and Dust

    Science.gov (United States)

    Burkhart, Blakesley; Lee, Min-Young; Murray, Claire E.; Stanimirović, Snezana

    2015-10-01

    The shape of the probability distribution function (PDF) of molecular clouds is an important ingredient for modern theories of star formation and turbulence. Recently, several studies have pointed out observational difficulties with constraining the low column density (i.e., A_V < 1) PDF using dust tracers. In order to constrain the shape and properties of the low column density PDF, we investigate the PDF of multiphase atomic gas in the Perseus molecular cloud using opacity-corrected GALFA-HI data and compare the PDF shape and properties to the total gas PDF and the N(H2) PDF. We find that the shape of the PDF in the atomic medium of Perseus is well described by a lognormal distribution and not by a power-law or bimodal distribution. The peak of the atomic gas PDF in and around Perseus lies at the HI-H2 transition column density for this cloud, past which the N(H2) PDF takes on a power-law form. We find that the PDF of the atomic gas is narrow, and at column densities larger than the HI-H2 transition, the HI rapidly depletes, suggesting that the HI PDF may be used to find the HI-H2 transition column density. We also calculate the sonic Mach number of the atomic gas by using HI absorption line data, which yield a median value of Ms = 4.0 for the CNM, while the HI emission PDF, which traces both the WNM and CNM, has a width more consistent with transonic turbulence.

  15. Assessing cumulative impacts of forest development on the distribution of furbearers using expert-based habitat modeling.

    Science.gov (United States)

    Bridger, M C; Johnson, C J; Gillingham, M P

    2016-03-01

    Cumulative impacts of anthropogenic landscape change must be considered when managing and conserving wildlife habitat. Across the central-interior of British Columbia, Canada, industrial activities are altering the habitat of furbearer species. This region has witnessed unprecedented levels of anthropogenic landscape change following rapid development in a number of resource sectors, particularly forestry. Our objective was to create expert-based habitat models for three furbearer species: fisher (Pekania pennanti), Canada lynx (Lynx canadensis), and American marten (Martes americana) and quantify habitat change for those species. We recruited 10 biologist and 10 trapper experts and then used the analytical hierarchy process to elicit expert knowledge of habitat variables important to each species. We applied the models to reference landscapes (i.e., registered traplines) in two distinct study areas and then quantified the change in habitat availability from 1990 to 2013. There was strong agreement between expert groups in the choice of habitat variables and associated scores. Where anthropogenic impacts had increased considerably over the study period, the habitat models showed substantial declines in habitat availability for each focal species (78% decline in optimal fisher habitat, 83% decline in optimal lynx habitat, and 79% decline in optimal marten habitat). For those traplines with relatively little forest harvesting, the habitat models showed no substantial change in the availability of habitat over time. The results suggest that habitat for these three furbearer species declined significantly as a result of the cumulative impacts of forest harvesting. Results of this study illustrate the utility of expert knowledge for understanding large-scale patterns of habitat change over long time periods.
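
    For reference, the analytical hierarchy process step mentioned above derives variable weights from a pairwise comparison matrix via its principal eigenvector; a minimal sketch with a hypothetical 3-variable comparison (the matrix entries are illustrative, not the experts' scores):

```python
import numpy as np

# Hypothetical pairwise comparisons among three habitat variables (Saaty 1-9 scale):
# A[0, 1] = 3 means variable 0 is moderately more important than variable 1.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                  # normalized priority weights

# Consistency ratio CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty).
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print(weights, ci / 0.58)
```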

  16. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  17. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  18. Universal Probability Distribution for the Wave Function of a Quantum System Entangled with its Environment

    Science.gov (United States)

    Goldstein, Sheldon; Lebowitz, Joel L.; Mastrodonato, Christian; Tumulka, Roderich; Zanghì, Nino

    2016-03-01

    A quantum system (with Hilbert space H_1) entangled with its environment (with Hilbert space H_2) is usually not attributed a wave function but only a reduced density matrix ρ_1. Nevertheless, there is a precise way of attributing to it a random wave function ψ_1, called its conditional wave function, whose probability distribution μ_1 depends on the entangled wave function ψ ∈ H_1 ⊗ H_2 in the Hilbert space of system and environment together. It also depends on a choice of orthonormal basis of H_2, but in relevant cases, as we show, not very much. We prove several universality (or typicality) results about μ_1, e.g., that if the environment is sufficiently large then for every orthonormal basis of H_2, most entangled states ψ with given reduced density matrix ρ_1 are such that μ_1 is close to one of the so-called GAP (Gaussian adjusted projected) measures, GAP(ρ_1). We also show that, for most entangled states ψ from a microcanonical subspace (spanned by the eigenvectors of the Hamiltonian with energies in a narrow interval [E, E+δE]) and most orthonormal bases of H_2, μ_1 is close to GAP(tr_2 ρ_mc), with ρ_mc the normalized projection to the microcanonical subspace. In particular, if the coupling between the system and the environment is weak, then μ_1 is close to GAP(ρ_β), with ρ_β the canonical density matrix on H_1 at inverse temperature β = β(E). This provides the mathematical justification of our claim in Goldstein et al. (J Stat Phys 125:1193-1221, 2006) that GAP measures describe the thermal equilibrium distribution of the wave function.

  19. Probability Distribution Function of a Forced Passive Tracer in the Lower Stratosphere

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The probability distribution function (PDF) of a passive tracer, forced by a "mean gradient", is studied. First, we take two theoretical approaches, the Lagrangian and the conditional closure formalisms, to study the PDFs of such an externally forced passive tracer. Then, we carry out numerical simulations for an idealized random flow on a sphere and for European Center for Medium-Range Weather Forecasts (ECMWF) stratospheric winds to test whether the mean-gradient model can be applied to studying stratospheric tracer mixing in midlatitude surf zones, in which a weak and poleward zonal-mean gradient is maintained by tracer leakage through polar and tropical mixing barriers, and whether the PDFs of tracer fluctuations in midlatitudes are consistent with the theoretical predictions. The numerical simulations show that when diffusive dissipation is balanced by the mean-gradient forcing, the PDF in the random flow and the Southern-Hemisphere PDFs in ECMWF winds show time-invariant exponential tails, consistent with theoretical predictions. In the Northern Hemisphere, the PDFs exhibit non-Gaussian tails. However, the PDF tails are not consistent with theoretical expectations. The long-term behavior of the PDF tails of the forced tracer is compared to that of a decaying tracer. It is found that the PDF tails of the decaying tracer are time-dependent, and evolve toward flatter than exponential.

  20. Understanding star formation in molecular clouds I. A universal probability distribution of column densities?

    CERN Document Server

    Schneider, N; Csengeri, T; Klessen, R; Federrath, C; Tremblin, P; Girichidis, P; Bontemps, S; Andre, Ph

    2014-01-01

    Column density maps of molecular clouds are one of the most important observables in the context of molecular cloud- and star-formation (SF) studies. With Herschel it is now possible to reveal rather precisely the column density of dust, which is the best tracer of the bulk of material in molecular clouds. However, line-of-sight (LOS) contamination from fore- or background clouds can lead to an overestimation of the dust emission of molecular clouds, in particular for distant clouds. This implies too high values for column density and mass, and a misleading interpretation of probability distribution functions (PDFs) of the column density. In this paper, we demonstrate by using observations and simulations how LOS contamination affects the PDF. We apply a first-order approximation (removing a constant level) to the molecular clouds of Auriga and Maddalena (low-mass star-forming), and Carina and NGC 3603 (both high-mass SF regions). In perfect agreement with the simulations, we find that the PDFs become broader, ...

  1. Turbulence-Induced Relative Velocity of Dust Particles III: The Probability Distribution

    CERN Document Server

    Pan, Liubin; Scalo, John

    2014-01-01

    Motivated by its important role in the collisional growth of dust particles in protoplanetary disks, we investigate the probability distribution function (PDF) of the relative velocity of inertial particles suspended in turbulent flows. Using the simulation from our previous work, we compute the relative velocity PDF as a function of the friction timescales, tau_p1 and tau_p2, of two particles of arbitrary sizes. The friction time of particles included in the simulation ranges from 0.1 tau_eta to 54 T_L, with tau_eta and T_L the Kolmogorov time and the Lagrangian correlation time of the flow, respectively. The relative velocity PDF is generically non-Gaussian, exhibiting fat tails. For a fixed value of tau_p1, the PDF is the fattest for equal-size particles (tau_p2 ~ tau_p1), and becomes thinner at both tau_p2 << tau_p1 and tau_p2 >> tau_p1. Defining f as the friction time ratio of the smaller particle to the larger one, we find that, at a given f in 1/2 [...] T_L). These features are successfully explained by the Pan & Padoan model. Usin...

  2. Exact probability distributions of selected species in stochastic chemical reaction networks.

    Science.gov (United States)

    López-Caamal, Fernando; Marquez-Lago, Tatiana T

    2014-09-01

    Chemical reactions are discrete, stochastic events. As such, the species' molecular numbers can be described by an associated master equation. However, handling such an equation may become difficult due to the large size of reaction networks. A commonly used approach to forecast the behaviour of reaction networks is to perform computational simulations of such systems and analyse their outcome statistically. This approach, however, might require high computational costs to provide accurate results. In this paper we opt for an analytical approach to obtain the time-dependent solution of the Chemical Master Equation for selected species in a general reaction network. When the reaction networks are composed exclusively of zeroth and first-order reactions, this analytical approach significantly alleviates the computational burden required by simulation-based methods. By building upon these analytical solutions, we analyse a general monomolecular reaction network with an arbitrary number of species to obtain the exact marginal probability distribution for selected species. Additionally, we study two particular topologies of monomolecular reaction networks, namely (i) an unbranched chain of monomolecular reactions with and without synthesis and degradation reactions and (ii) a circular chain of monomolecular reactions. We illustrate our methodology and alternative ways to use it for non-linear systems by analysing a protein autoactivation mechanism. Later, we compare the computational load required for the implementation of our results and a pure computational approach to analyse an unbranched chain of monomolecular reactions. Finally, we study calcium ion gates in the sarco/endoplasmic reticulum mediated by ryanodine receptors.
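
    The zeroth- and first-order case admits a compact numerical illustration. The sketch below (a toy example of ours, not one of the paper's networks) solves the Chemical Master Equation exactly, via the matrix exponential of the generator, for the simplest synthesis-degradation network 0 -> X (rate k0), X -> 0 (rate k1), whose exact solution is Poisson with a time-dependent mean:

        import numpy as np
        from scipy.linalg import expm
        from scipy.stats import poisson

        k0, k1, N = 5.0, 1.0, 40                     # rates and state-space truncation {0,...,N}
        A = np.zeros((N + 1, N + 1))                 # generator: dp/dt = A p
        for n in range(N + 1):
            if n < N:
                A[n + 1, n] += k0                    # synthesis n -> n+1 at rate k0
                A[n, n] -= k0
            if n > 0:
                A[n - 1, n] += k1 * n                # degradation n -> n-1 at rate k1*n
                A[n, n] -= k1 * n

        p0 = np.zeros(N + 1); p0[0] = 1.0            # start with zero molecules
        p_t = expm(A * 2.0) @ p0                     # distribution at t = 2

        # Exact solution: Poisson with mean (k0/k1)(1 - exp(-k1 t)).
        mean_t = (k0 / k1) * (1 - np.exp(-k1 * 2.0))
        print(np.max(np.abs(p_t - poisson.pmf(np.arange(N + 1), mean_t))))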

  3. ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning

    Science.gov (United States)

    Sadeh, I.; Abdalla, F. B.; Lahav, O.

    2016-10-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.

  4. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    Science.gov (United States)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
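
    The lognormal fit below the 90th percentile can be reproduced in outline with synthetic data (a sketch of ours; the parameters are invented and are not the cruise measurements):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        Ed = rng.lognormal(mean=0.0, sigma=0.6, size=5000)   # stand-in for Ed / <Ed>

        shape, loc, scale = stats.lognorm.fit(Ed, floc=0.0)  # fit a lognormal model
        q = np.linspace(0.05, 0.90, 18)                      # ignore the extreme right tail
        empirical = np.quantile(Ed, q)
        model = stats.lognorm.ppf(q, shape, loc=loc, scale=scale)
        print(np.max(np.abs(model / empirical - 1)))         # relative quantile error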

  5. A Robust Function to Return the Cumulative Density of Non-Central F Distributions in Microsoft Office Excel

    Science.gov (United States)

    Nelson, James Byron

    2016-01-01

    The manuscript presents a Visual Basic® for Applications function that operates within Microsoft Office Excel® to return the area below the curve for a given F within a specified non-central F distribution. The function will be of use to Excel users without programming experience wherever a non-central F distribution is…
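
    Outside Excel, the same quantity is available directly from standard statistical libraries; a short Python equivalent (with made-up inputs) for the area below a given F under a specified non-central F distribution:

        from scipy.stats import ncf

        F, dfn, dfd, lam = 3.2, 2, 40, 6.5   # observed F, degrees of freedom, noncentrality
        print(ncf.cdf(F, dfn, dfd, lam))     # cumulative probability at F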

  6. Differential Evolution with Adaptive Mutation and Parameter Control Using Lévy Probability Distribution

    Institute of Scientific and Technical Information of China (English)

    Ren-Jie He; Zhen-Yu Yang

    2012-01-01

    Differential evolution (DE) has become a very popular and effective global optimization algorithm in the area of evolutionary computation. In spite of many advantages such as conceptual simplicity, high efficiency and ease of use, DE has two main components, i.e., mutation scheme and parameter control, which significantly influence its performance. In this paper we intend to improve the performance of DE by using carefully considered strategies for both of the two components. We first design an adaptive mutation scheme, which adaptively makes use of the bias of superior individuals when generating new solutions. Although introducing such a bias is not a new idea, existing methods often use heuristic rules to control the bias. They can hardly maintain the appropriate balance between exploration and exploitation during the search process, because the preferred bias is often problem and evolution-stage dependent. Instead of using any fixed rule, a novel strategy is adopted in the new adaptive mutation scheme to adjust the bias dynamically based on the identified local fitness landscape captured by the current population. As for the other component, i.e., parameter control, we propose a mechanism by using the Lévy probability distribution to adaptively control the scale factor F of DE. For every mutation in each generation, an Fi is produced from one of four different Lévy distributions according to their historical performance. With the adaptive mutation scheme and parameter control using the Lévy distribution as the main components, we present a new DE variant called Lévy DE (LDE). Experimental studies were carried out on a broad range of benchmark functions in global numerical optimization. The results show that LDE is very competitive, and both of the two main components have contributed to its overall performance. The scalability of LDE is also discussed by conducting experiments on some selected benchmark functions with dimensions from 30 to 200.
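
    The parameter-control component can be sketched in a few lines (a simplification of ours, not the LDE implementation; the alpha values and selection probabilities are illustrative). A scale factor F is drawn from one of four Lévy-stable distributions, chosen with probabilities that LDE would update from historical performance:

        import numpy as np
        from scipy.stats import levy_stable

        rng = np.random.default_rng(2)
        alphas = [0.8, 1.0, 1.2, 1.4]          # four candidate Levy distributions
        probs = np.full(4, 0.25)               # in LDE these adapt to past success

        choice = rng.choice(4, p=probs)        # pick a distribution for this mutation
        F = abs(levy_stable.rvs(alphas[choice], beta=0.0, random_state=rng))
        F = min(F, 2.0)                        # clip occasional heavy-tailed draws
        print(choice, F)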

  7. Analysis on Multi-dimensional Beta Probability Distribution Function

    Institute of Scientific and Technical Information of China (English)

    潘高田; 梁帆; 郭齐胜; 黄一斌

    2011-01-01

    Based on quantitative truncated sequential test theory, multi-dimensional Beta probability distribution functions arise in the problem of hit-accuracy tests of weapon systems against aerial targets. This paper analyses the properties of the multi-dimensional Beta probability distribution function and computes part of the two-dimensional Beta probability distribution function values. This research plays an important role in the field of weapon system hit accuracy tests.

  8. Emergence of visual saliency from natural scenes via context-mediated probability distributions coding.

    Directory of Open Access Journals (Sweden)

    Jinhua Xu

    Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs) of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA) and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.

  9. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks, which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that differ between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate their applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region, the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model considers that information about parallel events that uses their maximum values only is incomplete because some of the realisations are overlaid. A statistical method resulting in an amendment of statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
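
    For reference, the classical seasonal mixing model that the paper refines can be stated in a few lines: if the summer and winter maxima are independent, the annual-maximum CDF is the product of the seasonal CDFs. A sketch with invented Gumbel parameters:

        import numpy as np
        from scipy.stats import gumbel_r

        x = np.linspace(0, 2000, 500)                      # discharge in m^3/s
        F_summer = gumbel_r.cdf(x, loc=300, scale=120)     # seasonal peak-flow CDFs
        F_winter = gumbel_r.cdf(x, loc=450, scale=150)
        F_annual = F_summer * F_winter                     # P(max of both seasons <= x)

        T = 100.0                                          # 100-year flood
        x100 = x[np.searchsorted(F_annual, 1 - 1 / T)]     # approximate design discharge
        print(x100)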

  10. Understanding star formation in molecular clouds. III. Probability distribution functions of molecular lines in Cygnus X

    Science.gov (United States)

    Schneider, N.; Bontemps, S.; Motte, F.; Ossenkopf, V.; Klessen, R. S.; Simon, R.; Fechtenbaum, S.; Herpin, F.; Tremblin, P.; Csengeri, T.; Myers, P. C.; Hill, T.; Cunningham, M.; Federrath, C.

    2016-03-01

    The probability distribution function of column density (N-PDF) serves as a powerful tool to characterise the various physical processes that influence the structure of molecular clouds. Studies that use extinction maps or H2 column-density maps (N) that are derived from dust show that star-forming clouds can best be characterised by lognormal PDFs for the lower N range and a power-law tail for higher N, which is commonly attributed to turbulence and self-gravity and/or pressure, respectively. While PDFs from dust cover a large dynamic range (typically N ~ 1020-24 cm-2 or Av~ 0.1-1000), PDFs obtained from molecular lines - converted into H2 column density - potentially trace more selectively different regimes of (column) densities and temperatures. They also enable us to distinguish different clouds along the line of sight through using the velocity information. We report here on PDFs that were obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region, and make a comparison to a PDF that was derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av ~ 1-30, but is cut for higher Av because of optical depth effects. The PDFs of C18O and 13CO are mostly lognormal up to Av ~ 1-15, followed by excess up to Av ~ 40. Above that value, all CO PDFs drop, which is most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av ~ 15 and 400, respectively. The PDF from dust is lognormal for Av ~ 3-15 and has a power-law tail up to Av ~ 500. Absolute values for the molecular line column densities are, however, rather uncertain because of abundance and excitation temperature variations. If we take the dust PDF at face value, we "calibrate" the molecular line PDF of CS to that of the dust and determine an abundance [CS]/[H2] of 10-9. The slopes of the power-law tails of the CS, N2H+, and dust PDFs are -1.6, -1.4, and -2.3, respectively, and are thus consistent

  11. Simulating Tail Probabilities in GI/GI/1 Queues and Insurance Risk Processes with Subexponential Distributions

    NARCIS (Netherlands)

    Boots, Nam Kyoo; Shahabuddin, Perwez

    2001-01-01

    This paper deals with estimating small tail probabilities of the steady-state waiting time in a GI/GI/1 queue with heavy-tailed (subexponential) service times. The problem of estimating infinite horizon ruin probabilities in insurance risk processes with heavy-tailed claims can be transformed into th

  13. The VIMOS Public Extragalactic Redshift Survey (VIPERS). On the recovery of the count-in-cell probability distribution function

    Science.gov (United States)

    Bel, J.; Branchini, E.; Di Porto, C.; Cucciati, O.; Granett, B. R.; Iovino, A.; de la Torre, S.; Marinoni, C.; Guzzo, L.; Moscardini, L.; Cappi, A.; Abbas, U.; Adami, C.; Arnouts, S.; Bolzonella, M.; Bottini, D.; Coupon, J.; Davidzon, I.; De Lucia, G.; Fritz, A.; Franzetti, P.; Fumana, M.; Garilli, B.; Ilbert, O.; Krywult, J.; Le Brun, V.; Le Fèvre, O.; Maccagni, D.; Małek, K.; Marulli, F.; McCracken, H. J.; Paioro, L.; Polletta, M.; Pollo, A.; Schlagenhaufer, H.; Scodeggio, M.; Tasca, L. A. M.; Tojeiro, R.; Vergani, D.; Zanichelli, A.; Burden, A.; Marchetti, A.; Mellier, Y.; Nichol, R. C.; Peacock, J. A.; Percival, W. J.; Phleps, S.; Wolk, M.

    2016-04-01

    We compare three methods to measure the count-in-cell probability density function of galaxies in a spectroscopic redshift survey. From this comparison we found that, when the sampling is low (the average number of objects per cell is around unity), it is necessary to use a parametric method to model the galaxy distribution. We used a set of mock catalogues of VIPERS to verify whether we were able to reconstruct the cell-count probability distribution once the observational strategy is applied. We find that, in the simulated catalogues, the probability distribution of galaxies is better represented by a Gamma expansion than a skewed log-normal distribution. Finally, we correct the cell-count probability distribution function for the angular selection effect of the VIMOS instrument and study the redshift and absolute magnitude dependence of the underlying galaxy density function in VIPERS from redshift 0.5 to 1.1. We found a very weak evolution of the probability density distribution function and that it is well approximated by a Gamma distribution, independently of the chosen tracers. Based on observations collected at the European Southern Observatory, Cerro Paranal, Chile, using the Very Large Telescope under programmes 182.A-0886 and partly 070.A-9007. Also based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT), which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii. This work is based in part on data products produced at TERAPIX and the Canadian Astronomy Data Centre as part of the Canada-France-Hawaii Telescope Legacy Survey, a collaborative project of NRC and CNRS. The VIPERS web site is http://www.vipers.inaf.it/

  14. Region-based approximation of probability distributions (for visibility between imprecise points among obstacles)

    OpenAIRE

    Buchin, K Kevin; Kostitsyna, I Irina; Löffler, M; Silveira, RI

    2014-01-01

    Let p and q be two imprecise points, given as probability density functions on R^2, and let R be a set of n line segments (obstacles) in R^2. We study the problem of approximating the probability that p and q can see each other; that is, that the segment connecting p and q does not cross any segment of R. To solve this problem, we approximate each density function by a weighted set of polygons; a novel approach to dealing with probability densit...

  15. Projectile Two-dimensional Coordinate Measurement Method Based on Optical Fiber Coding Fire and its Coordinate Distribution Probability

    Science.gov (United States)

    Li, Hanshan; Lei, Zhiyong

    2013-01-01

    To improve projectile coordinate measurement precision in fire measurement systems, this paper introduces the optical fiber coding fire measurement method and principle, sets up its measurement model, and analyzes coordinate errors using the differential method. To study the projectile coordinate position distribution, statistical hypothesis testing was used to analyze the distribution law, the firing dispersion, and the probability of a projectile hitting the object center. The results show that, at the given significance level, an exponential distribution is a reasonable model for the projectile position distribution. Experimentation and calculation show that the optical fiber coding fire measurement method is scientific and feasible, and can give accurate projectile coordinate positions.

  16. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximat...

  17. Effects of Turbulent Aberrations on Probability Distribution of Orbital Angular Momentum for Optical Communication

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yi-Xin; CANG Ji

    2009-01-01

    Effects of atmospheric turbulence tilt, defocus, astigmatism and coma aberrations on the orbital angular momentum measurement probability of photons propagating in the weak turbulent regime are modeled with the Rytov approximation. By considering the resulting wave as a superposition of angular momentum eigenstates, the orbital angular momentum measurement probabilities of the transmitted digit are presented. Our results show that the effect of turbulent tilt aberration on the orbital angular momentum measurement probabilities of photons is the largest among these four kinds of aberrations. As the aberration order increases, the effects of turbulence aberrations on the measurement probabilities of orbital angular momentum generally decrease, whereas the effect of turbulence defocus can be ignored. For tilt aberration, as the difference between the measured orbital angular momentum and the original orbital angular momentum increases, the orbital angular momentum measurement probability decreases.

  18. Direct releases to the surface and associated complementary cumulative distribution functions in the 1996 performance assessments for the Waste Isolation Pilot Plant: Direct brine release

    Energy Technology Data Exchange (ETDEWEB)

    STOELZEL,D.M.; O' BRIEN,D.G.; GARNER,J.W.; HELTON,JON CRAIG; JOHNSON,J.D.; SCOTT,L.N.

    2000-05-19

    The following topics related to the treatment of direct brine releases to the surface environment in the 1996 performance assessment for the Waste Isolation Pilot Plant (WIPP) are presented: (1) mathematical description of models, (2) uncertainty and sensitivity analysis results arising from subjective (i.e., epistemic) uncertainty for individual releases, (3) construction of complementary cumulative distribution functions (CCDFs) arising from stochastic (i.e., aleatory) uncertainty, and (4) uncertainty and sensitivity analysis results for CCDFs. The presented analyses indicate that direct brine releases do not constitute a serious threat to the effectiveness of the WIPP as a disposal facility for transuranic waste. Even when the effects of uncertain analysis inputs are taken into account, the CCDFs for direct brine releases fall substantially to the left of the boundary line specified in the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, 40 CFR 194).

  19. Examining the Tails of Probability Distributions Created Using Uncertainty Methods: A Case Study

    Science.gov (United States)

    Kang, M.; Thomson, N. R.; Sykes, J. F.

    2006-12-01

    Environmental management decisions require an understanding of all possible outcomes especially those with a low likelihood of occurrence; however, despite this need emphasis has been placed on the mean rather than extreme outcomes. Typically in groundwater contaminant transport problems, parameter estimates are obtained using automated parameter estimation packages (e.g., PEST) for a given conceptual model. The resulting parameter estimates and covariance information are used to generate Monte Carlo or Latin Hypercube realizations. Our observations indicate that the capacity of the simulations using parameters from the tails of the corresponding probability distributions often fail to sufficiently replicate field based observations. This stems from the fact that the input parameters governing Monte Carlo type uncertainty analysis method are based on the mean. In order to improve the quality of the realizations at the tails, the Dynamically- Dimensioned Search-Uncertainty Analysis (DDS-UA) method is adopted. This approach uses the Dynamically-Dimensioned Search (DDS) algorithm, which is designed to find multiple local minimums, and a pseudo-likelihood function. To test the robustness of this methodology, we applied it to a contaminant transport problem which involved TCE contamination due to releases from the Lockformer Company Facility in Lisle, Illinois. Contamination has been observed in the Silurian dolomite aquifer underlying the facility, which served as a supply of drinking water. Dissolved TCE is assumed to migrate in a predominantly vertically downward direction through the overburden that underlies the Lockformer site and then migrate horizontally in the underlying aquifer. The model is solved using a semi-analytical solution of the mass conservation equation. The parameter estimation process is complicated by the fact that a concentration level equal or greater than the maximum contaminant level must be observed at specified locations. Penalty functions

  20. Fitting a distribution to censored contamination data using Markov Chain Monte Carlo methods and samples selected with unequal probabilities.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2014-11-18

    The fitting of statistical distributions to chemical and microbial contamination data is a common application in risk assessment. These distributions are used to make inferences regarding even the most pedestrian of statistics, such as the population mean. The reason for the heavy reliance on a fitted distribution is the presence of left-, right-, and interval-censored observations in the data sets, with censored observations being the result of nondetects in an assay, the use of screening tests, and other practical limitations. Considerable effort has been expended to develop statistical distributions and fitting techniques for a wide variety of applications. Of the various fitting methods, Markov Chain Monte Carlo methods are common. An underlying assumption for many of the proposed Markov Chain Monte Carlo methods is that the data represent independent and identically distributed (iid) observations from an assumed distribution. This condition is satisfied when samples are collected using a simple random sampling design. Unfortunately, samples of food commodities are generally not collected in accordance with a strict probability design. Nevertheless, pseudosystematic sampling efforts (e.g., collection of a sample hourly or weekly) from a single location in the farm-to-table continuum are reasonable approximations of a simple random sample. The assumption that the data represent an iid sample from a single distribution is more difficult to defend if samples are collected at multiple locations in the farm-to-table continuum or risk-based sampling methods are employed to preferentially select samples that are more likely to be contaminated. This paper develops a weighted bootstrap estimation framework that is appropriate for fitting a distribution to microbiological samples that are collected with unequal probabilities of selection. An example based on microbial data, derived by the Most Probable Number technique, demonstrates the method and highlights the

  1. A Hot Spots Ignition Probability Model for Low-Velocity Impacted Explosive Particles Based on the Particle Size and Distribution

    Directory of Open Access Journals (Sweden)

    Hong-fu Guo

    2017-01-01

    Particle size and distribution play an important role in ignition. The size and distribution of the cyclotetramethylene tetranitramine (HMX) particles were investigated with a Malvern MS2000 laser particle size analyzer before experiment and calculation. The mean size of the particles is 161 μm. The minimum and maximum sizes are 80 μm and 263 μm, respectively. The distribution function is like a quadratic function. Based on the distribution of micron-scale explosive particles, a microscopic model is established to describe the process of ignition of HMX particles under a drop weight. Both the temperature of the contact zones and the ignition probability of the powder explosive can be predicted. The calculated results show that the temperature of the contact zones between the particles and the drop weight surface increases faster and higher than that of the contact zones between two neighboring particles. For HMX particles, with all other conditions kept constant, if the drop height is less than 0.1 m, the ignition probability is close to 0. When the drop heights are 0.2 m and 0.3 m, the ignition probability is 0.27 and 0.64, respectively, whereas when the drop height is more than 0.4 m, the ignition probability is close to 0.82. In comparison with experimental results, the two curves are reasonably close to each other, which indicates that our model is reasonable.

  2. Remark about Transition Probabilities Calculation for Single Server Queues with Lognormal Inter-Arrival or Service Time Distributions

    Science.gov (United States)

    Lee, Moon Ho; Dudin, Alexander; Shaban, Alexy; Pokhrel, Subash Shree; Ma, Wen Ping

    Formulae required for accurate approximate calculation of transition probabilities of embedded Markov chain for single-server queues of the GI/M/1, GI/M/1/K, M/G/1, M/G/1/K type with heavy-tail lognormal distribution of inter-arrival or service time are given.

  3. RUNS TEST FOR A CIRCULAR DISTRIBUTION AND A TABLE OF PROBABILITIES

    Science.gov (United States)

    of the well-known Wald-Wolfowitz runs test for a distribution on a straight line. The primary advantage of the proposed test is that it minimizes the number of assumptions on the theoretical distribution.

  4. Deduction of compound nucleus formation probability from the fragment angular distributions in heavy-ion reactions

    Science.gov (United States)

    Yadav, C.; Thomas, R. G.; Mohanty, A. K.; Kapoor, S. S.

    2015-07-01

    The presence of various fissionlike reactions in heavy-ion induced reactions is a major hurdle on the path to laboratory synthesis of heavy and super-heavy nuclei. It is known that the cross section for forming a heavy evaporation residue in fusion reactions depends on three factors: the capture cross section, the probability of compound nucleus formation PCN, and the survival probability of the compound nucleus against fission. Because the probability of compound nucleus formation PCN is difficult to estimate theoretically, owing to its complex dependence on several parameters, attempts have been made in the past to deduce it from fission fragment anisotropy data. In the present work, the fragment anisotropy data for a number of heavy-ion reactions are analyzed, and it is found that deduction of PCN from the anisotropy data also requires knowledge of the ratio of the relaxation time of the K degree of freedom to the pre-equilibrium fission time.

  5. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    Science.gov (United States)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominate the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies) where the true nature of slope probability distribution was not discernible at baselines smaller than a kilometer[2,3,5]. Accordingly, historical modeling of lunar surface slopes probability distributions for applications such as in scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution[1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon[7,8], slopes in lunar craters can now be obtained at baselines as low as 6-meters allowing unprecedented multi-scale (multiple baselines) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines[9]. In this work, we extend the analysis from a probability distribution modeling point-of-view with NAC DEMs to characterize the slope statistics for the floors and walls for the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was

  6. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize chaotic maps such as the Logistic map to generate the pseudo-random numbers mapped as the design variables for global optimization. Many existing researches indicated that COA can more easily escape from the local minima than classical stochastic optimization algorithms. This paper reveals the inherent mechanism of high efficiency and superior performance of COA, from a new perspective of both the probability distribution property and search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDF and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve the high efficiency of COA, it is recommended to adopt an appropriate chaotic map generating the desired chaotic sequences with uniform or nearly uniform probability distribution and large Lyapunov exponent.
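
    The two properties the paper compares, the probability density of the chaotic sequence and its Lyapunov exponent, are easy to estimate numerically; a toy computation of ours for the Logistic map x -> 4x(1-x):

        import numpy as np

        n, x = 100000, 0.3
        xs = np.empty(n)
        for i in range(n):
            xs[i] = x
            x = 4.0 * x * (1.0 - x)            # Logistic map with r = 4

        # Lyapunov exponent: mean of log|f'(x)| with f'(x) = 4 - 8x;
        # for r = 4 the exact value is log 2.
        lyap = np.mean(np.log(np.abs(4.0 - 8.0 * xs)))
        print(lyap, np.log(2))

        # The empirical PDF is far from uniform: it piles up near 0 and 1,
        # matching the known invariant density 1 / (pi * sqrt(x(1-x))).
        hist, edges = np.histogram(xs, bins=50, range=(0, 1), density=True)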

  7. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher's linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, which were superior to the results obtained by either the Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.

  8. Probability Distributions of Cost and Sequential Bidding Procedures for Defense Procurement Contracts

    Science.gov (United States)

    1998-02-01

    DeGroot, Morris H., Optimal Statistical Decisions, New York: McGraw-Hill, 1970. Fudenberg, Drew, and Jean Tirole, Game Theory... The expected utility approach was originally developed by Von Neumann and Morgenstern, and is described, for example, in DeGroot.

  9. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  10. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    Science.gov (United States)

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence, even if very low, are represented and maintained.

  11. Uncertainty of Hydrological Drought Characteristics with Copula Functions and Probability Distributions: A Case Study of Weihe River, China

    Directory of Open Access Journals (Sweden)

    Panpan Zhao

    2017-05-01

    This study investigates the sensitivity and uncertainty of hydrological drought frequency and severity in the Weihe Basin, China during 1960–2012, by using six commonly used univariate probability distributions and three Archimedean copulas to fit the marginal and joint distributions of drought characteristics. The Anderson-Darling method is used for testing the goodness-of-fit of the univariate model, and the Akaike information criterion (AIC) is applied to select the best distribution and copula functions. The results demonstrate that there is a very strong correlation between drought duration and drought severity in three stations. The drought return period varies depending on the selected marginal distributions and copula functions and, with an increase of the return period, the differences become larger. In addition, the estimated return periods (both co-occurrence and joint) from the best-fitted copulas are the closest to those from the empirical distribution. Therefore, it is critical to select the appropriate marginal distribution and copula function to model the hydrological drought frequency and severity. The results of this study can not only help drought investigation to select a suitable probability distribution and copula function, but are also useful for regional water resource management. However, a few limitations remain in this study, such as the assumption of stationarity of the runoff series.
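
    The joint return periods in question follow directly from the copula CDF. A self-contained sketch with a Gumbel copula (the parameter values are invented for illustration):

        import numpy as np

        # Gumbel copula C(u, v) = exp(-[(-ln u)^t + (-ln v)^t]^(1/t)).
        def gumbel_copula(u, v, t):
            return np.exp(-(((-np.log(u)) ** t + (-np.log(v)) ** t) ** (1.0 / t)))

        u, v, t = 0.95, 0.95, 2.5   # marginal non-exceedance probabilities, dependence
        mu = 1.0                    # mean interarrival time of drought events (years)

        T_and = mu / (1 - u - v + gumbel_copula(u, v, t))   # duration AND severity exceeded
        T_or = mu / (1 - gumbel_copula(u, v, t))            # duration OR severity exceeded
        print(T_and, T_or)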

  12. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    Science.gov (United States)

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
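
    The quantity the paper computes analytically can be checked by Monte Carlo. A sketch for the 2AFC paradigm (our simplification, with assumed equal-variance Gaussian distributions): the optimal observer picks the interval whose observation has the higher likelihood ratio, so the maximum Pc is the probability that the signal interval wins:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 200000
        sig = stats.norm(1.0, 1.0)       # sensory activity under signal (assumed)
        noi = stats.norm(0.0, 1.0)       # sensory activity under noise (assumed)

        xs = sig.rvs(n, random_state=rng)
        xn = noi.rvs(n, random_state=rng)
        lr = lambda x: sig.pdf(x) / noi.pdf(x)
        pc_max = np.mean(lr(xs) > lr(xn)) + 0.5 * np.mean(lr(xs) == lr(xn))
        print(pc_max)                    # Phi(d'/sqrt(2)) ~ 0.76 for d' = 1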

  13. ASSESSMENT OF ACCURACY OF PRECIPITATION INDEX (SPI) DETERMINED BY DIFFERENT PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Edward Gąsiorek

    2014-11-01

    The use of different calculating methods to compute the standardized precipitation index (SPI) results in various approximations. Methods based on the normal distribution and its transformations, as well as on the gamma distribution, give similar results and may be used interchangeably, whereas the lognormal distribution fitting method is significantly discrepant, especially for extreme values of SPI. It is therefore an open question which method gives the distribution best fitted to empirical data. The aim of this study is to categorize the above-mentioned methods according to their degree of approximation to empirical data from the Observatory of Agro- and Hydrometeorology in Wrocław-Swojec from the years 1964–2009.
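
    The gamma-based variant evaluated here amounts to an equiprobability transform: fit a gamma distribution to the precipitation series, take the fitted CDF value of each observation, and map it through the inverse standard normal CDF. A sketch with synthetic data (real SPI computations also handle zero-precipitation totals separately):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        precip = rng.gamma(shape=2.0, scale=30.0, size=46)   # one total per year

        a, loc, scale = stats.gamma.fit(precip, floc=0.0)    # fit gamma to the series
        cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
        spi = stats.norm.ppf(cdf)                            # equiprobability transform
        print(spi.round(2))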

  14. Understanding statistical power using noncentral probability distributions: Chi-squared, G-squared, and ANOVA

    Directory of Open Access Journals (Sweden)

    Sébastien Hélie

    2007-09-01

    This paper presents a graphical way of interpreting effect sizes when more than two groups are involved in a statistical analysis. This method uses noncentral distributions to specify the alternative hypothesis, and the statistical power can thus be directly computed. This principle is illustrated using the chi-squared distribution and the F distribution. Examples of chi-squared and ANOVA statistical tests are provided to further illustrate the point. It is concluded that power analyses are an essential part of statistical analysis, and that using noncentral distributions provides an argument in favour of using a factorial ANOVA over multiple t tests.
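
    In numbers, the graphical idea reads as follows (a sketch with a made-up effect size): power is the mass of the noncentral distribution lying beyond the central distribution's critical value.

        from scipy.stats import chi2, ncx2

        df, alpha = 4, 0.05
        w, n = 0.3, 200                   # Cohen's w and sample size (assumed)
        nc = n * w ** 2                   # noncentrality parameter of chi-squared

        crit = chi2.ppf(1 - alpha, df)    # critical value under H0 (central)
        power = ncx2.sf(crit, df, nc)     # noncentral mass beyond the critical value
        print(power)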

  15. Sampling informative/complex a priori probability distributions using Gibbs sampling assisted by sequential simulation

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    2010-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample the solutions to non-linear inverse problems. In principle these methods allow incorporation of arbitrarily complex a priori information, but current methods allow only relatively simple a priori information. [...] We combine this algorithm with the Metropolis algorithm to obtain an efficient method for sampling posterior probability densities for nonlinear inverse problems.

  16. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  17. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    Science.gov (United States)

    Furbish, David J.; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan L.

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
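
    The simplest instance of the argument (our illustration, not the paper's full treatment) is the velocity distribution: maximizing the Shannon entropy of a density f(u) on [0, ∞) subject only to normalization and a fixed mean ū yields the exponential form,

        \max_{f}\; -\int_0^{\infty} f(u)\,\ln f(u)\,du
        \quad\text{subject to}\quad
        \int_0^{\infty} f(u)\,du = 1,
        \qquad
        \int_0^{\infty} u\,f(u)\,du = \bar{u}.

    Stationarity of the Lagrangian gives ln f(u) = -1 - λ0 - λ1 u, i.e. an exponential; the two constraints then fix f(u) = (1/ū) exp(-u/ū).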

  18. Twenty-four hour predictions of the solar wind speed peaks by the probability distribution function model

    Science.gov (United States)

    Bussy-Virat, C. D.; Ridley, A. J.

    2016-10-01

    Abrupt transitions from slow to fast solar wind represent a concern for the space weather forecasting community. They may cause geomagnetic storms that can eventually affect systems in orbit and on the ground. Therefore, the probability distribution function (PDF) model was improved to predict enhancements in the solar wind speed. New probability distribution functions allow for the prediction of the peak amplitude and the time to the peak while providing an interval of uncertainty on the prediction. It was found that 60% of the positive predictions were correct, while 91% of the negative predictions were correct, and 20% to 33% of the peaks in the speed were found by the model. This represents a considerable improvement upon the first version of the PDF model. A direct comparison with the Wang-Sheeley-Arge model shows that the PDF model is quite similar, except that it leads to fewer false positive predictions and misses fewer events, especially when the peak reaches very high speeds.

  19. Calculation of Probability Maps Directly from Ordinary Kriging Weights

    Directory of Open Access Journals (Sweden)

    Jorge Kazuo Yamamoto

    2010-03-01

    Full Text Available Probability maps are useful to analyze ores or contaminants in soils and they are helpful to make a decision duringexploration work. These probability maps are usually derived from the indicator kriging approach. Ordinary krigingweights can be used to derive probability maps as well. For testing these two approaches a sample data base was randomlydrawn from an exhaustive data set. From the exhaustive data set actual cumulative distribution functions were determined.Thus, estimated and actual conditional cumulative distribution functions were compared. The vast majority of correlationcoeffi cients between estimated and actual probability maps is greater than 0.75. Not only does the ordinary kriging approachwork, but it also gives slightly better results than median indicator kriging. Moreover, probability maps from ordinary krigingweights are much easier than the traditional approach based on either indicator kriging or median indicator kriging.

  20. Solitary waves for the nonlinear Schrödinger problem with the probability distribution function in the stochastic input case

    Science.gov (United States)

    Abdelrahman, Mahmoud A. E.; Sohaly, M. A.

    2017-08-01

    This work deals with the construction of exact traveling wave solutions for the nonlinear Schrödinger equation by the new Riccati-Bernoulli sub-ODE method. Additionally, we apply this method to study random solutions by finding the probability distribution function when the coefficient in our problem is a random variable. The traveling wave solutions of many equations are expressed, physically or mathematically, by hyperbolic functions, trigonometric functions and rational functions. We discuss our method in the deterministic case and also in the random case, by studying the beta distribution for the random input.

  1. Void probability as a function of the void's shape and scale-invariant models [in studies of spatial galactic distribution]

    Science.gov (United States)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  2. Classical probability density distributions with uncertainty relations for ground states of simple non-relativistic quantum-mechanical systems

    Science.gov (United States)

    Radożycki, Tomasz

    2016-11-01

    The probability density distributions for the ground states of certain model systems in quantum mechanics and for their classical counterparts are considered. It is shown that classical distributions are remarkably improved by incorporating into them the Heisenberg uncertainty relation between position and momentum. Even the crude form of this incorporation makes the agreement between classical and quantum distributions unexpectedly good, except for the small region where classical momenta are large. It is demonstrated that a slight improvement of this form makes the classical distribution very similar to the quantum one in the whole space. The obtained results are much better than those from the WKB method. The paper is devoted to ground states, but the method applies to excited states too.

  3. A Probability Distribution of Surface Elevation for Wind Waves in Terms of the Gram-Charlier Series

    Institute of Scientific and Technical Information of China (English)

    黄传江; 戴德君; 王伟; 钱成春

    2003-01-01

    Laboratory experiments are conducted to study the probability distribution of surface elevation for wind waves, and the convergence of the Gram-Charlier series in describing the surface elevation distribution is discussed. Results show that the agreement between the Gram-Charlier series and the observed distribution becomes better and better as the truncation order of the series increases within a certain range, which is contrary to the phenomenon observed by Huang and Long (1980). It is also shown that the Gram-Charlier series is sensitive to anomalies in the data set, which will make the agreement worse if they are not preprocessed appropriately. Negative values of the probability distribution expressed by the Gram-Charlier series in some ranges of surface elevations are discussed, but the absolute values of the negative values as well as the ranges of their occurrence become gradually smaller as more and more terms are included. Therefore the negative values have no evident effect on the form of the whole surface elevation distribution when the series is truncated at higher orders. Furthermore, a simple recurrence formula is obtained to calculate the coefficients of the Gram-Charlier series, in order to extend the series to high orders conveniently.
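
    A truncated Gram-Charlier (type A) approximation is straightforward to evaluate; the sketch below (with invented skewness and excess-kurtosis coefficients) uses probabilists' Hermite polynomials and also exhibits the negative tail values discussed above:

        import numpy as np
        from scipy.stats import norm
        from numpy.polynomial.hermite_e import hermeval

        skew, kurt = 0.25, 0.15                 # assumed skewness and excess kurtosis
        x = np.linspace(-4, 4, 401)             # surface elevation in standard units

        # Series f(x) = phi(x) [1 + (skew/6) He3(x) + (kurt/24) He4(x)]
        c3 = np.zeros(4); c3[3] = 1.0           # coefficient vector selecting He3
        c4 = np.zeros(5); c4[4] = 1.0           # coefficient vector selecting He4
        f = norm.pdf(x) * (1 + skew / 6 * hermeval(x, c3)
                             + kurt / 24 * hermeval(x, c4))

        print(f.min())   # the truncated series can dip below zero in the tails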

  4. Probability Distributions for Cyclone Key Parameters and Cyclonic Wind Speed for the East Coast of Indian Region

    Directory of Open Access Journals (Sweden)

    Pradeep K. Goyal

    2011-09-01

    This paper presents a study of the probabilistic distribution of key cyclone parameters and the cyclonic wind speed, carried out by analyzing the cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975–2007, with latitude/longitude and landfall locations, is used to map the cyclone tracks in the region of study. Statistical tests were performed to find a best-fit distribution to the track data for each cyclone parameter. These parameters include the central pressure difference, the radius of maximum wind speed, the translation velocity, and the track angle with the site, and they are used to generate digitally simulated cyclones using wind field simulation techniques. For this, different sets of values for all the cyclone key parameters are generated randomly from their probability distributions. Using these simulated values of the cyclone key parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and using the distributions of the cyclone key parameters as published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful for cyclone disaster mitigation.

  5. Research on the behavior of fiber orientation probability distribution function in the planar flows

    Institute of Scientific and Technical Information of China (English)

    ZHOU Kun; LIN Jian-zhong

    2005-01-01

    The equation for the two-dimensional fiber direction vector was solved theoretically to give the fiber orientation distribution in simple shear flow, flow with shears in two directions, extensional flow, and arbitrary planar incompressible flow. The Fokker-Planck equation was solved numerically to validate the theoretical solutions. The stable orientation and the orientation period of the fibers were obtained. The results showed that the fiber orientation distribution depends on the relative, not the absolute, magnitude of the rate-of-strain of the matrix flow. The effect of the fiber aspect ratio on the orientation distribution is insignificant in most conditions, the simple shear case excepted. It was proved that the results for a planar flow can be generalized to the case of a 3-D fiber direction vector.

  6. Multiparameter probability distributions for heavy rainfall modeling in extreme southern Brazil

    Directory of Open Access Journals (Sweden)

    Samuel Beskow

    2015-09-01

    New hydrological insights for the region: The Anderson–Darling and Filliben tests were the most restrictive in this study. Based on the Anderson–Darling test, it was found that the Kappa distribution presented the best performance, followed by the GEV. This finding provides evidence that these multiparameter distributions result, for the region of study, in greater accuracy for the generation of intensity–duration–frequency curves and the prediction of peak streamflows and design hydrographs. As a result, this finding can support the design of hydraulic structures and flood management in river basins.

  7. The Soft Cumulative Constraint

    CERN Document Server

    Petit, Thierry

    2009-01-01

    This research report presents an extension of the Cumulative constraint of the Choco constraint solver, which is useful for encoding over-constrained cumulative problems. This new global constraint uses sweep and task-interval violation-based algorithms.

  8. Is extrapair mating random? On the probability distribution of extrapair young in avian broods

    NARCIS (Netherlands)

    Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan

    2007-01-01

    A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review

  9. Ruin Probabilities and Aggregrate Claims Distributions for Shot Noise Cox Processes

    DEFF Research Database (Denmark)

    Albrecher, H.; Asmussen, Søren

    We consider a risk process Rt where the claim arrival process is a superposition of a homogeneous Poisson process and a Cox process with a Poisson shot noise intensity process, capturing the effect of sudden increases of the claim intensity due to external events. The distribution of the aggregate...

  10. Ground impact probability distribution for small unmanned aircraft in ballistic descent

    DEFF Research Database (Denmark)

    La Cour-Harbo, Anders

    2017-01-01

    Safety is a key factor in all aviation, and while years of development has made manned aviation relatively safe, the same has yet to happen for unmanned aircraft. However, the rapid development of unmanned aircraft technology means that the range of commercial and scientific applications is growing...... equally rapid. At the same time the trend in national and international regulations for unmanned aircraft is to take a risk-based approach, effectively requiring risk assessment for every flight operation. This work addresses the growing need for methods for quantitatively evaluating individual flights...... by modelling the consequences of a ballistic descent of an unmanned aircraft as a result of a major inflight incident. The presented model is a probability density function for the ground impact area based on a second order drag model with probabilistic assumptions on the least well-known parameters...
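    A minimal Monte Carlo sketch of the approach outlined above, assuming a quadratic (second-order) drag model with probabilistic assumptions on the drag coefficient and wind. The mass, altitude, speed, and distribution parameters are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
g, m = 9.81, 2.0               # gravity (m/s^2), aircraft mass (kg): assumed values
h0, v0 = 100.0, 20.0           # initial altitude (m) and horizontal speed (m/s)
n, dt = 2_000, 0.01

impacts = np.empty(n)
for i in range(n):
    # quadratic drag with an uncertain coefficient (a least well-known
    # parameter) plus a random steady horizontal wind
    c = rng.gamma(shape=9.0, scale=0.005)   # lumped drag factor: 0.5*rho*Cd*A
    wind = rng.normal(0.0, 3.0)
    x, z, vx, vz = 0.0, h0, v0, 0.0
    while z > 0.0:
        vrx = vx - wind                     # airspeed component
        speed = np.hypot(vrx, vz)
        ax = -(c / m) * speed * vrx
        az = -g - (c / m) * speed * vz
        vx += ax * dt; vz += az * dt
        x += vx * dt;  z += vz * dt
    impacts[i] = x

print("mean impact distance:", round(impacts.mean(), 1), "m")
print("5%-95% interval     :", np.round(np.percentile(impacts, [5, 95]), 1))
```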

  11. Luminosity distance in Swiss cheese cosmology with randomized voids. II. Magnification probability distributions

    CERN Document Server

    Flanagan, Éanna É; Wasserman, Ira; Vanderveld, R Ali

    2011-01-01

    We study the fluctuations in luminosity distances due to gravitational lensing by large scale (> 35 Mpc) structures, specifically voids and sheets. We use a simplified "Swiss cheese" model consisting of a \\Lambda -CDM Friedman-Robertson-Walker background in which a number of randomly distributed non-overlapping spherical regions are replaced by mass compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz & Wald (1998), which includes the effect of lensing shear. The standard deviation of this distribution is ~ 0.027 magnitudes and the mean is ~ 0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s=1.0, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thic...

  12. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    Science.gov (United States)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.

  13. A near-infrared SETI experiment: probability distribution of false coincidences

    Science.gov (United States)

    Maire, Jérôme; Wright, Shelley A.; Werthimer, Dan; Treffers, Richard R.; Marcy, Geoffrey W.; Stone, Remington P. S.; Drake, Frank; Siemion, Andrew

    2014-07-01

    A Search for Extraterrestrial Intelligence (SETI), based on the possibility of interstellar communication via laser signals, is being designed to extend the search into the near-infrared spectral region (Wright et al., this conference). The dedicated near-infrared (900 to 1700 nm) instrument takes advantage of a new generation of avalanche photodiodes (APD), based on internal discrete amplification. These discrete APD (DAPD) detectors have a high-speed response suited to laser light pulse detection in our experiment. The detection criteria are defined to optimize the trade-off between high detection efficiency and a low rate of false positive coincident signals, which can be produced by detector dark noise, background light, cosmic rays, and astronomical sources. We investigate experimentally how false coincidence rates depend on the number of detectors in parallel, and on the signal pulse height and width. We also examine the corresponding threshold for each of the signals, to optimize the sensitivity while also reducing the false coincidence rate. Lastly, we discuss the analytical solution used to predict the probability of laser pulse detection with multiple detectors.
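    The paper derives its own analytical solution; as a hedged back-of-envelope stand-in, the textbook approximation below shows why requiring an n-fold coincidence suppresses accidentals so strongly. The 10 kHz dark-count rate and 5 ns window are assumed example values.

```python
def accidental_coincidence_rate(rates_hz, window_s):
    """Approximate rate of accidental n-fold coincidences for n independent
    detectors with individual (dark + background) count rates r_i and a common
    coincidence window tau: R ~ n * tau^(n-1) * prod(r_i).
    Valid in the usual regime r_i * tau << 1."""
    n = len(rates_hz)
    prod = 1.0
    for r in rates_hz:
        prod *= r
    return n * window_s ** (n - 1) * prod

# Example: 10 kHz dark-count rate per detector, 5 ns coincidence window.
for n_det in (2, 3, 4):
    rate = accidental_coincidence_rate([1e4] * n_det, 5e-9)
    print(f"{n_det} detectors: {rate:.3e} false coincidences/s "
          f"({rate * 86400:.3e} per day)")
```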

  14. Image denoising in bidimensional empirical mode decomposition domain: the role of Student's probability distribution function.

    Science.gov (United States)

    Lahmiri, Salim

    2016-03-01

    Hybridisation of the bi-dimensional empirical mode decomposition (BEMD) with denoising techniques has been proposed in the literature as an effective approach for image denoising. In this Letter, the Student's probability density function is introduced in the computation of the mean envelope of the data during the BEMD sifting process to make it robust to values that are far from the mean. The resulting BEMD is denoted tBEMD. In order to show the effectiveness of the tBEMD, several image denoising techniques in tBEMD domain are employed; namely, fourth order partial differential equation (PDE), linear complex diffusion process (LCDP), non-linear complex diffusion process (NLCDP), and the discrete wavelet transform (DWT). Two biomedical images and a standard digital image were considered for experiments. The original images were corrupted with additive Gaussian noise with three different levels. Based on peak-signal-to-noise ratio, the experimental results show that PDE, LCDP, NLCDP, and DWT all perform better in the tBEMD than in the classical BEMD domain. It is also found that tBEMD is faster than classical BEMD when the noise level is low. When it is high, the computational cost in terms of processing time is similar. The effectiveness of the presented approach makes it promising for clinical applications.

  15. Predicting Ligand Binding Sites on Protein Surfaces by 3-Dimensional Probability Density Distributions of Interacting Atoms

    Science.gov (United States)

    Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei

    2016-01-01

    Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851

  16. Cumulants and waiting-time distribution of the photon emission from a driven BaF molecule

    Institute of Scientific and Technical Information of China (English)

    古丽姗; 彭勇刚

    2016-01-01

    In this paper, we consider a single BaF molecule driven by an external field. When the symmetry is broken, the states of the BaF molecule exhibit permanent dipole moments. An external laser field excites the BaF molecule from its ground state to its excited state, and a radio-frequency field couples to the permanent dipole moment of the BaF. The first-order and second-order cumulants of the emitted photons and the waiting-time distribution are studied via the recently developed generating-function approach, which is very convenient for studying the counting statistics and the corresponding probability distributions. The results demonstrate that the radio-frequency field can help the BaF molecule absorb photons from the driving field. The second- and third-order waiting-time distributions oscillate with the evolution time, reflecting the oscillation of the states with the external radio-frequency field.

  17. Confidence limits with multiple channels and arbitrary probability distributions for sensitivity and expected background

    CERN Document Server

    Perrotta, A

    2002-01-01

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branchings, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate systematics into the calculation of cross-section upper limits. One of these searches is described as an example.
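    A minimal sketch of such a calculation, assuming (purely for illustration) a lognormal prior for the sensitivity and a gamma prior for the expected background; the observed count and all parameter values are invented for the example, not DELPHI inputs.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
n_obs = 4                      # observed events in the signal region (example)
m = 50_000                     # MC samples of the nuisance parameters

# Arbitrary, non-Gaussian priors for the sensitivity and the expected
# background (illustrative choices only):
eps = rng.lognormal(mean=np.log(0.8), sigma=0.3, size=m)  # efficiency x luminosity
bkg = rng.gamma(shape=2.0, scale=1.0, size=m)             # expected background

# Likelihood marginalized over the nuisances, on a grid of signal strengths s,
# with a flat prior for s >= 0:
s_grid = np.linspace(0.0, 30.0, 601)
like = np.array([poisson.pmf(n_obs, eps * s + bkg).mean() for s in s_grid])

# Normalize and invert the posterior CDF to get the 95% credible upper limit.
cum = np.concatenate(([0.0], np.cumsum(0.5 * (like[1:] + like[:-1]) * np.diff(s_grid))))
upper95 = np.interp(0.95, cum / cum[-1], s_grid)
print(f"95% CL Bayesian upper limit on the signal strength: {upper95:.2f}")
```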

  18. Development of probability distributions for regional climate change from uncertain global mean warming and an uncertain scaling relationship

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available To produce probability distributions for regional climate change in surface temperature and precipitation, a probability distribution for global mean temperature increase has been combined with the probability distributions for the appropriate scaling variables, i.e. the changes in regional temperature/precipitation per degree global mean warming. Each scaling variable is assumed to be normally distributed. The uncertainty of the scaling relationship arises from systematic differences between the regional changes from global and regional climate model simulations and from natural variability. The contributions of these sources of uncertainty to the total variance of the scaling variable are estimated from simulated temperature and precipitation data in a suite of regional climate model experiments conducted within the framework of the EU-funded project PRUDENCE, using an Analysis Of Variance (ANOVA). For the area covered in the 2001–2004 EU-funded project SWURVE, five case study regions (CSRs) are considered: NW England, the Rhine basin, Iberia, Jura lakes (Switzerland) and Mauvoisin dam (Switzerland). The resulting regional climate changes for 2070–2099 vary quite significantly between CSRs, between seasons and between meteorological variables. For all CSRs, the expected warming in summer is higher than that expected for the other seasons. This summer warming is accompanied by a large decrease in precipitation. The uncertainty of the scaling ratios for temperature and precipitation is relatively large in summer because of the differences between regional climate models. Differences between the spatial climate-change patterns of global climate model simulations make significant contributions to the uncertainty of the scaling ratio for temperature. However, no meaningful contribution could be found for the scaling ratio for precipitation due to the small number of global climate models in the PRUDENCE project and natural variability, which is

  19. How to use MATLAB to fit the ex-Gaussian and other probability functions to a distribution of response times

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2008-03-01

    Full Text Available This article discusses how to characterize response time (RT frequency distributions in terms of probability functions and how to implement the necessary analysis tools using MATLAB. The first part of the paper discusses the general principles of maximum likelihood estimation. A detailed implementation that allows fitting the popular ex-Gaussian function is then presented followed by the results of a Monte Carlo study that shows the validity of the proposed approach. Although the main focus is the ex-Gaussian function, the general procedure described here can be used to estimate best fitting parameters of various probability functions. The proposed computational tools, written in MATLAB source code, are available through the Internet.
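    The same fit is easy to reproduce outside MATLAB: SciPy ships the ex-Gaussian as exponnorm, parameterized by a shape K = tau/sigma. A minimal sketch with synthetic response times (all parameter values assumed for the example):

```python
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(4)
# Synthetic response times: ex-Gaussian with mu = 400 ms, sigma = 40 ms, tau = 120 ms.
mu, sigma, tau = 400.0, 40.0, 120.0
rts = exponnorm.rvs(K=tau / sigma, loc=mu, scale=sigma, size=2000, random_state=rng)

# Maximum likelihood fit; SciPy parameterizes the ex-Gaussian as
# exponnorm(K, loc, scale) with K = tau / sigma, loc = mu, scale = sigma.
K_hat, mu_hat, sigma_hat = exponnorm.fit(rts)
tau_hat = K_hat * sigma_hat
print(f"mu = {mu_hat:.1f}  sigma = {sigma_hat:.1f}  tau = {tau_hat:.1f}")
```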

  20. A MULTIVARIATE WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Cheng Lee

    2010-07-01

    Full Text Available A multivariate survival function of the Weibull distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian matrix, and the general moment are derived.

  1. Codon information value and codon transition-probability distributions in short-term evolution

    Science.gov (United States)

    Jiménez-Montaño, M. A.; Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Ramos-Fernández, A.

    2016-07-01

    To understand the way the Genetic Code and the physical-chemical properties of coded amino acids affect accepted amino acid substitutions in short-term protein evolution, taking into account only overall amino acid conservation, we consider an underlying codon-level model. This model employs codon pair-substitution frequencies from an empirical matrix in the literature, modified for single-base mutations only. Ordering the degenerate codons according to their codon information value (Volkenstein, 1979), we found that three-fold and most four-fold degenerate codons, which have low codon values, were best fitted by rank-frequency distributions with constant failure rate (exponentials). In contrast, almost all two-fold degenerate codons, which have high codon values, were best fitted by rank-frequency distributions with variable failure rate (inverse power laws). Six-fold degenerate codons are considered to be doubly assigned. The exceptional behavior of some codons, including non-degenerate codons, is discussed.

  2. Probability density functions for description of diameter distribution in thinned stands of Tectona grandis

    Directory of Open Access Journals (Sweden)

    Julianne de Castro Oliveira

    2012-06-01

    Full Text Available The objective of this study was to evaluate the effectiveness of the Fatigue Life, Fréchet, Gamma, Generalized Gamma, Generalized Logistic, Log-logistic, Nakagami, Beta, Burr, Dagum, Weibull and Hyperbolic distributions in describing diameter distributions in teak stands subjected to thinning at different ages. Data used in this study originated from 238 rectangular permanent plots 490 m2 in size, installed in stands of Tectona grandis L. f. in Mato Grosso state, Brazil. The plots were measured at ages 34, 43, 55, 68, 81, 82, 92, 104, 105, 120, 134 and 145 months on average. Thinning was done on two occasions: the first was systematic at age 81 months, with a basal area intensity of 36%, while the second was selective at age 104 months on average and removed the poorer trees, reducing basal area by 30%. Fittings were assessed by the Kolmogorov-Smirnov goodness-of-fit test. The Log-logistic (3P), Burr (3P), Hyperbolic (3P), Burr (4P), Weibull (3P), Hyperbolic (2P), Fatigue Life (3P) and Nakagami functions provided more satisfactory values for the K-S test than the more commonly used Weibull function.
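    This kind of comparison is straightforward to sketch with SciPy, where the log-logistic appears under the name fisk. The diameter sample below is synthetic (the real data would be the plot measurements described above), and note that fitting and testing on the same data makes the nominal K-S p-values optimistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic stand-in for post-thinning diameters (cm).
d = stats.weibull_min.rvs(3.2, loc=5.0, scale=14.0, size=800, random_state=rng)

candidates = {
    "Weibull (3P)":      stats.weibull_min,
    "Log-logistic (3P)": stats.fisk,    # SciPy's name for the log-logistic
    "Gamma (3P)":        stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(d)                       # maximum likelihood fit
    ks = stats.kstest(d, dist.cdf, args=params)
    print(f"{name:18s} K-S statistic = {ks.statistic:.4f}  p = {ks.pvalue:.3f}")
```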

  3. Confidence Limits with Multiple Channels and Arbitrary Probability Distributions for Sensitivity and Expected Background

    Science.gov (United States)

    Perrotta, Andrea

    A MC method is proposed to compute upper limits, in a pure Bayesian approach, when the errors associated with the experimental sensitivity and with the expected background content are not Gaussian distributed or not small enough to apply the usual approximations. It is relatively easy to extend the procedure to the multichannel case (for instance when different decay branchings, luminosities or experiments have to be combined). Some of the searches for supersymmetric particles performed in the DELPHI experiment at the LEP electron-positron collider use such a procedure to propagate the systematics into the calculation of the cross-section upper limits. One of these searches will be described as an example.

  4. Maxwell and the normal distribution: A colored story of probability, independence, and tendency toward equilibrium

    Science.gov (United States)

    Gyenis, Balázs

    2017-02-01

    We investigate Maxwell's attempt to justify the mathematical assumptions behind his 1860 Proposition IV according to which the velocity components of colliding particles follow the normal distribution. Contrary to the commonly held view we find that his molecular collision model plays a crucial role in reaching this conclusion, and that his model assumptions also permit inference to equalization of mean kinetic energies (temperatures), which is what he intended to prove in his discredited and widely ignored Proposition VI. If we take a charitable reading of his own proof of Proposition VI then it was Maxwell, and not Boltzmann, who gave the first proof of a tendency towards equilibrium, a sort of H-theorem. We also call attention to a potential conflation of notions of probabilistic and value independence in relevant prior works of his contemporaries and of his own, and argue that this conflation might have impacted his adoption of the suspect independence assumption of Proposition IV.

  5. The Homotopic Probability Distribution and the Partition Function for the Entangled System Around a Ribbon Segment Chain

    Institute of Scientific and Technical Information of China (English)

    QIAN Shang-Wu; GU Zhi-Yu

    2001-01-01

    Using the Feynman path integral with topological constraints arising from the presence of one singular line, we find the homotopic probability distribution $P_{nL}$ for the winding number $n$ and the partition function $P_L$ of the entangled system around a ribbon segment chain. We find that when the width $2a$ of the ribbon segment chain increases, the partition function decreases exponentially, whereas the free energy increases by an amount proportional to the square of the width. When the width tends to zero we obtain the same results as those for a single chain with one singular point.

  6. Exact valence bond entanglement entropy and probability distribution in the XXX spin chain and the Potts model.

    Science.gov (United States)

    Jacobsen, J L; Saleur, H

    2008-02-29

    We determine exactly the probability distribution of the number $N_c$ of valence bonds connecting a subsystem of length $L \gg 1$ to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy, $S_{VB} = N_c \ln 2 = (4\ln 2/\pi^2)\ln L$, disproving a recent conjecture that this should be related to the von Neumann entropy, and thus equal to $(1/3)\ln L$. Our results generalize to the $Q$-state Potts model.

  7. A Method for Justification of the View of Observables in Quantum Mechanics and Probability Distributions in Phase Space

    CERN Document Server

    Beniaminov, E M

    2001-01-01

    Some corollaries of certain hypotheses on the observation process of microphenomena are considered. We show that an enlargement of the phase space and of its motion group, together with an account of the diffusive motions of microsystems in the enlarged space (motions which act by small random translations along the enlarged group), leads to observable quantum effects. This approach enables one to recover probability distributions in the phase space for wave functions. The parameters of the model considered here are estimated on the basis of the Lamb shift in the spectrum of the hydrogen atom.

  8. Nonuniversal power law scaling in the probability distribution of scientific citations

    CERN Document Server

    Peterson, G J; Dill, K A; 10.1073/pnas.1010757107

    2010-01-01

    We develop a model for the distribution of scientific citations. The model involves a dual mechanism: in the direct mechanism, the author of a new paper finds an old paper A and cites it. In the indirect mechanism, the author of a new paper finds an old paper A only via the reference list of a newer intermediary paper B, which has previously cited A. By comparison to citation databases, we find that papers having few citations are cited mainly by the direct mechanism. Papers already having many citations (`classics') are cited mainly by the indirect mechanism. The indirect mechanism gives a power-law tail. The `tipping point' at which a paper becomes a classic is about 25 citations for papers published in the Institute for Scientific Information (ISI) Web of Science database in 1981, 31 for Physical Review D papers published from 1975-1994, and 37 for all publications from a list of high h-index chemists assembled in 2007. The power-law exponent is not universal. Individuals who are highly cited have a system...

  9. Structured Coupling of Probability Loss Distributions: Assessing Joint Flood Risk in Multiple River Basins.

    Science.gov (United States)

    Timonina, Anna; Hochrainer-Stigler, Stefan; Pflug, Georg; Jongman, Brenden; Rojas, Rodrigo

    2015-11-01

    Losses due to natural hazard events can be extraordinarily high and difficult to cope with. Therefore, there is considerable interest in estimating the potential impact of current and future extreme events at all scales in as much detail as possible. As hazards typically spread over wider areas, risk assessment must take into account interrelations between regions. Neglecting such interdependencies can lead to a severe underestimation of potential losses, especially for extreme events. This underestimation of extreme risk can lead to the failure of risk-management strategies when they are most needed, namely, in times of unprecedented events. In this article, we suggest a methodology to incorporate such interdependencies in risk via the use of copulas. We demonstrate that by coupling losses, dependencies can be incorporated in risk analysis, avoiding the underestimation of risk. Based on maximum discharge data of river basins and stream networks, we present and discuss different ways to couple loss distributions of basins while explicitly incorporating tail dependencies. We distinguish between coupling methods that require river structure data for the analysis and those that do not. For the latter approach we propose a minimax algorithm to choose coupled basin pairs so that the underestimation of risk is avoided and the use of river structure data is not needed. The proposed methodology is especially useful for large-scale analysis and we motivate and apply our method using the case of Romania. The approach can be easily extended to other countries and natural hazards.
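    A minimal sketch of why the coupling matters, assuming two basins with lognormal loss marginals (invented parameters, not Romania's) joined by a survival Clayton copula, whose dependence sits in the upper tail where joint floods occur. The copula choice and theta value are assumptions of the example, not the paper's calibrated model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, theta = 200_000, 2.0      # theta > 0 sets the strength of the tail dependence

# Sample a Clayton copula by the conditional-inverse method, then rotate it by
# 180 degrees ("survival Clayton") so the dependence sits in the UPPER tail.
u1 = rng.uniform(size=n)
w = rng.uniform(size=n)
u2 = (u1**(-theta) * (w**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)
u1, u2 = 1.0 - u1, 1.0 - u2  # rotation

# Illustrative lognormal loss marginals for the two basins (million EUR):
loss1 = stats.lognorm.ppf(u1, s=1.0, scale=50.0)
loss2 = stats.lognorm.ppf(u2, s=1.2, scale=30.0)

q1, q2 = np.quantile(loss1, 0.99), np.quantile(loss2, 0.99)
p_joint = np.mean((loss1 > q1) & (loss2 > q2))
print("P(both basins exceed their 99% loss):", p_joint)
print("under independence it would be      :", 0.01 * 0.01)
```

The joint exceedance probability comes out roughly an order of magnitude above the independent value, which is exactly the underestimation of extreme risk the abstract warns about.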

  10. Understanding star formation in molecular clouds III. Probability distribution functions of molecular lines in Cygnus X

    CERN Document Server

    Schneider, N; Motte, F; Ossenkopf, V; Klessen, R S; Simon, R; Fechtenbaum, S; Herpin, F; Tremblin, P; Csengeri, T; Myers, P C; Hill, T; Cunningham, M; Federrath, C

    2015-01-01

    Column density (N) PDFs serve as a powerful tool to characterize the physical processes that influence the structure of molecular clouds. Star-forming clouds can best be characterized by lognormal PDFs for the lower N range and a power-law tail for higher N, commonly attributed to turbulence and self-gravity and/or pressure, respectively. We report here on PDFs obtained from observations of 12CO, 13CO, C18O, CS, and N2H+ in the Cygnus X North region and compare to a PDF derived from dust observations with the Herschel satellite. The PDF of 12CO is lognormal for Av~1-30, but is cut for higher Av due to optical depth effects. The PDFs of C18O and 13CO are mostly lognormal for Av~1-15, followed by an excess up to Av~40. Above that value, all CO PDFs drop, most likely due to depletion. The high density tracers CS and N2H+ exhibit only a power law distribution between Av~15 and 400, respectively. The PDF from dust is lognormal for Av~2-15 and has a power-law tail up to Av~500. Absolute values for the molecular lin...

  11. Nonuniversal power law scaling in the probability distribution of scientific citations.

    Science.gov (United States)

    Peterson, George J; Pressé, Steve; Dill, Ken A

    2010-09-14

    We develop a model for the distribution of scientific citations. The model involves a dual mechanism: in the direct mechanism, the author of a new paper finds an old paper A and cites it. In the indirect mechanism, the author of a new paper finds an old paper A only via the reference list of a newer intermediary paper B, which has previously cited A. By comparison to citation databases, we find that papers having few citations are cited mainly by the direct mechanism. Papers already having many citations ("classics") are cited mainly by the indirect mechanism. The indirect mechanism gives a power-law tail. The "tipping point" at which a paper becomes a classic is about 25 citations for papers published in the Institute for Scientific Information (ISI) Web of Science database in 1981, 31 for Physical Review D papers published from 1975-1994, and 37 for all publications from a list of high h-index chemists assembled in 2007. The power-law exponent is not universal. Individuals who are highly cited have a systematically smaller exponent than individuals who are less cited.
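    A toy simulation of this dual mechanism (all rates and sizes assumed; this is a sketch of the model's logic, not the authors' fitted model) reproduces the qualitative picture: direct search dominates for papers with few citations, while reference-list "copying" feeds the heavy tail of the classics.

```python
import random
from collections import Counter

random.seed(7)
p_direct, n_papers, cites_per_paper = 0.5, 200_000, 5

refs = [[] for _ in range(n_papers)]  # reference list of each paper
cited_by = Counter()

for new in range(1, n_papers):
    for _ in range(cites_per_paper):
        old = random.randrange(new)                   # direct: find a random old paper
        if random.random() > p_direct and refs[old]:  # indirect: follow its reference list
            old = random.choice(refs[old])
        refs[new].append(old)
        cited_by[old] += 1

counts = Counter(cited_by.values())  # number of papers with exactly c citations
for c in (1, 5, 25, 50, 100, 200):
    print(f"papers with {c:>3} citations: {counts.get(c, 0)}")
# The counts fall off roughly exponentially at small c and much more slowly at
# large c, the signature of the power-law tail produced by the indirect route.
```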

  12. Computation of steady-state probability distributions in stochastic models of cellular networks.

    Directory of Open Access Journals (Sweden)

    Mark Hallen

    2011-10-01

    Full Text Available Cellular processes are "noisy". In each cell, concentrations of molecules are subject to random fluctuations due to the small numbers of these molecules and to environmental perturbations. While noise varies with time, it is often measured at steady state, for example by flow cytometry. When interrogating aspects of a cellular network by such steady-state measurements of network components, a key need is to develop efficient methods to simulate and compute these distributions. We describe innovations in stochastic modeling coupled with approaches to this computational challenge: first, an approach to modeling intrinsic noise via solution of the chemical master equation, and second, a convolution technique to account for contributions of extrinsic noise. We show how these techniques can be combined in a streamlined procedure for evaluation of different sources of variability in a biochemical network. Evaluation and illustrations are given in analysis of two well-characterized synthetic gene circuits, as well as a signaling network underlying the mammalian cell cycle entry.
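    As a minimal sketch of the intrinsic-noise step, the chemical master equation for a one-species birth-death network can be solved for its steady state by linear algebra on a truncated state space. The rates below are assumed example values; for this simple network the exact stationary law (a Poisson distribution) is available as a check.

```python
import numpy as np
from scipy.stats import poisson

# Birth-death process: constant production k, first-order degradation gamma.
k, gamma, N = 20.0, 1.0, 100          # truncate the state space at N molecules

A = np.zeros((N + 1, N + 1))          # generator: dp/dt = A @ p
for n in range(N + 1):
    if n < N:
        A[n + 1, n] += k              # birth n -> n+1
        A[n, n]     -= k
    if n > 0:
        A[n - 1, n] += gamma * n      # death n -> n-1
        A[n, n]     -= gamma * n

# Solve A p = 0 with sum(p) = 1 by appending a normalization equation.
M = np.vstack([A, np.ones(N + 1)])
b = np.zeros(N + 2); b[-1] = 1.0
p, *_ = np.linalg.lstsq(M, b, rcond=None)

# For this network the exact steady state is Poisson with mean k/gamma = 20.
err = np.abs(p - poisson.pmf(np.arange(N + 1), k / gamma)).max()
print("max |p - Poisson pmf| =", err)
```

The same nullspace idea carries over to multi-species networks, with the state space enumerated and truncated accordingly; the convolution step for extrinsic noise is then applied on top of this intrinsic-noise solution.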

  13. Families of Fokker-Planck equations and the associated entropic form for a distinct steady-state probability distribution with a known external force field.

    Science.gov (United States)

    Asgarani, Somayeh

    2015-02-01

    A method of finding the entropic form for a given stationary probability distribution and a specified potential field is discussed, using the steady-state Fokker-Planck equation. As examples, starting with the Boltzmann and Tsallis distributions and knowing the force field, we obtain the Boltzmann-Gibbs and Tsallis entropies. Also, the associated entropy for the gamma probability distribution is found, which appears to take the form of the gamma function. Moreover, the related Fokker-Planck equations are given for the Boltzmann, Tsallis, and gamma probability distributions.

  14. Systematic Study of Rogue Wave Probability Distributions in a Fourth-Order Nonlinear Schrödinger Equation

    CERN Document Server

    Ying, L H

    2012-01-01

    Nonlinear instability and refraction by ocean currents are both important mechanisms that go beyond the Rayleigh approximation and may be responsible for the formation of freak waves. In this paper, we quantitatively study nonlinear effects on the evolution of surface gravity waves on the ocean, to explore systematically the effects of various input parameters on the probability of freak wave formation. The fourth-order current-modified nonlinear Schrödinger equation (CNLS4) is employed to describe the wave evolution. By solving CNLS4 numerically, we are able to obtain quantitative predictions for the wave height distribution as a function of key environmental conditions such as average steepness, angular spread, and frequency spread of the local sea state. Additionally, we explore the spatial dependence of the wave height distribution, associated with the buildup of nonlinear development.

  15. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    Science.gov (United States)

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification for the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome.

  16. On a six-parameter generalized Burr XII distribution

    OpenAIRE

    A. K. Olapade

    2008-01-01

    In this paper, we derive a probability density function that generalizes the Burr XII distribution. The cumulative distribution function and the $n^{th}$ moment of the generalized distribution are obtained, and the distributions of some order statistics are established. A theorem that relates the new distribution to another statistical distribution is established.

  17. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  18. Fast Hadamard transforms for compressive sensing of joint systems: measurement of a 3.2 million-dimensional bi-photon probability distribution.

    Science.gov (United States)

    Lum, Daniel J; Knarr, Samuel H; Howell, John C

    2015-10-19

    We demonstrate how to efficiently implement extremely high-dimensional compressive imaging of a bi-photon probability distribution. Our method uses fast-Hadamard-transform Kronecker-based compressive sensing to acquire the joint space distribution. We list, in detail, the operations necessary to enable fast-transform-based matrix-vector operations in the joint space to reconstruct a 16.8 million-dimensional image in less than 10 minutes. Within a subspace of that image exists a 3.2 million-dimensional bi-photon probability distribution. In addition, we demonstrate how the marginal distributions can aid in the accuracy of joint space distribution reconstructions.
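    The workhorse behind such reconstructions is the fast transform itself: applying an n x n Hadamard matrix costs O(n log n) instead of O(n^2), which is what makes million-dimensional joint-space operations tractable. A minimal radix-2 sketch follows; the unnormalized convention is a choice of the example, not necessarily the paper's.

```python
import numpy as np

def fwht(a):
    """In-place fast Walsh-Hadamard transform, O(n log n) for n a power of two.
    This is the fast matrix-vector operation that Hadamard-based compressive
    sensing relies on instead of an explicit matrix multiply."""
    a = np.asarray(a, dtype=float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            x, y = a[i:i + h].copy(), a[i + h:i + 2 * h].copy()
            a[i:i + h], a[i + h:i + 2 * h] = x + y, x - y
        h *= 2
    return a

v = np.random.default_rng(8).normal(size=2**16)
Hv = fwht(v)
# Round trip: H(H v) = n * v for the unnormalized transform.
print(np.allclose(fwht(Hv), len(v) * v))
```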

  19. Magnetization curves and probability angular distribution of the magnetization vector in Er2Fe14Si3

    Science.gov (United States)

    Sobh, Hala A.; Aly, Samy H.; Shabara, Reham M.; Yehia, Sherif

    2016-01-01

    Specific magnetic and magneto-thermal properties of Er2Fe14Si3, in the temperature range of 80-300 K, have been investigated using basic laws of classical statistical mechanics in a simple model. In this model, the constructed partition function was used to derive, and therefore calculate, the temperature and/or field dependence of a host of physical properties. Examples of these properties are: the magnetization, magnetic heat capacity, magnetic susceptibility, probability angular distribution of the magnetization vector, and the associated angular dependence of energy. We highlight a correlation between the energy of the system, its magnetization behavior and the angular location of the magnetization vector. Our results show that Er2Fe14Si3 is an easy-axis system in the temperature range 80-114 K, but switches to an easy-plane system at T≥114 K. This transition is also supported both by the temperature dependence of the magnetic heat capacity, which develops a peak at ~114 K, and by the probability landscape, which shows, in zero magnetic field, a prominent peak in the basal plane at T=113.5 K.

  20. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
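    The core scoring idea is simple to sketch: under a null model in which each of n theoretical fragment peaks matches an experimental peak with probability p, the chance of observing at least k matches is a binomial tail. The function below is a hedged illustration of that idea only; it is not ProVerB's actual scoring function, which additionally weights peak intensities.

```python
from math import log10
from scipy.stats import binom

def match_score(k_matched, n_peaks, p_match):
    """-log10 of the binomial tail probability P(X >= k) that at least
    k_matched of n_peaks theoretical fragments match by chance, each with
    probability p_match. Larger scores mean less likely by chance."""
    tail = binom.sf(k_matched - 1, n_peaks, p_match)
    return -log10(tail)

# Example: 12 of 20 predicted fragments matched, 5% chance of a random match.
print(round(match_score(12, 20, 0.05), 1))
```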

  1. Autoregressive processes with exponentially decaying probability distribution functions: applications to daily variations of a stock market index.

    Science.gov (United States)

    Porto, Markus; Roman, H Eduardo

    2002-04-01

    We consider autoregressive conditional heteroskedasticity (ARCH) processes in which the variance σ²(y) depends linearly on the absolute value of the random variable y, as σ²(y) = a + b|y|. While for the standard model, where σ²(y) = a + by², the corresponding probability distribution function (PDF) P(y) decays as a power law for |y| → ∞, in the linear case it decays exponentially, as P(y) ≈ exp(−α|y|), with α = 2/b. We extend these results to the more general case σ²(y) = a + b|y|^q, with 0 < q < 2. When the history of the ARCH process is taken into account, the resulting PDF becomes a stretched exponential even for q = 1, with a stretched exponent β = 2/3, in a much better agreement with the empirical data.
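    A quick simulation contrasts the two variance specifications (all parameter values assumed for the example): the linear-variance process shows exponential-type tails, far lighter than the power-law tails of the standard quadratic ARCH process.

```python
import numpy as np

rng = np.random.default_rng(9)
a, b, T = 1.0, 0.5, 500_000
xi = rng.standard_normal(T)

y_lin, y_quad = np.empty(T), np.empty(T)
y_lin[0] = y_quad[0] = 0.0
for t in range(1, T):
    y_lin[t]  = np.sqrt(a + b * abs(y_lin[t - 1])) * xi[t]   # sigma^2 = a + b|y|
    y_quad[t] = np.sqrt(a + b * y_quad[t - 1]**2) * xi[t]    # standard ARCH

for q in (0.99, 0.999, 0.9999):
    print(f"quantile {q}:  linear {np.quantile(np.abs(y_lin), q):6.2f}   "
          f"quadratic {np.quantile(np.abs(y_quad), q):6.2f}")
# The high quantiles of the quadratic model grow much faster, reflecting its
# power-law tail; the linear model's tails stay close to exponential.
```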

  2. Influence of Coloured Correlated Noises on Probability Distribution and Mean of Tumour Cell Number in the Logistic Growth Model

    Institute of Scientific and Technical Information of China (English)

    HAN Li-Bo; GONG Xiao-Long; CAO Li; WU Da-Jin

    2007-01-01

    An approximate Fokker-Planck equation for the logistic growth model driven by coloured correlated noises is derived by applying the Novikov theorem and the Fox approximation. The steady-state probability distribution (SPD) and the mean of the tumour cell number are analysed. It is found that the SPD is a single-extremum configuration when the degree of correlation between the multiplicative and additive noises, λ, lies in −1 < λ ≤ 0, and can have double extrema for 0 < λ < 1. A configuration transition occurs because of the variation of the noise parameters. A minimum appears in the curve of the mean of the steady-state tumour cell number, 〈x〉, versus λ. The position and the value of the minimum are controlled by the noise correlation times.

  3. The Effect of Probability Distributions in a Failure Criterion for the Reliability of a Passive Safety System

    Energy Technology Data Exchange (ETDEWEB)

    Han, Seok-Jung; Yang, Joon-Eon; Lee, Won-Jea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-05-15

    A safety issue for a Very High Temperature Reactor (VHTR) is to estimate the Reliability of a Passive safety System (RoPS). The Stress-Strength Interference (SSI) approach is widely adopted to estimate the RoPS. Major efforts on the RoPS have addressed the quantification of the operational uncertainty of a passive safety system for a given postulated accident scenario. However, another important problem is to determine the failure criteria of a passive safety system, because there is an ambiguity in the failure criteria for a VHTR due to its inherent safety characteristics. This paper focuses on an investigation of the reliability characteristics due to a change of the probability distribution in a failure criterion for the quantification of the RoPS.

  4. Comparison between the probability distribution of returns in the Heston model and empirical data for stock indexes

    Science.gov (United States)

    Silva, A. Christian; Yakovenko, Victor M.

    2003-06-01

    We compare the probability distribution of returns for the three major stock-market indexes (Nasdaq, S&P500, and Dow-Jones) with an analytical formula recently derived by Drăgulescu and Yakovenko for the Heston model with stochastic variance. For the period of 1982-1999, we find a very good agreement between the theory and the data for a wide range of time lags from 1 to 250 days. On the other hand, deviations start to appear when the data for 2000-2002 are included. We interpret this as a statistical evidence of the major change in the market from a positive growth rate in 1980s and 1990s to a negative rate in 2000s.

  5. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on a given value, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life

  7. Critical Finite Size Scaling Relation of the Order-Parameter Probability Distribution for the Three-Dimensional Ising Model on the Creutz Cellular Automaton

    Institute of Scientific and Technical Information of China (English)

    B. Kutlu; M. Civi

    2006-01-01

    We study the order parameter probability distribution at the critical point for the three-dimensional spin-1/2 and spin-1 Ising models on the simple cubic lattice under periodic boundary conditions.

  8. IGM Constraints from the SDSS-III/BOSS DR9 Ly-alpha Forest Flux Probability Distribution Function

    CERN Document Server

    Lee, Khee-Gan; Spergel, David N; Weinberg, David H; Hogg, David W; Viel, Matteo; Bolton, James S; Bailey, Stephen; Pieri, Matthew M; Carithers, William; Schlegel, David J; Lundgren, Britt; Palanque-Delabrouille, Nathalie; Suzuki, Nao; Schneider, Donald P; Yeche, Christophe

    2014-01-01

    The Ly$\alpha$ forest flux probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the flux PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from SDSS Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS flux PDFs, measured at $\langle z \rangle = [2.3,2.6,3.0]$, are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, $\gamma$, and temperature at mean-density, $T_0$, where $T(\Delta) = T_0 \Delta^{\gamma-1}$. We find that a significant population of partial Lyman-limit systems with a column-density distribution slope of $\beta_\mathrm{pLLS} \sim -2$ are required to explain the data at the low-flux end of the flux PDF, while uncertainties in the mean Ly$\alpha$ forest transmission affect the...

  9. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is split into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.

  10. IGM CONSTRAINTS FROM THE SDSS-III/BOSS DR9 Lyα FOREST TRANSMISSION PROBABILITY DISTRIBUTION FUNCTION

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Khee-Gan; Hennawi, Joseph F. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Spergel, David N. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Weinberg, David H. [Department of Astronomy and Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Hogg, David W. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, Meyer Hall of Physics, New York, NY 10003 (United States); Viel, Matteo [INAF, Osservatorio Astronomico di Trieste, Via G. B. Tiepolo 11, I-34131 Trieste (Italy); Bolton, James S. [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Bailey, Stephen; Carithers, William; Schlegel, David J. [E.O. Lawrence Berkeley National Lab, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Pieri, Matthew M. [Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth PO1 3FX (United Kingdom); Lundgren, Britt [Department of Astronomy, University of Wisconsin, Madison, WI 53706 (United States); Palanque-Delabrouille, Nathalie; Yèche, Christophe [CEA, Centre de Saclay, Irfu/SPP, F-91191 Gif-sur-Yvette (France); Suzuki, Nao [Kavli Institute for the Physics and Mathematics of the Universe (IPMU), The University of Tokyo, Kashiwano-ha 5-1-5, Kashiwa-shi, Chiba (Japan); Schneider, Donald P., E-mail: lee@mpia.de [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2015-02-01

    The Lyα forest transmission probability distribution function (PDF) is an established probe of the intergalactic medium (IGM) astrophysics, especially the temperature-density relationship of the IGM. We measure the transmission PDF from 3393 Baryon Oscillations Spectroscopic Survey (BOSS) quasars from Sloan Digital Sky Survey Data Release 9, and compare with mock spectra that include careful modeling of the noise, continuum, and astrophysical uncertainties. The BOSS transmission PDFs, measured at ⟨z⟩ = [2.3, 2.6, 3.0], are compared with PDFs created from mock spectra drawn from a suite of hydrodynamical simulations that sample the IGM temperature-density relationship, γ, and temperature at mean density, T_0, where T(Δ) = T_0 Δ^(γ−1). We find that a significant population of partial Lyman-limit systems (LLSs) with a column-density distribution slope of β_pLLS ∼ −2 are required to explain the data at the low-transmission end of the transmission PDF, while uncertainties in the mean Lyα forest transmission affect the high-transmission end. After modeling the LLSs and marginalizing over mean transmission uncertainties, we find that γ = 1.6 best describes the data over our entire redshift range, although constraints on T_0 are affected by systematic uncertainties. Within our model framework, isothermal or inverted temperature-density relationships (γ ≤ 1) are disfavored at a significance of over 4σ, although this could be somewhat weakened by cosmological and astrophysical uncertainties that we did not model.

  11. CMB lensing beyond the power spectrum: Cosmological constraints from the one-point probability distribution function and peak counts

    Science.gov (United States)

    Liu, Jia; Hill, J. Colin; Sherwin, Blake D.; Petri, Andrea; Böhm, Vanessa; Haiman, Zoltán

    2016-11-01

    Unprecedentedly precise cosmic microwave background (CMB) data are expected from ongoing and near-future CMB stage III and IV surveys, which will yield reconstructed CMB lensing maps with effective resolution approaching several arcminutes. The small-scale CMB lensing fluctuations receive non-negligible contributions from nonlinear structure in the late-time density field. These fluctuations are not fully characterized by traditional two-point statistics, such as the power spectrum. Here, we use N -body ray-tracing simulations of CMB lensing maps to examine two higher-order statistics: the lensing convergence one-point probability distribution function (PDF) and peak counts. We show that these statistics contain significant information not captured by the two-point function and provide specific forecasts for the ongoing stage III Advanced Atacama Cosmology Telescope (AdvACT) experiment. Considering only the temperature-based reconstruction estimator, we forecast 9 σ (PDF) and 6 σ (peaks) detections of these statistics with AdvACT. Our simulation pipeline fully accounts for the non-Gaussianity of the lensing reconstruction noise, which is significant and cannot be neglected. Combining the power spectrum, PDF, and peak counts for AdvACT will tighten cosmological constraints in the Ωm-σ8 plane by ≈30 %, compared to using the power spectrum alone.

  12. On the multiplicity distribution in statistical model: (I) negative binomial distribution

    CERN Document Server

    Xu, Hao-jie

    2016-01-01

With the distribution of principal thermodynamic variables (e.g., volume) and the probability condition from the reference multiplicity, we develop an improved baseline measure for the multiplicity distribution in the statistical model to replace the traditional Poisson expectation. We demonstrate the mismatches between experimental measurements and previous theoretical calculations of multiplicity distributions. We derive a general expression for the multiplicity distribution, i.e., a conditional probability distribution, in the statistical model and calculate its cumulants under the Poisson approximation in connection with recent data on multiplicity fluctuations. We find that the probability condition from the reference multiplicity is crucial to explain the centrality resolution effect in experiment. With the improved baseline measure for the multiplicity distribution, we can quantitatively reproduce the cumulants (and cumulant products) of the multiplicity distributions of total (net) charges measured in experiments.

  13. Comparison of disjunctive kriging to generalized probability kriging in application to the estimation of simulated and real data

    Energy Technology Data Exchange (ETDEWEB)

    Carr, J.R. (Nevada Univ., Reno, NV (United States). Dept. of Geological Sciences); Mao, Nai-hsien (Lawrence Livermore National Lab., CA (United States))

    1992-01-01

Disjunctive kriging has been compared previously to multigaussian kriging and indicator cokriging for estimation of cumulative distribution functions; it has yet to be compared extensively to probability kriging. Herein, disjunctive kriging and generalized probability kriging are applied to one real and one simulated data set and compared for estimation of the cumulative distribution functions. Generalized probability kriging is an extension, based on generalized cokriging theory, of simple probability kriging for the estimation of the indicator and uniform transforms at each cutoff, Z_k. Disjunctive kriging and generalized probability kriging give similar results for the normally distributed simulated data, but differ considerably for the real data set, which has a non-normal distribution.

  14. Cumulative Paired φ-Entropy

    Directory of Open Access Journals (Sweden)

    Ingo Klein

    2016-07-01

Full Text Available A new kind of entropy will be introduced which generalizes both the differential entropy and the cumulative (residual) entropy. The generalization is twofold. First, we simultaneously define the entropy for cumulative distribution functions (cdfs) and survivor functions (sfs), instead of defining it separately for densities, cdfs, or sfs. Secondly, we consider a general “entropy generating function” φ, the same way Burbea et al. (IEEE Trans. Inf. Theory 1982, 28, 489–495) and Liese et al. (Convex Statistical Distances; Teubner-Verlag, 1987) did in the context of φ-divergences. Combining the ideas of φ-entropy and cumulative entropy leads to the new “cumulative paired φ-entropy” (CPE_φ). This new entropy has already been discussed in at least four scientific disciplines, be it with certain modifications or simplifications. In fuzzy set theory, for example, cumulative paired φ-entropies were defined for membership functions, whereas in uncertainty and reliability theories some variations of CPE_φ were recently considered as measures of information. With a single exception, the discussions in the scientific disciplines appear to have been held independently of each other. We consider CPE_φ for continuous cdfs and show that CPE_φ is a measure of dispersion rather than a measure of information. In the first place, this will be demonstrated by deriving an upper bound which is determined by the standard deviation and by solving the maximum entropy problem under the restriction of a fixed variance. Next, this paper specifically shows that CPE_φ satisfies the axioms of a dispersion measure. The corresponding dispersion functional can easily be estimated by an L-estimator, with all its known asymptotic properties. CPE_φ is the basis for several related concepts like mutual φ-information, φ-correlation, and φ-regression, which generalize Gini correlation and Gini regression. In addition, linear rank tests for scale that

  15. Effects of Nuclear Potential on the Cumulants of Net-Proton and Net-Baryon Multiplicity Distributions in Au+Au Collisions at $\\sqrt{s_{\\text{NN}}} = 5\\,\\text{GeV}$

    CERN Document Server

    He, Shu; Nara, Yasushi; Esumi, ShinIchi; Xu, Nu

    2016-01-01

We analyzed the rapidity and transverse momentum dependence of the cumulants of the net-proton and net-baryon distributions in Au+Au collisions at $\sqrt{s_{\text{NN}}} = 5\,\text{GeV}$ with a microscopic hadronic transport (JAM) model. To study the effects of the mean field potential and the softening of the equation of state (EoS) on the fluctuations of net-protons (baryons) in heavy-ion collisions, the calculations are performed in two different modes. The softening of the EoS is realized in the model by implementing attractive orbits in two-body scattering to introduce a reduction of the pressure of the system. By comparing the results from the two modes with the results from the default cascade, we find that the mean field potential and the softening of the EoS have strong impacts on the rapidity distributions ($\text{d}N/\text{d}y$) and the shape of the net-proton (baryon) multiplicity distributions. The net-proton (baryon) cumulants and their ratios calculated from all three modes show similar trends and significant suppression with respect to unity.

  16. Effects of nuclear potential on the cumulants of net-proton and net-baryon multiplicity distributions in Au+Au collisions at √{sNN} = 5GeV

    Science.gov (United States)

    He, Shu; Luo, Xiaofeng; Nara, Yasushi; Esumi, ShinIchi; Xu, Nu

    2016-11-01

We analyze the rapidity and transverse momentum dependence of the cumulants of the net-proton and net-baryon distributions in Au+Au collisions at √{sNN} = 5 GeV with a microscopic hadronic transport (JAM) model. To study the effects of the mean field potential and the softening of the equation of state (EoS) on the fluctuations of net-protons (baryons) in heavy-ion collisions, the calculations are performed in two different modes. The softening of the EoS is realized in the model by implementing attractive orbits in two-body scattering to introduce a reduction of the pressure of the system. By comparing the results from the two modes with the results from the default cascade, we find that the mean field potential and the softening of the EoS have strong impacts on the rapidity distributions (dN/dy) and the shape of the net-proton (baryon) multiplicity distributions. The net-proton (baryon) cumulants and their ratios calculated from all three modes show similar trends and significant suppression with respect to unity, which can be explained by the presence of baryon number conservation. It indicates that the mean field potential and the softening of the EoS might not be the ingredients responsible for the strong enhancement observed in the most central Au+Au collisions at 7.7 GeV measured by the STAR experiment at RHIC.

  17. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; the center of the probability distribution of a random variable; the definition of the law of large numbers; stability of the sample mean; and the method of moments.

  18. Cumulative fatigue damage models

    Science.gov (United States)

    Mcgaw, Michael A.

    1988-01-01

    The problem of calculating expected component life under fatigue loading conditions is complicated by the fact that component loading histories contain, in many cases, cyclic loads of widely varying amplitudes. In such a case a cumulative damage model is required, in addition to a fatigue damage criterion, or life relationship, in order to compute the expected fatigue life. The traditional cumulative damage model used in design is the linear damage rule. This model, while being simple to use, can yield grossly unconservative results under certain loading conditions. Research at the NASA Lewis Research Center has led to the development of a nonlinear cumulative damage model, named the double damage curve approach (DDCA), that has greatly improved predictive capability. This model, which considers the life (or loading) level dependence of damage evolution, was applied successfully to two polycrystalline materials, 316 stainless steel and Haynes 188. The cumulative fatigue behavior of the PWA 1480 single-crystal material is currently being measured to determine the applicability of the DDCA for this material.

  19. PROBABILITY AND STATISTICS.

    Science.gov (United States)

Keywords: statistical analysis (reports); probability (reports); information theory; differential equations; statistical processes; stochastic processes; multivariate analysis; distribution theory; decision theory; measure theory; optimization.

  20. Determining the Probability Distribution of Hillslope Peak Discharge Using an Analytical Solution of Kinematic Wave Time of Concentration

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2016-04-01

extended to the case of pervious hillslopes, accounting for infiltration. In particular, an analytical solution for the time of concentration for overland flow on a rectangular plane surface was derived using the kinematic wave equation under the Green-Ampt infiltration (Baiamonte and Singh, 2015). The objective of this work is to apply the latter solution to determine the probability distribution of hillslope peak discharge by combining it with the familiar rainfall duration-intensity-frequency approach. References Agnese, C., Baiamonte, G., and Corrao, C. (2001). "A simple model of hillslope response for overland flow generation". Hydrol. Process., 15, 3225-3238, ISSN: 0885-6087, doi: 10.1002/hyp.182. Baiamonte, G., and Agnese, C. (2010). "An analytical solution of kinematic wave equations for overland flow under Green-Ampt infiltration". J. Agr. Eng., vol. 1, p. 41-49, ISSN: 1974-7071. Baiamonte, G., and Singh, V.P. (2015). "Analytical solution of kinematic wave time of concentration for overland flow under Green-Ampt infiltration." J. Hydrol. Eng., ASCE, DOI: 10.1061/(ASCE)HE.1943-5584.0001266. Robinson, J.S., and Sivapalan, M. (1996). "Instantaneous response functions of overland flow and subsurface stormflow for catchment models". Hydrol. Process., 10, 845-862. Singh, V.P. (1976). "Derivation of time of concentration". J. of Hydrol., 30, 147-165. Singh, V.P. (1996). Kinematic-Wave Modeling in Water Resources: Surface-Water Hydrology. John Wiley & Sons, Inc., New York, 1399 pp.

  1. Numerical renormalization group study of probability distributions for local fluctuations in the Anderson-Holstein and Holstein-Hubbard models.

    Science.gov (United States)

    Hewson, Alex C; Bauer, Johannes

    2010-03-24

    We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density ρ(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.

  2. Cumulative Effects of Human Activities on Marine Mammal Populations

    Science.gov (United States)

    2015-09-30

DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Cumulative Effects of Human Activities on Marine Mammal Populations. OBJECTIVES: The National Academies of Sciences, Engineering, and Medicine has convened a volunteer committee that will review the present scientific understanding of cumulative effects of anthropogenic stressors on marine mammals, with a focus on anthropogenic sound.

  3. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  4. Arc Tan- Exponential Type Distribution Induced By Stereographic Projection / Bilinear Transformation On Modified Wrapped Exponential Distribution

    Directory of Open Access Journals (Sweden)

    Phani Y.

    2013-05-01

Full Text Available In this paper we attempt to construct a new three-parameter linear model, which we call the Arc Tan-Exponential Type distribution, by applying a Stereographic Projection (equivalently, a Bilinear transformation) to the Wrapped Exponential distribution. The probability density and cumulative distribution functions of this new model are presented, and their graphs are plotted for various values of the parameters.

  5. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors

    DEFF Research Database (Denmark)

    Nielsen, Bjørn Gilbert; Jensen, Morten Østergaard; Bohr, Henrik

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis...... in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity...

  6. Estimation of most probable power distribution in BWRs by least squares method using in-core measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ezure, Hideo

    1988-09-01

Effective combination of measured data with theoretical analysis has permitted deriving a method for more accurately estimating the power distribution in BWRs. A least squares method is used to combine the relationship between the power distribution and the measured values with the model used in FLARE or in the three-dimensional two-group diffusion code. Trial application of the new method to estimating the power distribution in JPDR-1 has proved that the method provides reliable results.

  7. Cumulative Timers for Microprocessors

    Science.gov (United States)

    Battle, John O.

    2007-01-01

    It has been proposed to equip future microprocessors with electronic cumulative timers, for essentially the same reasons for which land vehicles are equipped with odometers (total-distance-traveled meters) and aircraft are equipped with Hobbs meters (total-engine-operating time meters). Heretofore, there has been no way to determine the amount of use to which a microprocessor (or a product containing a microprocessor) has been subjected. The proposed timers would count all microprocessor clock cycles and could only be read by means of microprocessor instructions but, like odometers and Hobbs meters, could never be reset to zero without physically damaging the chip.

  8. Cumulative Vehicle Routing Problems

    OpenAIRE

Kara, İmdat; Kara, Bahar Yetiş; Yetiş, M. Kadri

    2008-01-01

This paper proposes a new objective function and corresponding formulations for the vehicle routing problem. The new cost function is defined as the product of the distance of the arc and the flow on that arc. We call the vehicle routing problem with this new objective function the Cumulative Vehicle Routing Problem (CumVRP). Integer programming formulations with O(n2) binary variables and O(n2) constraints are developed for both collection and delivery cases. We show that the CumVRP is a gener...

  9. Computation of Probabilities in Causal Models of History of Science

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2006-12-01

Full Text Available The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, “explaining” — in a probabilistic way, in terms of a single causal model — why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p_{Y|X} of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p_{Y|X}(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which are a consequence of the fact that a composition of causes does not follow an exponential distribution, but a “hypoexponential” one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.
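
    The exponential form assumed above is easy to make concrete. The sketch below, which is illustrative and not taken from the paper, evaluates p_{Y|X}(t) = 1 - exp(-λt) and the CDF of a two-stage composition of causes (a hypoexponential distribution); the rate λ and the 20-year half-way time are hypothetical choices.

```python
import numpy as np

def p_cond(t, lam):
    """Cumulative conditional probability that advance Y has occurred by
    time t after X: the time integral of an exponential density with
    rate lam, i.e. p_{Y|X}(t) = 1 - exp(-lam * t)."""
    return 1.0 - np.exp(-lam * t)

def p_composition(t, lam1, lam2):
    """CDF of the sum of two exponential stages (hypoexponential):
    X causes an intermediate advance, which then causes Y."""
    if np.isclose(lam1, lam2):
        # Erlang(2) limit when the two rates coincide
        return 1.0 - (1.0 + lam1 * t) * np.exp(-lam1 * t)
    return 1.0 - (lam2 * np.exp(-lam1 * t) - lam1 * np.exp(-lam2 * t)) / (lam2 - lam1)

# Hypothetical rate chosen so that p = 0.5 after 20 years
lam = np.log(2) / 20.0
print(p_cond(20.0, lam))              # 0.5
print(p_composition(20.0, lam, lam))  # < 0.5: a composition of causes is slower
```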

  10. Probability distribution function values in mobile phones

    Directory of Open Access Journals (Sweden)

Luis Vicente Chamorro Marcillo

    2013-06-01

Full Text Available Engineering, in both its academic and applied forms, like any formal research work, requires the use of statistics, and every inferential statistical analysis requires values of probability distribution functions that are generally available in tables. Managing those tables generally poses physical problems (cumbersome transport and consultation) and operational ones (incomplete lists and limited accuracy). The study, “Probability distribution function values in mobile phones”, determined – through a needs survey applied to students involved in statistics studies at Universidad de Nariño – that the best known and most used values correspond to the Chi-Square, Binomial, Student’s t, and Standard Normal distributions. Similarly, it showed users’ interest in having the values in question available through an alternative medium to correct, at least in part, the problems presented by “the famous tables”. To contribute to a solution, we built software that allows the values of the most commonly used probability distribution functions to be obtained immediately and dynamically on mobile phones.

  11. A Finding Method of Business Risk Factors Using Characteristics of Probability Distributions of Effect Ratios on Qualitative and Quantitative Hybrid Simulation

    Science.gov (United States)

    Samejima, Masaki; Negoro, Keisuke; Mitsukuni, Koshichiro; Akiyoshi, Masanori

We propose a method for finding business risk factors using qualitative and quantitative hybrid simulation in time series. Effect ratios of qualitative arcs in the hybrid simulation vary the output values of the simulation, so we define effect ratios causing risk as business risk factors. Finding business risk factors over the entire range of effect ratios is time-consuming. Since the probability distributions of effect ratios in the present time step are similar to those in the previous time step, the probability distributions in the present time step can be estimated. Our method searches for business risk factors only within the estimated ranges, which makes it effective. Experimental results show that the precision rate and the recall rate are 86%, and search time is decreased by at least 20%.

  12. The difference between the joint probability distributions of apparent wave heights and periods and individual wave heights and periods

    Institute of Scientific and Technical Information of China (English)

    ZHENGGuizhen; JIANGXiulan; HANShuzong

    2004-01-01

The joint distribution of wave heights and periods of individual waves is usually approximated by the joint distribution of apparent wave heights and periods. However, there is a difference between them. This difference is addressed, and the theoretical joint distributions of apparent wave heights and periods due to Longuet-Higgins and Sun are modified to give more reasonable representations of the joint distribution of wave heights and periods of individual waves. The modification overcomes an inherent drawback of these joint PDFs, namely that the mean wave period is infinite. A comparison is made between the modified formulae and the field data of Goda, which shows that the new formulae agree with the measurements better than their original counterparts.

  13. Protein-Protein Interaction Site Predictions with Three-Dimensional Probability Distributions of Interacting Atoms on Protein Surfaces

    Science.gov (United States)

    Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei

    2012-01-01

Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, and specificity were 0.753, 0.519, 0.677, and 0.779, respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in the amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with

  14. Analysis of experimental data on correlations between cumulative particles

    Energy Technology Data Exchange (ETDEWEB)

    Vlasov, A.V.; Doroshkevich, E.A.; Leksin, G.A. [Institute of Theoretical and Experimental Physics, Moscow (Russian Federation)] [and others

    1995-04-01

    Experimental data on correlations between cumulative particles are analyzed. A space-time and energy-transfer pattern of hadron-nucleus interaction based on both correlation data and data on the inclusive spectra of cumulative particles is considered. A new variable that is convenient for describing the production of cumulative particles is proposed using the concept of symmetry between the one-particle and multiparticle distributions. 32 refs., 9 figs., 1 tab.

  15. Existence, uniqueness and regularity of a time-periodic probability density distribution arising in a sedimentation-diffusion problem

    Science.gov (United States)

    Nitsche, Ludwig C.; Nitsche, Johannes M.; Brenner, Howard

    1988-01-01

    The sedimentation and diffusion of a nonneutrally buoyant Brownian particle in vertical fluid-filled cylinder of finite length which is instantaneously inverted at regular intervals are investigated analytically. A one-dimensional convective-diffusive equation is derived to describe the temporal and spatial evolution of the probability density; a periodicity condition is formulated; the applicability of Fredholm theory is established; and the parameter-space regions are determined within which the existence and uniqueness of solutions are guaranteed. Numerical results for sample problems are presented graphically and briefly characterized.

  16. Diagnostics of Rovibrational Distribution of H2 in Low Temperature Plasmas by Fulcher-α band Spectroscopy - on the Reaction Rates and Transition Probabilities

    Institute of Scientific and Technical Information of China (English)

    Xiao Bingjia; Shinichiro Kado; Shin Kajita; Daisuge Yamasaki; Satoru Tanaka

    2005-01-01

A novel fitting procedure is proposed for better determination of the H2 rovibrational distribution from Fulcher-α band spectroscopy. We have recalculated the transition probabilities, and the results show that they deviate from the Franck-Condon approximation, especially for the non-diagonal transitions. We also calculated complete sets of vibrationally resolved cross sections for the electron-impact d³Πu-X³Σg transition based on the semi-classical Gryzinski theory. An example of an experimental study confirms that the current approach provides a tool for better diagnostics of the H2 rovibrational distribution in the electronic ground state.

  17. Cumulative environmental effects. Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

This report presents a compilation of knowledge about the state of the environment and human activity in the Norwegian part of the North Sea and Skagerrak. The report gives an overview of pressures and impacts on the environment from normal activity and in the event of accidents. This is used to assess the cumulative environmental effects, which factors have most impact and where the impacts are greatest, and to indicate which problems are expected to be most serious in the future. The report is intended to provide relevant information that can be used in the management of the marine area in the future. It also provides input for the identification of environmental targets and management measures for the North Sea and Skagerrak. (Author)

  18. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    Science.gov (United States)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

We study different ways of determining the mean distance (r_n) between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…

  19. Investigating anthelmintic efficacy against gastrointestinal nematodes in cattle by considering appropriate probability distributions for faecal egg count data

    Directory of Open Access Journals (Sweden)

    J.W. Love

    2017-04-01

Where FEC data were obtained with less sensitive counting techniques (i.e., McMaster 30 or 15 epg), zero-inflated distributions and their associated measure of central tendency (the arithmetic group mean divided by the proportion of non-zero counts present) were the most appropriate and are recommended; otherwise, apparent anthelmintic efficacy could be misrepresented.

  20. Predicting probability of occurrence and factors affecting distribution and abundance of three Ozark endemic crayfish species at multiple spatial scales

    Science.gov (United States)

    Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.

    2014-01-01

Crayfishes and other freshwater aquatic fauna are particularly at risk globally due to anthropogenic demand, manipulation and exploitation of freshwater resources, and yet they are often understudied. The Ozark faunal region of Missouri and Arkansas harbours a high level of aquatic biological diversity, especially in regard to endemic crayfishes. Three such endemics, Orconectes eupunctus, Orconectes marchandi and Cambarus hubbsi, are threatened by limited natural distribution and the invasions of Orconectes neglectus.

  1. On the Efficient Simulation of the Distribution of the Sum of Gamma-Gamma Variates with Application to the Outage Probability Evaluation Over Fading Channels

    KAUST Repository

    Ben Issaid, Chaouki

    2016-06-01

The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is intimately related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose a mean-shift importance sampling scheme that efficiently evaluates the outage probability of L-branch maximum ratio combining diversity receivers over Gamma-Gamma fading channels. The proposed estimator satisfies the well-known bounded relative error criterion, a well-desired property characterizing the robustness of importance sampling schemes, for both the independent identically distributed and the independent non-identically distributed cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.
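
    As a rough illustration of why importance sampling helps here, the sketch below estimates the left-tail probability of a sum of unit-mean Gamma-Gamma variates by shrinking the scale of each underlying Gamma factor and reweighting with the exact likelihood ratio. This scale-shift scheme and all parameter values are illustrative assumptions, not the paper's mean-shift estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

def outage_is(alpha, beta, L, T, n, c=0.1):
    """Estimate P(sum of L i.i.d. Gamma-Gamma(alpha, beta) variates < T)
    by importance sampling: each underlying Gamma factor is drawn with
    its scale shrunk by c < 1, pushing samples into the left tail, and
    reweighted by the exact Gamma likelihood ratio."""
    th_a, th_b = 1.0 / alpha, 1.0 / beta      # unit-mean scales
    y = rng.gamma(alpha, c * th_a, (n, L))    # proposal samples
    z = rng.gamma(beta, c * th_b, (n, L))
    # likelihood ratio f_target / f_proposal for a Gamma scale change
    w_y = c**alpha * np.exp(-y * (1 / th_a - 1 / (c * th_a)))
    w_z = c**beta * np.exp(-z * (1 / th_b - 1 / (c * th_b)))
    w = np.prod(w_y * w_z, axis=1)
    s = (y * z).sum(axis=1)                   # sum of Gamma-Gamma variates
    return np.mean(w * (s < T))

print(outage_is(alpha=2.0, beta=1.5, L=4, T=0.05, n=100_000))
```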

  2. Probability distribution of the number of distinct sites visited by a random walk on the finite-size fully-connected lattice

    CERN Document Server

    Turban, L

    2016-01-01

    The probability distribution of the number $s$ of distinct sites visited up to time $t$ by a random walk on the fully-connected lattice with $N$ sites is first obtained by solving the eigenvalue problem associated with the discrete master equation. Then, using generating function techniques, we compute the joint probability distribution of $s$ and $r$, where $r$ is the number of sites visited only once up to time $t$. Mean values, variances and covariance are deduced from the generating functions and their finite-size-scaling behaviour is studied. Introducing properly centered and scaled variables $u$ and $v$ for $r$ and $s$ and working in the scaling limit ($t\\to\\infty$, $N\\to\\infty$ with $w=t/N$ fixed) the joint probability density of $u$ and $v$ is shown to be a bivariate Gaussian density. It follows that the fluctuations of $r$ and $s$ around their mean values in a finite-size system are Gaussian in the scaling limit. The same type of finite-size scaling is expected to hold on periodic lattices above the ...
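
    A quick numerical check of the leading behaviour is straightforward, since a walk on the fully-connected lattice is (up to the self-jump convention) uniform sampling with replacement. The sketch below is an illustrative assumption rather than the paper's calculation; it compares the simulated mean of s with the coupon-collector form N(1 - e^(-w)) at fixed w = t/N.

```python
import numpy as np

rng = np.random.default_rng(5)

def distinct_sites(N, t, runs):
    """Simulate a walk on the fully-connected lattice with N sites
    (each step jumps to a uniformly chosen site) and return the number
    s of distinct sites visited up to time t, for many runs."""
    walks = rng.integers(0, N, (runs, t))
    return np.array([len(np.unique(w)) for w in walks])

N, w = 200, 1.0                       # scaling variable w = t/N held fixed
s = distinct_sites(N, int(w * N), 20_000)
# mean: E[s] = N * (1 - (1 - 1/N)**t) ~ N * (1 - exp(-w)) for large N
print(s.mean(), N * (1 - np.exp(-w)))
print(s.std())                        # fluctuations around the mean
```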

  3. The Characters of the Probability Maximum Value for Negative Binomial Distribution

    Institute of Scientific and Technical Information of China (English)

    丁勇

    2016-01-01

The characteristics of the maximum probability value of the negative binomial distribution were explored. The maximum probability value of the negative binomial distribution is a function of p and r, where p is the probability of success in each trial and r is the number of the first successful trial. For given r it is a monotonically increasing continuous function of p, whose derivative fails to exist only when (r-1)/p is an integer; for given p it is a monotonically decreasing function of r.
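
    These monotonicity claims are easy to probe numerically. The sketch below is an illustrative check rather than the paper's proof; it uses scipy's nbinom (which counts failures before the r-th success, an equivalent reindexing of the trial-count convention) and scans the largest single-point probability over grids of p and r.

```python
import numpy as np
from scipy.stats import nbinom

def max_prob(r, p):
    """Largest single-point probability of the negative binomial
    distribution (scipy convention: number of failures before the
    r-th success)."""
    ks = np.arange(0, 2000)          # wide enough to contain the mode
    return nbinom.pmf(ks, r, p).max()

r0 = 5
for p in (0.2, 0.4, 0.6, 0.8):
    print(p, max_prob(r0, p))        # increases with p for fixed r
p0 = 0.5
for r in (2, 5, 10, 20):
    print(r, max_prob(r, p0))        # decreases with r for fixed p
```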

  4. UBIQUITOUS POLLUTANTS FROM CUMULATIVE ...

    Science.gov (United States)

The occurrence of pharmaceuticals and personal care products (PPCPs) as environmental pollutants is a multifaceted issue whose scope continues to become better delineated since the escalation of concerted attention beginning in the 1980s. PPCPs typically occur as trace environmental pollutants (primarily in surface but also in ground waters) as a result of their widespread, continuous, combined usage in a broad range of human and veterinary therapeutic activities and practices. With respect to the risk-assessment paradigm, the growing body of published work has focused primarily on the origin and occurrence of these substances. Comparatively less is known about human and ecological exposure, and even less about the known or even potential hazards associated with exposure to these anthropogenic substances, many of which are highly bioactive. The continually growing, worldwide importance of freshwater resources underscores the need for ensuring that any aggregate or cumulative impacts on water supplies and resultant potential for human or ecological exposure be minimized. This has prompted the more recent investigations on waste treatment processes for one of the major sources of environmental disposition, namely sewage. Despite the paucity of health effects data for long-term, simultaneous exposure to multiple xenobiotics (particularly PPCPs) at low doses (a major toxicological issue that can be described by the

  5. Comparative analysis of methods for modelling the short-term probability distribution of extreme wind turbine loads

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    2016-01-01

    We have tested the performance of statistical extrapolation methods in predicting the extreme response of a multi-megawatt wind turbine generator. We have applied the peaks-over-threshold, block maxima and average conditional exceedance rates (ACER) methods for peaks extraction, combined with four...... levels, based on the assumption that the response tail is asymptotically Gumbel distributed. Example analyses were carried out, aimed at comparing the different methods, analysing the statistical uncertainties and identifying the factors, which are critical to the accuracy and reliability...

  6. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  7. Low-order Probability-weighted Moments Method for Wind Speed Probability Distribution Parameter Estimation

    Institute of Scientific and Technical Information of China (English)

    潘晓春

    2012-01-01

It is necessary to describe the statistical properties of wind speed using the three-parameter Weibull distribution for offshore wind energy resource assessment and utilization. According to the functional relation between the parameters and the probability-weighted moments (PWM), the relation between the shape parameter and the PWM was fitted with a logistic curve. Two parameter-estimation formulae were derived, based on the low-order PWMs of the non-exceedance and exceedance types. Accuracy tests show that these formulae have high precision over a large range. Through comparative analysis against the high-order PWM method on an example, the author concludes that the low-order PWM methods in this paper are worth popularizing.
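
    The paper's formulae for the three-parameter case are not reproduced in the abstract, but the general PWM idea can be illustrated with the simpler two-parameter Weibull, where the first two L-moments (linear combinations of the sample PWMs b0 and b1) give the shape and scale in closed form. The sketch below, with synthetic wind speeds, is an assumption-laden simplification, not the paper's method.

```python
import numpy as np
from math import gamma, log

def weibull_pwm_fit(x):
    """Fit a two-parameter Weibull by matching the first two sample
    L-moments, which are linear combinations of the probability-
    weighted moments b0 and b1."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(n) / (n - 1)) * x) / n   # sample PWM b1
    l1, l2 = b0, 2 * b1 - b0                        # L-moments
    k = -log(2) / np.log1p(-l2 / l1)                # shape: tau = 1 - 2**(-1/k)
    lam = l1 / gamma(1 + 1 / k)                     # scale: l1 = lam*Gamma(1+1/k)
    return k, lam

rng = np.random.default_rng(2)
sample = 8.0 * rng.weibull(2.2, 5000)   # synthetic wind speeds (m/s)
print(weibull_pwm_fit(sample))          # roughly (2.2, 8.0)
```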

  8. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  9. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  10. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  11. A Summary of Background Knowledge of the Most Probable Distribution Theory

    Institute of Scientific and Technical Information of China (English)

    周昱; 魏蔚; 张艳燕; 马晓栋

    2011-01-01

This paper summarizes the basic concepts and basic conclusions needed to derive the most probable distribution theory, and offers some comments and insights on the expression of the basic concepts, the understanding of the basic conclusions, and the connections among all the related knowledge points.

  12. Modeling particle size distributions by the Weibull distribution function

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Zhigang (Rogers Tool Works, Rogers, AR (United States)); Patterson, B.R.; Turner, M.E. Jr (Univ. of Alabama, Birmingham, AL (United States))

    1993-10-01

    A method is proposed for modeling two- and three-dimensional particle size distributions using the Weibull distribution function. Experimental results show that, for tungsten particles in liquid phase sintered W-14Ni-6Fe, the experimental cumulative section size distributions were well fit by the Weibull probability function, which can also be used to compute the corresponding relative frequency distributions. Modeling the two-dimensional section size distributions facilitates the use of the Saltykov or other methods for unfolding three-dimensional (3-D) size distributions with minimal irregularities. Fitting the unfolded cumulative 3-D particle size distribution with the Weibull function enables computation of the statistical distribution parameters from the parameters of the fit Weibull function.
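
    As a concrete illustration of the last step, the sketch below fits the two-parameter Weibull CDF to an empirical cumulative size distribution by least squares; the synthetic data, the median-rank plotting positions, and the parameter values are assumptions for illustration, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(x, k, lam):
    """Two-parameter Weibull cumulative distribution function."""
    return 1.0 - np.exp(-(x / lam) ** k)

# Synthetic stand-in for measured section sizes (micrometres)
rng = np.random.default_rng(3)
sizes = 12.0 * rng.weibull(1.8, 400)

# Empirical CDF at the sorted sizes (median-rank plotting positions)
x = np.sort(sizes)
F_emp = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)

(k, lam), _ = curve_fit(weibull_cdf, x, F_emp, p0=[1.0, x.mean()])
print(k, lam)   # close to the generating values 1.8 and 12.0
```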

  13. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

the following conditions: CP1. μ(U | U) = 1 if U ∈ F′. CP2. μ(V1 ∪ V2 | U) = μ(V1 | U) + μ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. μ(V | U) = μ(V | X) × μ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that μ(· | U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine μ. Moreover, μ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  14. Origins and implications of the relationship between warming and cumulative carbon emissions

    Science.gov (United States)

    Raupach, M. R.; Davis, S. J.; Peters, G. P.; Andrew, R. M.; Canadell, J.; Le Quere, C.

    2014-12-01

    A near-linear relationship between warming (T) and cumulative carbon emissions (Q) is a robust finding from numerous studies. This finding opens biophysical questions concerning (1) its theoretical basis, (2) the treatment of non-CO2 forcings, and (3) uncertainty specifications. Beyond these biophysical issues, a profound global policy question is raised: (4) how can a quota on cumulative emissions be shared? Here, an integrated survey of all four issues is attempted. (1) Proportionality between T and Q is an emergent property of a linear carbon-climate system forced by exponentially increasing CO2 emissions. This idealisation broadly explains past but not future near-proportionality between T and Q: in future, the roles of non-CO2 forcings and carbon-climate nonlinearities become important, and trajectory dependence becomes stronger. (2) The warming effects of short-lived non-CO2 forcers depend on instantaneous rather than cumulative fluxes. However, inertia in emissions trajectories reinstates some of the benefits of a cumulative emissions approach, with residual trajectory dependence comparable to that for CO2. (3) Uncertainties arise from several sources: climate projections, carbon-climate feedbacks, and residual trajectory dependencies in CO2 and other emissions. All of these can in principle be combined into a probability distribution P(T|Q) for the warming T from given cumulative CO2 emissions Q. Present knowledge of P(T|Q) allows quantification of the tradeoff between mitigation ambition and climate risk. (4) Cumulative emissions consistent with a given warming target and climate risk are a finite common resource that will inevitably be shared, creating a tragedy-of-the-commons dilemma. Sharing options range from "inertia" (present distribution of emissions is maintained) to "equity" (cumulative emissions are distributed equally per-capita). Both extreme options lead to emissions distributions that are unrealisable in practice, but a blend of the two

  15. Towards a Global Water Scarcity Risk Assessment Framework: Incorporation of Probability Distributions and Hydro-Climatic Variability

    Science.gov (United States)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-01-01

    Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions towards the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases given all future scenarios, up to greater than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweigh the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.

  17. Sharing a quota on cumulative carbon emissions

    Science.gov (United States)

    Raupach, Michael R.; Davis, Steven J.; Peters, Glen P.; Andrew, Robbie M.; Canadell, Josep G.; Ciais, Philippe; Friedlingstein, Pierre; Jotzo, Frank; van Vuuren, Detlef P.; Le Quéré, Corinne

    2014-10-01

    Any limit on future global warming is associated with a quota on cumulative global CO2 emissions. We translate this global carbon quota to regional and national scales, on a spectrum of sharing principles that extends from continuation of the present distribution of emissions to an equal per-capita distribution of cumulative emissions. A blend of these endpoints emerges as the most viable option. For a carbon quota consistent with a 2 °C warming limit (relative to pre-industrial levels), the necessary long-term mitigation rates are very challenging (typically over 5% per year), both because of strong limits on future emissions from the global carbon quota and also the likely short-term persistence in emissions growth in many regions.

  18. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Directory of Open Access Journals (Sweden)

    Gian Paolo Beretta

    2008-08-01

Full Text Available A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as a part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.

  19. Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints

    Science.gov (United States)

    Beretta, Gian P.

    2008-09-01

A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as a part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.

  20. Simulation studies of multi-line line-of-sight tunable-diode-laser absorption spectroscopy performance in measuring temperature probability distribution function

    Science.gov (United States)

    Zhang, Guang-Le; Liu, Jian-Guo; Kan, Rui-Feng; Xu, Zhen-Yu

    2014-12-01

Line-of-sight tunable-diode-laser absorption spectroscopy (LOS-TDLAS) with multiple absorption lines is introduced for non-uniform temperature measurement. A temperature binning method combined with the Gauss-Seidel iteration method is used to measure the temperature probability distribution function (PDF) along the line-of-sight (LOS). Through 100 simulated measurements, the variation of measurement accuracy is investigated with the number of absorption lines, the number of temperature bins, and the magnitude of temperature non-uniformity. A field model with a 2-T temperature distribution and 15 well-selected absorption lines are used for the simulation study. The reliability of the Gauss-Seidel iteration method is discussed. The result on how measurement accuracy varies with the number of temperature bins differs from previous research results.
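
    For readers unfamiliar with the solver named above, the sketch below shows plain Gauss-Seidel iteration on a small diagonally dominant system. In the temperature-binning setting the unknowns would correspond to bin populations and the equations to line absorbances, but this toy system and all values are illustrative assumptions.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=10_000):
    """Solve A x = b by Gauss-Seidel iteration, sweeping through the
    unknowns and always using the freshest values."""
    A, b = np.asarray(A, float), np.asarray(b, float)
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Diagonally dominant toy system (guarantees convergence)
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([1.0, 5.0, 0.0])
print(gauss_seidel(A, b))   # agrees with np.linalg.solve(A, b)
```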

  1. On the Efficient Simulation of the Distribution of the Sum of Gamma-Gamma Variates with Application to the Outage Probability Evaluation Over Fading Channels

    KAUST Repository

    Issaid, Chaouki ben

    2017-01-26

The Gamma-Gamma distribution has recently emerged in a number of applications ranging from modeling scattering and reverberation in sonar and radar systems to modeling atmospheric turbulence in wireless optical channels. In this respect, assessing the outage probability achieved by some diversity techniques over this kind of channel is of major practical importance. In many circumstances, this is related to the difficult question of analyzing the statistics of a sum of Gamma-Gamma random variables. Answering this question is not a simple matter. This is essentially because outage probabilities encountered in practice are often very small, and hence the use of classical Monte Carlo methods is not a reasonable choice. This lies behind the main motivation of the present work. In particular, this paper proposes a new approach to estimate the left tail of the sum of Gamma-Gamma variates. More specifically, we propose robust importance sampling schemes that efficiently evaluate the outage probability of diversity receivers over Gamma-Gamma fading channels. The proposed estimators satisfy the well-known bounded relative error criterion for both the maximum ratio combining and the equal gain combining cases. We show the accuracy and the efficiency of our approach compared to naive Monte Carlo via some selected numerical simulations.

  2. Transient Cumulative Processes,

    Science.gov (United States)

    1981-11-01

S_{N(t)} < t ≤ S_{N(t)+1}. We require that W(t) be of bounded variation in every finite t interval with probability one and that the random ... This function is of bounded variation in every finite interval and thus its Laplace-Stieltjes transform exists. Lemma 5.6 ... we may write F(t) = X(t)/(m(1+t)) (5.2.18), where X(t) is of bounded variation, tends to zero as t approaches infinity, and if m ≥ 1, F(t) belongs

  3. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...

  4. Assessment of boundary uncertainty in a coal deposit by means of probability kriging

    Energy Technology Data Exchange (ETDEWEB)

    Tercan, A.E. [Hacettepe University, Ankara (Turkey). Dept. of Mining Engineering

    1998-01-01

Uncertainty over the boundary of a coal deposit must be quantified to allow evaluation of the risk involved in mine-planning decisions. Quantification of uncertainty calls for modelling of the conditional cumulative distribution function (ccdf) about an unknown value. Probability kriging is used here to approximate the ccdf. Thickness is introduced as a covariable in assessing the boundary uncertainty of the Kalburcayiri coal deposit in Kangal, Turkey, at regular intervals over the deposit. Comparison of the probability map provided by probability kriging with that of indicator kriging showed no difference between them, possibly because of the undersampled covariable. 10 refs., 5 figs.

  5. Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution

    Institute of Scientific and Technical Information of China (English)

    SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan

    2011-01-01

Regarding the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loadings, an effective method for predicting the fatigue life of a structure under random loadings was studied. Firstly, the probability distribution of the Von Mises stress of a thin-walled structure under random loadings was studied; the analysis suggests that the probability density function of the Von Mises stress process agrees approximately with a two-parameter Weibull distribution. Formulae for calculating the Weibull parameters are given. Based on the Miner linear theory, a method for predicting random sonic fatigue life based on the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated by using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated by using the constructed model. Considering the influence of the wide frequency band, the calculated results were modified. Comparative analysis shows that the sonic fatigue life estimates for the combustor liner structure obtained using the Weibull distribution of the Von Mises stress are somewhat more conservative than those obtained using the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
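
    The life-prediction logic combines the stress-peak PDF with an S-N curve through Miner's rule. The sketch below shows that chain for a two-parameter Weibull peak distribution and a power-law S-N curve N(S) = C·S^(-m); the closed form E[S^m] = λ^m·Γ(1 + m/k) is a standard Weibull moment, but all numerical values are hypothetical, not the paper's data.

```python
from math import gamma

def fatigue_life_hours(k, lam, nu, C, m):
    """Expected fatigue life via Miner's rule: damage per second is
    D = nu * E[1/N(S)] with N(S) = C * S**(-m) and S ~ Weibull(k, lam),
    so D = nu * lam**m * Gamma(1 + m/k) / C and life = 1/D."""
    damage_rate = nu * lam**m * gamma(1 + m / k) / C
    return 1.0 / damage_rate / 3600.0

# Hypothetical numbers: shape 2, scale 40 MPa, 200 stress peaks/s,
# S-N curve N = 1e12 * S**-3 (S in MPa)
print(fatigue_life_hours(k=2.0, lam=40.0, nu=200.0, C=1e12, m=3.0))
```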

  6. Probability Distribution of Airport Capacity Affected by Weather

    Institute of Scientific and Technical Information of China (English)

    张静; 徐肖豪; 王飞

    2011-01-01

    Weather is a major factor causing airport capacity reduction. To reflect the impact of weather on capacity, an n-phase arrival capacity distribution model is established. Historical weather data are translated into a capacity probability distribution for each weather type through a weather-type decision tree. According to the capacity probability distribution of each weather type, probabilistic weather forecasts are translated into probabilistic capacity forecasts using the total probability formula. Weather forecasts for one day are simulated from 5 years of hourly airport weather data, and a set of n-phase arrival capacity distributions based on the weather forecasts is obtained. Simulation results indicate that inclement weather forecasts at different times can be translated into a set of stochastic capacity forecasts, thus meeting the needs of real-time traffic flow management.
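
    The total-probability step described here is simple to reproduce. Below is a minimal sketch in which the weather types, their forecast probabilities, and the per-type capacity distributions are all invented for illustration.

```python
# Combine probabilistic weather-type forecasts with per-type capacity
# distributions via the total probability formula:
#   P(capacity) = sum over wx of P(capacity | wx) * P(wx)
cap_given_weather = {                 # assumed P(capacity | weather type)
    "VMC":   {60: 0.7, 40: 0.3},
    "IMC":   {40: 0.5, 20: 0.5},
    "storm": {20: 0.2, 0: 0.8},
}
weather_forecast = {"VMC": 0.5, "IMC": 0.3, "storm": 0.2}  # assumed P(wx)

capacity_forecast = {}
for wx, p_wx in weather_forecast.items():
    for cap, p_cap in cap_given_weather[wx].items():
        capacity_forecast[cap] = capacity_forecast.get(cap, 0.0) + p_wx * p_cap

print(capacity_forecast)  # probabilistic capacity forecast (arrivals/hour)
```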

  7. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Jim W.; Lawry, Jonathan

    2004-09-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM.
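
    The discrete approximation contrasted with the IRM above can be sketched compactly. The following is an assumption-laden illustration of a Williamson-Downs style p-box discretization for a normal distribution whose mean is only known to lie in an interval; the family, the interval, and the slice count are invented, and the outermost slices are unbounded.

```python
# A distribution with an interval-valued parameter induces lower and
# upper CDFs (a p-box); slicing them yields n random-set focal elements,
# each an interval carrying mass 1/n.
import numpy as np
from scipy import stats

mu_lo, mu_hi, sigma, n = 0.0, 1.0, 1.0, 10   # illustrative interval mean

# The upper CDF comes from the smallest mean, the lower CDF from the largest.
upper_cdf_inv = lambda p: stats.norm.ppf(p, loc=mu_lo, scale=sigma)
lower_cdf_inv = lambda p: stats.norm.ppf(p, loc=mu_hi, scale=sigma)

focal_elements = []
for i in range(n):
    # Focal element i: [upper_cdf_inv(i/n), lower_cdf_inv((i+1)/n)].
    # The first and last slices are unbounded (ppf gives -inf / +inf).
    focal_elements.append((upper_cdf_inv(i / n), lower_cdf_inv((i + 1) / n)))

for a, b in focal_elements:
    print(f"[{a: .3f}, {b: .3f}]  mass {1 / n}")
```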

  8. Effect of correlation on cumulants in heavy-ion collisions

    CERN Document Server

    Mishra, D K; Netrakanti, P K

    2015-01-01

    We study the effects of correlation on the cumulants, and their ratios, of the net-proton multiplicity distribution, which have been measured for central (0-5%) Au+Au collisions at the Relativistic Heavy Ion Collider (RHIC). This effect has been studied assuming the individual proton and anti-proton distributions to be Poisson or Negative Binomial (NBD). In spite of significantly correlated production due to baryon number and electric charge conservation and the kinematical correlations of protons and anti-protons, the measured cumulants of the net-proton distribution follow the independent production model. In the present work we demonstrate how the introduction of correlations affects the cumulants and their ratios for the difference distributions. We also demonstrate this using the proton and anti-proton distributions obtained from the HIJING event generator.

  9. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  10. Lattice QCD results on cumulant ratios at freeze-out

    CERN Document Server

    Karsch, Frithjof

    2016-01-01

    Ratios of cumulants of net proton-number fluctuations measured by the STAR Collaboration show strong deviations from a Skellam distribution, which should describe the thermal properties of cumulant ratios if proton-number fluctuations were generated in equilibrium and a hadron resonance gas (HRG) model provided a suitable description of thermodynamics at the freeze-out temperature. We present some results on sixth-order cumulants entering the calculation of the QCD equation of state at non-zero values of the baryon chemical potential ($\mu_B$) and discuss limitations on the applicability of HRG thermodynamics deduced from a comparison between QCD and HRG model calculations of cumulants of conserved charge fluctuations. We show that basic features of the $\mu_B$-dependence of skewness and kurtosis ratios of net proton-number fluctuations measured by the STAR Collaboration resemble those expected from an $O(\mu_B^2)$ QCD calculation of the corresponding net baryon-number cumulant ratios.
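
    The Skellam baseline invoked here is easy to verify numerically: for a difference of two independent Poisson variables, every odd cumulant equals the difference of the means and every even cumulant equals their sum, so for example C4/C2 = 1. The sketch below checks this with invented means.

```python
# Monte Carlo check of Skellam cumulants for a net-proton proxy.
import numpy as np

rng = np.random.default_rng(0)
mu_p, mu_pbar, n = 5.0, 4.0, 2_000_000      # illustrative Poisson means

net = rng.poisson(mu_p, n) - rng.poisson(mu_pbar, n)

c1 = np.mean(net)                           # C1
d = net - c1
c2 = np.mean(d**2)                          # C2 (variance)
c3 = np.mean(d**3)                          # C3
c4 = np.mean(d**4) - 3 * c2**2              # C4

print("C1, C3 (expect mu_p - mu_pbar = 1):", c1, c3)
print("C2, C4 (expect mu_p + mu_pbar = 9):", c2, c4)
print("C4/C2 (expect 1):", c4 / c2)
```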

  11. Probability Distribution Functions OF 12CO(J = 1-0) Brightness and Integrated Intensity in M51: The PAWS View

    CERN Document Server

    Hughes, Annie; Schinnerer, Eva; Colombo, Dario; Pety, Jerome; Leroy, Adam K; Dobbs, Clare L; Garcia-Burillo, Santiago; Thompson, Todd A; Dumas, Gaelle; Schuster, Karl F; Kramer, Carsten

    2013-01-01

    We analyse the distribution of CO brightness temperature and integrated intensity in M51 at ~40 pc resolution using new CO data from the Plateau de Bure Arcsecond Whirlpool Survey (PAWS). We present probability distribution functions (PDFs) of the CO emission within the PAWS field, which covers the inner 11 x 7 kpc of M51. We find variations in the shape of CO PDFs within different M51 environments, and between M51 and M33 and the Large Magellanic Cloud (LMC). Globally, the PDFs for the inner disk of M51 can be represented by narrow lognormal functions that cover 1 to 2 orders of magnitude in CO brightness and integrated intensity. The PDFs for M33 and the LMC are narrower and peak at lower CO intensities. However, the CO PDFs for different dynamical environments within the PAWS field depart from the shape of the global distribution. The PDFs for the interarm region are approximately lognormal, but in the spiral arms and central region of M51, they exhibit diverse shapes with a significant excess of bright CO...

  12. Probability Distributions over Cryptographic Protocols

    Science.gov (United States)

    2009-06-01

    Cryptyc integrates use of pattern-matching in the spi calculus framework, which in turn allows the specification of nested cryptographic...

  13. A shock process with a non-cumulative damage

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, M.S.; Zarudnij, V.I

    2001-01-01

    Two types of non-cumulative damage shock models are considered. Based on the distribution of the damage caused by a shock affecting a system, intervals with small, intermediate and large damage are introduced. The initial homogeneous Poisson shock process is split into three homogeneous Poisson processes and studied independently. Several criteria of failure are considered, based on the assumption that shocks with a small level of damage are harmless to the system, shocks with a large level of damage result in the system's failure, and shocks with an intermediate level of damage result in the system's failure only with some probability. The second model is based on the assumption that shocks with a small level of damage are harmless to the system if they are not too close to each other. The probability of the system's failure-free performance in [0,t) is derived explicitly. Simple asymptotic exponential approximations are obtained and their accuracy is analyzed. Possible generalizations are discussed.

  14. Simulation modeling of the probability of magmatic disruption of the potential Yucca Mountain Site

    Energy Technology Data Exchange (ETDEWEB)

    Crowe, B.M.; Perry, F.V.; Valentine, G.A. [Los Alamos National Lab., NM (United States); Wallmann, P.C.; Kossik, R. [Golder Associates, Inc., Redmond, WA (United States)

    1993-11-01

    The first phase of risk simulation modeling was completed for the probability of magmatic disruption of a potential repository at Yucca Mountain. E1, the recurrence rate of volcanic events, is modeled using bounds from active basaltic volcanic fields and midpoint estimates of E1. The cumulative probability curves for E1 are generated by simulation modeling using a form of triangular distribution. The 50% estimates are about 5 to 8 × 10⁻⁶ events yr⁻¹. The simulation modeling shows that the cumulative probability distribution for E1 is more sensitive to the probability bounds than to the midpoint estimates. E2 (the disruption probability) is modeled through risk simulation using a normal distribution and midpoint estimates from multiple alternative stochastic and structural models. The 50% estimate of E2 is 4.3 × 10⁻³. The probability of magmatic disruption of the potential Yucca Mountain site is 2.5 × 10⁻⁸ yr⁻¹. This median estimate decreases to 9.6 × 10⁻⁹ yr⁻¹ if E1 is modified for the structural models used to define E2. The Repository Integration Program was tested to compare releases of a simulated repository (without volcanic events) to releases from time histories which may include volcanic disruptive events. Results show that the performance modeling can be used for sensitivity studies of volcanic effects.
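
    The two-factor structure described here (triangular E1, normal E2, product taken as the annual disruption probability) can be sketched in a few lines. This is not the report's code; the bounds, midpoints and spreads below are placeholders.

```python
# Minimal Monte Carlo sketch of the E1 x E2 risk simulation.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

e1 = rng.triangular(left=1e-6, mode=6e-6, right=1e-5, size=n)   # events/yr
e2 = np.clip(rng.normal(loc=4.3e-3, scale=1e-3, size=n), 0, None)

p_disrupt = e1 * e2                       # annual disruption probability
print("median:", np.median(p_disrupt))
print("5%-95%:", np.percentile(p_disrupt, [5, 95]))
```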

  15. Analysis of meteorological droughts and dry spells in semiarid regions: a comparative analysis of probability distribution functions in the Segura Basin (SE Spain)

    Science.gov (United States)

    Pérez-Sánchez, Julio; Senent-Aparicio, Javier

    2017-08-01

    Dry spells are an essential concept of drought climatology that clearly defines the semiarid Mediterranean environment and whose consequences are a defining feature for an ecosystem so vulnerable with regard to water. The present study was conducted to characterize rainfall drought in the Segura River basin in eastern Spain, marked by the strongly seasonal nature of these latitudes. A daily precipitation data set for 29 weather stations over a period of 20 years (1993-2013) was used. Four sets of dry spell lengths (complete series, monthly maxima, seasonal maxima, and annual maxima) were compiled and fitted, for every weather station, with the following probability distribution functions: Burr, Dagum, error, generalized extreme value, generalized logistic, generalized Pareto, Gumbel Max, inverse Gaussian, Johnson SB, Log-Logistic, Log-Pearson 3, Triangular, Weibull, and Wakeby. Only the series of annual maximum spells offers a good fit for all the weather stations, with the Wakeby distribution giving the best result, with a mean p value of 0.9424 for the Kolmogorov-Smirnov test (0.2 significance level). Maps of dry spell duration probabilities for return periods of 2, 5, 10, and 25 years reveal a northeast-southeast gradient, with lengthening periods of daily rainfall below 0.1 mm toward the eastern third of the basin, in the proximity of the Mediterranean slope.
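
    The fit-and-screen step used here can be sketched briefly. Wakeby is not available in scipy, so GEV and Weibull stand in below; the annual-maximum data are synthetic, and the 0.2 significance level follows the paper.

```python
# Fit candidate distributions to annual-maximum dry spell lengths and
# screen them with the Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=30, scale=8, size=20)   # 20 years, synthetic

for dist in (stats.genextreme, stats.weibull_min):
    params = dist.fit(annual_max)
    ks_stat, p_value = stats.kstest(annual_max, dist.name, args=params)
    verdict = "accepted" if p_value > 0.2 else "rejected"
    print(f"{dist.name:12s} KS={ks_stat:.3f} p={p_value:.3f} {verdict}")
```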

  16. Impact of distributed generation on the probability density of voltage sags

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) on studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated on 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage was monitored at the 380 V bus of an industrial consumer sensitive to such sags. Different levels of DG were inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A stochastic Monte Carlo simulation (MCS) study was performed to obtain, at each level of DG, the probability curves of sags and the probability density per voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of the simulation methods of this program and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.

  17. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy-number probability set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem, with the closing operation of fuzzy probability, was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All the mathematical descriptions of the RVFP have the closing operation for fuzzy probability; as a result, the foundation is laid for perfecting the fuzzy probability operation method.

  18. Translation-Invariant Representation for Cumulative Foot Pressure Images

    CERN Document Server

    Zheng, Shuai; Tan, Tieniu

    2010-01-01

    Humans can be distinguished by their different limb movements and unique ground reaction forces. A cumulative foot pressure image is a 2-D accumulation of the ground reaction force during one gait cycle. Although it contains both spatial and temporal pressure distribution information, it suffers from several problems, including shoe variation and noise, when put into practice as a new biometric for pedestrian identification. In this paper, we propose a hierarchical translation-invariant representation for cumulative foot pressure images, inspired by the success of convolutional deep belief networks for digit classification. The key contribution of our approach is a discriminative hierarchical sparse coding scheme which helps to learn useful discriminative high-level visual features. Based on this feature representation of cumulative foot pressure images, we develop a pedestrian recognition system which is invariant to three different shoes and slight local shape changes. Experiments are conducted on...

  19. Improved curvelet thresholding denoising method using the Chi-Squared cumulative distribution function and PDE

    Institute of Scientific and Technical Information of China (English)

    崔华; 严玍伻

    2014-01-01

    To circumvent the visual distortion due to the discontinuity of the hard threshold function and the constant reconstruction deviation caused by the soft threshold function, this paper presents a novel threshold function based on the Chi-Squared cumulative distribution function, designed according to the distribution characteristics of the curvelet coefficients of the noise and the properties that highly effective curvelet threshold functions should have. Further, in order to eliminate the surrounding effect inherent in curvelet threshold denoising methods and to achieve a better balance between detail preservation and noise reduction, the useful information contained in the image denoised by a partial differential equation (PDE) method is fused into the image denoised by the novel threshold function. Theoretical analysis and simulation results both show that, compared with the soft and hard threshold denoising methods, the proposed denoising method effectively improves the denoising effect and visual quality.

  20. Characteristics of the spatiotemporal distribution of daily extreme temperature events in China: Minimum temperature records in different climate states against the background of the most probable temperature

    Institute of Scientific and Technical Information of China (English)

    Qian Zhong-Hua; Hu Jing-Guo; Feng Guo-Lin; Cao Yong-Zhong

    2012-01-01

    Based on the skewed function, the most probable temperature is defined, and the spatiotemporal distributions of the frequencies and strengths of extreme temperature events in different climate states over China are investigated, where the climate states are referred to as State Ⅰ, State Ⅱ and State Ⅲ, i.e., the daily minimum temperature records of 1961-1990, 1971-2000, and 1981-2009. The results show that, in space, the frequency of high temperature events in summer decreases clearly in the lower and middle reaches of the Yellow River in State Ⅰ, and that low temperature events decrease in northern China in State Ⅱ. In the present state, the frequency of high temperature events increases significantly in most areas over China except the northeast, while the frequency of low temperature events decreases mainly in north China and the regions between the Yangtze River and the Yellow River. The distributions of frequencies and strengths of extreme temperature events are consistent in space. The analysis of the time evolution of extreme events shows that the occurrence of high temperature events becomes more frequent with the change in state, while that of low temperature events decreases. High temperature events are becoming stronger as well and deserve special attention.

  1. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control.

    Science.gov (United States)

    Buffa, F M; Nahum, A E

    2000-10-01

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, sigma(d); whilst the quantities d and sigma(d) depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens with increasing statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When the number of energy deposition events is increased, either by increasing the voxel dimensions or by increasing the number of histories from the source, the DVH broadening decreases and the tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular, this error decreases with increasing biological heterogeneity, whereas it becomes significant under the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of 10^8 histories from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error...

  2. Searching with Probabilities

    Science.gov (United States)

    1983-07-26

    ...distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based...

  3. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute an integrative water quality index. By applying a Monte Carlo technique based on non-parametric probability distributions, the randomness of the model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimates using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status, with the membership of water quality in the various output fuzzy sets or classes provided as percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies.
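
    The stochastic input step (kernel-smoothed densities plus Monte Carlo sampling) can be sketched for a single variable. The data below are synthetic and the fuzzy inference step is omitted; only the non-parametric sampling idea is shown.

```python
# Fit a kernel-smoothed density to monitoring data, then draw Monte
# Carlo samples from it to feed a downstream index.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
dissolved_oxygen = rng.normal(6.5, 1.2, size=52)   # weekly samples, mg/L

kde = gaussian_kde(dissolved_oxygen)               # kernel smoothing fit
mc_samples = kde.resample(10_000).ravel()          # Monte Carlo inputs

print("P(DO < 4 mg/L) ~", np.mean(mc_samples < 4.0))
```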

  4. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  5. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  6. Cumulant matching for independent source extraction.

    Science.gov (United States)

    Phlypo, Ronald; Zarzoso, Vicente; Comon, Pierre; Lemahieu, Ignace

    2008-01-01

    In this work we show how one can make use of priors on signal statistics, in the form of cumulant guesses, to extract an independent source from an observed mixture. The advantage of using statistical priors on the signal lies in the fact that no specific knowledge is needed about its temporal behavior, nor about its spatial distribution. We show that these statistics can be obtained either by reasoning on the theoretical values of a supposed waveform, or by using a subset of the observations whose statistics are merely hindered by interferences. Results on an electrocardiographic recording confirm the above assumptions.

  7. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  8. Modeling light-tailed and right-skewed data with a new asymmetric distribution

    OpenAIRE

    Cadena, Meitner

    2016-01-01

    A new three-parameter cumulative distribution function defined on (α,∞), for some α ≥ 0, with an asymmetric probability density function showing exponential decay in both tails, is introduced. The new distribution is close to familiar distributions like the gamma and log-normal distributions, but it has its own distinctive features and generalizes neither of them. Hence, the new distribution constitutes a new alternative for fitting light-tailed behaviors of high extreme...

  9. Innovativeness, population size and cumulative cultural evolution.

    Science.gov (United States)

    Kobayashi, Yutaka; Aoki, Kenichi

    2012-08-01

    Henrich [Henrich, J., 2004. Demography and cultural evolution: how adaptive cultural processes can produce maladaptive losses-the Tasmanian case. Am. Antiquity 69, 197-214] proposed a model designed to show that larger population size facilitates cumulative cultural evolution toward higher skill levels. In this model, each newborn attempts to imitate the most highly skilled individual of the parental generation by directly-biased social learning, but the skill level he/she acquires deviates probabilistically from that of the exemplar (cultural parent). The probability that the skill level of the imitator exceeds that of the exemplar can be regarded as the innovation rate. After reformulating Henrich's model rigorously, we introduce an overlapping-generations analog based on the Moran model and derive an approximate formula for the expected change per generation of the highest skill level in the population. For large population size, our overlapping-generations model predicts a much larger effect of population size than Henrich's discrete-generations model. We then investigate by way of Monte Carlo simulations the case where each newborn chooses as his/her exemplar the most highly skilled individual from among a limited number of acquaintances. When the number of acquaintances is small relative to the population size, we find that a change in the innovation rate contributes more than a proportional change in population size to the cumulative cultural evolution of skill level.
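
    A simplified, illustrative variant of the directly biased transmission process described above can be simulated in a few lines. This is not the authors' exact model: the loss and noise parameters alpha and beta, the Gumbel noise term, and the population size are assumptions chosen only to show the mechanism by which an imitator exceeds the exemplar with some probability.

```python
# Each newborn imitates the most skilled individual; acquired skill is
# the exemplar's level minus a systematic loss alpha plus Gumbel noise,
# so skill exceeds the exemplar's only when the noise exceeds alpha.
import numpy as np

rng = np.random.default_rng(3)
N, generations = 1000, 200
alpha, beta = 1.0, 0.8          # assumed imitation loss and noise scale

skill = np.zeros(N)
for _ in range(generations):
    z_max = skill.max()                              # exemplar's skill
    skill = z_max - alpha + rng.gumbel(0.0, beta, size=N)

print("mean skill after 200 generations:", skill.mean())
```

    Rerunning with smaller N shows the population-size effect qualitatively: the expected maximum of the noise grows with the number of imitators, so larger populations sustain faster cumulative gains.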

  10. Detecting spatial patterns with the cumulant function – Part 1: The theory

    Directory of Open Access Journals (Sweden)

    P. Naveau

    2008-02-01

    In climate studies, detecting spatial patterns that largely deviate from the sample mean still remains a statistical challenge. Although a Principal Component Analysis (PCA), or equivalently an Empirical Orthogonal Functions (EOF) decomposition, is often applied for this purpose, it provides meaningful results only if the underlying multivariate distribution is Gaussian. Indeed, PCA is based on optimizing second order moments, and the covariance matrix captures the full dependence structure of multivariate Gaussian vectors. Whenever the application at hand cannot satisfy this normality hypothesis (e.g. precipitation data), alternatives and/or improvements to PCA have to be developed and studied. To go beyond this second order statistics constraint, which limits the applicability of PCA, we take advantage of the cumulant function, which can produce higher order moments information. The cumulant function, well known in the statistical literature, allows us to propose a new, simple and fast procedure to identify spatial patterns for non-Gaussian data. Our algorithm consists in maximizing the cumulant function. Three families of multivariate random vectors, for which explicit computations are obtained, are implemented to illustrate our approach. In addition, we show that our algorithm corresponds to selecting the directions along which projected data display the largest spread over the marginal probability density tails.
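
    A minimal sketch of the maximization step, under the assumption that the empirical cumulant function of the projected data at a fixed argument t, K(w) = log mean exp(t <w, X>), is the objective; the data, the value of t, and the optimizer are illustrative choices, not the authors'.

```python
# Maximize the empirical cumulant function over unit-norm directions,
# so the selected direction has the heaviest upper-tail projections.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
X = rng.gamma(2.0, 1.0, size=(500, 3))    # non-Gaussian "spatial" data
X = X - X.mean(axis=0)
t = 0.5                                   # assumed fixed argument

def neg_cumulant(w):
    w = w / np.linalg.norm(w)             # restrict to unit directions
    return -np.log(np.mean(np.exp(t * (X @ w))))

res = minimize(neg_cumulant, x0=np.ones(3))
w_star = res.x / np.linalg.norm(res.x)
print("direction maximizing the cumulant function:", w_star)
```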

  11. Conditional Probability Statistical Distributions in Variant Measurement Simulations

    Institute of Scientific and Technical Information of China (English)

    郑智捷

    2011-01-01

    Under the conditional probability model, intrinsic wave-like statistical distributions are observed, in their spatial statistical distributions, under both normal and interfered conditions.

  12. Summed Probability Distribution of 14C Dates Suggests Regional Divergences in the Population Dynamics of the Jomon Period in Eastern Japan.

    Directory of Open Access Journals (Sweden)

    Enrico R Crema

    Recent advances in the use of summed probability distributions (SPD) of calibrated 14C dates have opened new possibilities for studying prehistoric demography. The degree of correlation between climate change and population dynamics can now be accurately quantified, and divergences in the demographic history of distinct geographic areas can be statistically assessed. Here we contribute to this research agenda by reconstructing the prehistoric population change of Jomon hunter-gatherers between 7,000 and 3,000 cal BP. We collected 1,433 14C dates from three different regions in Eastern Japan (Kanto, Aomori and Hokkaido) and established that the observed fluctuations in the SPDs were statistically significant. We also introduced a new non-parametric permutation test for comparing multiple sets of SPDs that highlights points of divergence in the population history of different geographic regions. Our analyses indicate a general rise-and-fall pattern shared by the three regions but also some key regional differences during the 6th millennium cal BP. The results confirm some of the patterns suggested by previous archaeological studies based on house and site counts but offer statistical significance and an absolute chronological framework that will enable future studies aiming to establish potential correlations with climatic changes.
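
    As a rough illustration of how an SPD is assembled (not the authors' pipeline), the sketch below sums per-date densities over a calendar grid. A normal curve stands in for a proper IntCal calibration, and all dates are synthetic.

```python
# Build a toy summed probability distribution from uncalibrated-style
# date estimates (mean, error), approximated as normal densities.
import numpy as np

rng = np.random.default_rng(11)
dates = rng.uniform(3000, 7000, size=300)    # cal BP, synthetic
errors = rng.uniform(30, 80, size=300)

grid = np.arange(3000, 7001)                 # calendar years BP
spd = np.zeros_like(grid, dtype=float)
for mu, sd in zip(dates, errors):
    spd += np.exp(-0.5 * ((grid - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

spd /= len(dates)   # each density integrates to 1, so this gives unit area
```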

  13. The neolithic demographic transition in Europe: correlation with juvenility index supports interpretation of the summed calibrated radiocarbon date probability distribution (SCDPD) as a valid demographic proxy.

    Directory of Open Access Journals (Sweden)

    Sean S Downey

    Analysis of the proportion of immature skeletons recovered from European prehistoric cemeteries has shown that the transition to agriculture after 9000 BP triggered a long-term increase in human fertility. Here we compare the largest analysis of European cemeteries to date with an independent line of evidence, the summed calibrated date probability distribution (SCDPD) of radiocarbon dates from archaeological sites. Our cemetery reanalysis confirms increased growth rates after the introduction of agriculture; the radiocarbon analysis also shows this pattern, and a significant correlation between both lines of evidence confirms the demographic validity of SCDPDs. We analyze the areal extent of Neolithic enclosures and demographic data from ethnographically known farming and foraging societies, and we estimate differences in population levels at individual sites. We find little effect on the overall shape and precision of the SCDPD, and we observe a small increase in the correlation with the cemetery trends. The SCDPD analysis supports the hypothesis that the transition to agriculture dramatically increased demographic growth, but it was followed within centuries by a general pattern of collapse, even after accounting for higher settlement densities during the Neolithic. The study supports the unique contribution of SCDPDs as a valid demographic proxy for the demographic patterns associated with early agriculture.

  14. Evaluation of the mercury contamination in mushrooms of genus Leccinum from two different regions of the world: Accumulation, distribution and probable dietary intake.

    Science.gov (United States)

    Falandysz, Jerzy; Zhang, Ji; Wang, Yuanzhong; Krasińska, Grażyna; Kojta, Anna; Saba, Martyna; Shen, Tao; Li, Tao; Liu, Honggao

    2015-12-15

    This study focused on the accumulation and distribution of mercury (Hg) in mushrooms of the genus Leccinum that emerged on soils of totally different geochemical bedrock composition. Hg was measured in 6 species from geographically diverse regions of the mercuriferous belt areas in Yunnan of SW China, and in 8 species from the non-mercuriferous regions of Poland in Europe. Also assessed was the probable dietary intake of Hg from consumption of Leccinum spp., which are traditional organic food items in SW China and Poland. The results showed that L. chromapes, L. extremiorientale, L. griseum and L. rugosicepes are good accumulators of Hg, and the sequestered Hg in caps was up to 4.8, 3.5, 3.6 and 4.7 mg Hg kg⁻¹ dry matter, respectively. Leccinum mushrooms from Poland also efficiently accumulated Hg, with their average Hg content being an order of magnitude lower due to the low concentrations of Hg in the forest topsoil of Poland compared to the elevated contents in Yunnan. Consumption of Leccinum mushrooms with elevated Hg contents in Yunnan at rates of up to 300 g of fresh product per week during the foraging season would not result in Hg intake that exceeds the provisional weekly tolerance limit of 0.004 mg kg⁻¹ body mass, assuming no Hg ingestion from other foods.

  15. Exploiting an ensemble of regional climate models to provide robust estimates of projected changes in monthly temperature and precipitation probability distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Tapiador, Francisco J.; Sanchez, Enrique; Romera, Raquel (Inst. of Environmental Sciences, Univ. of Castilla-La Mancha (UCLM), 45071 Toledo (Spain)). e-mail: francisco.tapiador@uclm.es

    2009-07-01

    Regional climate models (RCMs) are dynamical downscaling tools aimed at improving the modelling of local physical processes. Ensembles of RCMs are widely used to improve the coarse-grain estimates of global climate models (GCMs), since the use of several RCMs helps to palliate uncertainties arising from different dynamical cores and numerical schemes. In this paper, we analyse the differences and similarities in the climate change response for an ensemble of heterogeneous RCMs forced by one GCM (HadAM3H) and one emissions scenario (IPCC's SRES-A2 scenario). In contrast to previous approaches using the PRUDENCE database, the statistical description of climate characteristics is made through the spatial and temporal aggregation of the RCM outputs into probability distribution functions (PDFs) of monthly values. This procedure is a complementary approach to conventional seasonal analyses. Our results provide new, stronger evidence of expected marked regional differences in Europe under the A2 scenario in terms of precipitation and temperature changes. While we found an overall increase in the mean temperature and extreme values, we also found mixed regional differences for precipitation.

  16. The Chandra COSMOS Legacy Survey: Clustering of X-ray selected AGN at 2.9 ≤ z ≤ 5.5 using photometric redshift Probability Distribution Functions

    CERN Document Server

    Allevato, V; Finoguenov, A; Marchesi, S; Zamorani, G; Hasinger, G; Salvato, M; Miyaji, T; Gilli, R; Cappelluti, N; Brusa, M; Suh, H; Lanzuisi, G; Trakhtenbrot, B; Griffiths, R; Vignali, C; Schawinski, K; Karim, A

    2016-01-01

    We present the measurement of the projected and redshift space 2-point correlation function (2pcf) of the new catalog of Chandra COSMOS-Legacy AGN at 2.9$\\leq$z$\\leq$5.5 ($\\langle L_{bol} \\rangle \\sim$10$^{46}$ erg/s) using the generalized clustering estimator based on phot-z probability distribution functions (Pdfs) in addition to any available spec-z. We model the projected 2pcf estimated using $\\pi_{max}$ = 200 h$^{-1}$ Mpc with the 2-halo term and we derive a bias at z$\\sim$3.4 equal to b = 6.6$^{+0.60}_{-0.55}$, which corresponds to a typical mass of the hosting halos of log M$_h$ = 12.83$^{+0.12}_{-0.11}$ h$^{-1}$ M$_{\\odot}$. A similar bias is derived using the redshift-space 2pcf, modelled including the typical phot-z error $\\sigma_z$ = 0.052 of our sample at z$\\geq$2.9. Once we integrate the projected 2pcf up to $\\pi_{max}$ = 200 h$^{-1}$ Mpc, the bias of XMM and \\textit{Chandra} COSMOS at z=2.8 used in Allevato et al. (2014) is consistent with our results at higher redshift. The results suggest only...

  17. Using RFID and accelerometer-embedded tracers to measure probabilities of bed load transport, step lengths, and rest times in a mountain stream

    Science.gov (United States)

    Olinde, Lindsay; Johnson, Joel P. L.

    2015-09-01

    We present new measurements of bed load tracer transport in a mountain stream over several snowmelt seasons. Cumulative displacements were measured using passive tracers, which consisted of gravel and cobbles embedded with radio frequency identification tags. The timing of bed load motion during 11 transporting events was quantified with active tracers, i.e., accelerometer-embedded cobbles. Probabilities of cobble transport increased with discharge above a threshold, and exhibited slight to moderate hysteresis during snowmelt hydrographs. Dividing cumulative displacements by the number of movements recorded by each active tracer constrained average step lengths. Average step lengths increased with discharge, and distributions of average step lengths and cumulative displacements were thin tailed. Distributions of rest times followed heavy-tailed power law scaling. Rest time scaling varied somewhat with discharge and with the degree to which tracers were incorporated into the streambed. The combination of thin-tailed displacement distributions and heavy-tailed rest time distributions predict superdiffusive dispersion.

  18. Cumulative Risks of Foster Care Placement for Danish Children

    DEFF Research Database (Denmark)

    Fallesen, Peter; Emanuel, Natalia; Wildeman, Christopher

    2014-01-01

    Although recent research suggests that the cumulative risk of foster care placement is far higher for American children than originally suspected, little is known about the cumulative risk of foster care placement in other countries, which makes it difficult to gauge the degree to which factor...... is for Danish children. Results suggest that at the beginning of the study period (in 1998) the cumulative risk of foster care placement for Danish children was roughly in line with the risk for American children. Yet, by the end of the study period (2010), the risk had declined to half the risk for American...... foster care placement is salient in other contexts. In this article, we provide companion estimates to those provided in recent work on the US by using Danish registry data and synthetic cohort life tables to show how high and unequally distributed the cumulative risk of foster care placement...

  19. Mapping cumulative human impacts in the eastern North Sea

    DEFF Research Database (Denmark)

    Stock, A.; Andersen, Jesper; Heinänen, S.

    of the MSFD; and 3) to deepen the understanding of how errors in expert judgment affect the resulting cumulative human impact maps by means of Monte Carlo simulations. We combined existing data sets on the spatial distribution of 33 anthropogenic stressors (linked to the MSFD pressures) and 28 key habitats....... In contrast, the predicted impacts for much of the Norwegian EEZ and areas far offshore were lower. The Monte Carlo simulations confirmed earlier findings that mapping cumulative impacts is generally "robust", but also showed that specific combinations of errors can seriously change local and regional...... on marine ecosystems have only recently been developed. The aims of our study were: 1) to develop a map of cumulative human impacts for the Danish, Swedish, Norwegian and German parts of the Greater North Sea; 2) to adjust the existing methods for mapping cumulative human impacts to fit the requirements...

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  1. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
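
    In the spirit of this record, the sketch below contrasts an empirical tail probability inside the data range with an exponentially weighted extrapolation beyond it. The threshold choice and the exponential tail form are illustrative assumptions, not the paper's estimators.

```python
# Empirical tail probability plus exponential extrapolation beyond
# the data, fitted by maximum likelihood on threshold exceedances.
import numpy as np

rng = np.random.default_rng(9)
x = rng.exponential(2.0, size=500)
u = np.quantile(x, 0.9)                 # assumed tail threshold
exceed = x[x > u] - u
rate = 1.0 / exceed.mean()              # MLE of the exponential tail rate

def tail_prob(q):
    if q <= u:
        return np.mean(x > q)           # empirical estimate in-range
    return np.mean(x > u) * np.exp(-rate * (q - u))   # extrapolation

print(tail_prob(5.0), tail_prob(20.0))  # inside vs beyond the data
```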

  2. Longhi Games, Internal Reservoirs, and Cumulate Porosity

    Science.gov (United States)

    Morse, S. A.

    2009-05-01

    Fe in plagioclase at an early age, T-rollers (or not) on the Di-Trid boundary in Fo-Di-Sil, the mantle solidus, origins of anorthosites, esoteric uses of Schreinemakers rules and many more topics are all fresh and pleasant memories of John Longhi's prolific and creative work. The Fram-Longhi experimental effect of pressure on plagioclase partitioning with liquid in mafic rocks became essential to an understanding of multiphase Rayleigh fractionation of plagioclase in big layered intrusions. Only by using the pressure effect could I find a good equation through the data for the Kiglapait intrusion, and that result among others required the existence with probability 1.0 of an internal reservoir (Morse, JPet 2008). Knowledge of cumulate porosity is a crucial key to the understanding of layered igneous rocks. We seek both the initial (inverse packing fraction) and residual porosity to find the time and process path from sedimentation to solidification. In the Kiglapait Lower Zone we have a robust estimate of mean residual porosity from the modes of the excluded phases augite, oxides, sulfide, and apatite. To this we apply the maximum variance of plagioclase composition (the An range) to find an algorithm that extends through the Upper Zone and to other intrusions. Of great importance is that all these measurements were made in grain mounts concentrated from typically about 200 g of core or hand specimen, hence the represented sample volume is thousands of times greater than for a thin section. The resulting distribution and scatter of the An range is novel and remarkable. It is V-shaped in the logarithmic representation of stratigraphic height, running from about 20 mole % at both ends (base to top of the Layered Series) to near-zero at 99 PCS. The intercept of the porosity-An range relation gives An range = 3.5 % at zero residual porosity. Petrographic analysis reveals that for PCS less than 95 and greater than 99.9, the An range is intrinsic, i.e. pre-cumulus, for

  3. Cumulant expansions for atmospheric flows

    CERN Document Server

    Ait-Chaalal, Farid; Meyer, Bettina; Marston, J B

    2015-01-01

    The equations governing atmospheric flows are nonlinear, and consequently the hierarchy of cumulant equations is not closed. But because atmospheric flows are inhomogeneous and anisotropic, the nonlinearity may manifest itself only weakly through interactions of mean fields with disturbances such as thermals or eddies. In such situations, truncations of the hierarchy of cumulant equations hold promise as a closure strategy. We review how truncations at second order can be used to model and elucidate the dynamics of turbulent atmospheric flows. Two examples are considered. First, we study the growth of a dry convective boundary layer, which is heated from below, leading to turbulent upward energy transport and growth of the boundary layer. We demonstrate that a quasilinear truncation of the equations of motion, in which interactions of disturbances among each other are neglected but interactions with mean fields are taken into account, can successfully capture the growth of the convective boundary layer. Seco...

  4. Electro-cumulation CNF project

    CERN Document Server

    Grishin, V G

    2000-01-01

    bound or free ion current within solid substances; non-plain symmetry; cumulation of the ion interaction. Experimental result: an Ice SuperPolarization. Cold nuclear fusion? At http://www.shortway.to/to2084. Keywords: ion, current, solid, symmetry, cumulation, cold nuclear fusion, polarization, depolarization, ionic conductor, superionic conductor, ice, crystal, strain, V-center, V-centre, doped crystal, interstitial impurity, intrinsic color center, high pressure technology, Bridgman, experiment, crowdion, dielectric, proton, layer, defect, lattice, dynamics, electromigration, mobility, muon catalysis, concentration, doping, dopant, conductivity, pycnonuclear reaction, permittivity, dielectric constant, point defects, interstitials, polarizability, imperfection, defect centers, glass, epitaxy, sodium hydroxide, metallic substrate, crystallization, point, tip, susceptibility, ferroelectric, ordering, force, correlation, collective, shift, distortion, coalescence, crowdions, electrolysis.

  5. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for the prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures, giving...

  6. Multivariate joint probability distribution of droughts in Wei River basin

    Institute of Scientific and Technical Information of China (English)

    马明卫; 宋松柏; 于艺; 张雨; 李扬

    2012-01-01

    This study models the dependence structures of multivariate drought variables using elliptical copulas. Bivariate dependence was estimated with Pearson's classical correlation coefficient γ_n, Spearman's ρ_n and Kendall's τ_n, together with rank scatter plots, Chi-plots and K-plots, while the parameters of the trivariate copulas were estimated with the maximum likelihood method. For selecting the best-fitting copulas, the Akaike information criterion (AIC), Bayesian information criterion (BIC) and RMS error (RMSE) were used, and a bootstrap version of Rosenblatt's transformation was used to test the goodness-of-fit of the Gaussian and Student t copulas. In an application to the Wei River basin for determination of the spatial distribution of drought return periods, the Gaussian copula was selected for modeling the multivariate joint probability distribution of drought duration, drought severity and severity peak. The results show that both the Gaussian and Student t copulas are applicable, but the Gaussian copula gives a better fit. In the basin, prolonged droughts have frequently broken out with rather short return periods, and thus more emphasis should be placed on drought forecasting and management in the future.

  7. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  8. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  9. Moments of the log non-central chi-square distribution

    OpenAIRE

    Pav, Steven E.

    2015-01-01

    The cumulants and moments of the log of the non-central chi-square distribution are derived. For example, the expected log of a chi-square random variable with v degrees of freedom is log(2) + psi(v/2). Applications to modeling probability distributions are discussed.
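
    The quoted formula is easy to check numerically. The sketch below compares a Monte Carlo estimate of E[log X] for a central chi-square variable with v degrees of freedom against log(2) + psi(v/2).

```python
# Verify E[log X] = log(2) + digamma(v/2) for a chi-square(v) variable.
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(2)
v = 5
samples = rng.chisquare(v, size=1_000_000)
print("Monte Carlo:", np.log(samples).mean())
print("formula:   ", np.log(2.0) + digamma(v / 2.0))
```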

  10. The Algebra of the Cumulative Percent Operation.

    Science.gov (United States)

    Berry, Andrew J.

    2002-01-01

    Discusses how to help students avoid some pervasive reasoning errors in solving cumulative percent problems. Discusses the meaning of "a% + b%," the additive inverse of "a%," and other useful applications. Emphasizes the operational aspect of the cumulative percent concept. (KHR)
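
    One plausible reading of the cumulative percent operation discussed in this record (an interpretation on our part, since the abstract does not spell out the algebra): successive percent changes compose as a + b + ab/100, which also yields the additive inverse mentioned above.

```python
# Illustrative algebra of cumulative percent changes (our reading, not
# necessarily the article's notation).
def cum_percent(a: float, b: float) -> float:
    """Percent change equivalent to applying a% and then b%."""
    return a + b + a * b / 100.0

def inverse_percent(a: float) -> float:
    """Percent change x with cum_percent(a, x) == 0."""
    return -100.0 * a / (100.0 + a)

print(cum_percent(10, 20))                    # 32.0, not 30
print(cum_percent(25, inverse_percent(25)))   # 0.0: a 20% drop undoes +25%
```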

  11. Adaptive strategies for cumulative cultural learning.

    Science.gov (United States)

    Ehn, Micael; Laland, Kevin

    2012-05-21

    The demographic and ecological success of our species is frequently attributed to our capacity for cumulative culture. However, it is not yet known how humans combine social and asocial learning to generate effective strategies for learning in a cumulative cultural context. Here we explore how cumulative culture influences the relative merits of various pure and conditional learning strategies, including pure asocial and social learning, critical social learning, conditional social learning and individual refiner strategies. We replicate Rogers' paradox in the cumulative setting. However, our analysis suggests that strategies that resolved Rogers' paradox in a non-cumulative setting may not necessarily evolve in a cumulative setting; thus, different strategies will optimize cumulative and non-cumulative cultural learning.

  12. "Buddha's Light" of Cumulative Particles

    CERN Document Server

    Kopeliovich, Vladimir B; Potashnikova, Irina K

    2014-01-01

    We show analytically that in cumulative particle production off nuclei, multiple interactions lead to a glory-like backward-focusing effect. Employing the small-phase-space method, we arrive at a characteristic angular dependence of the production cross section $d\sigma \sim 1/ \sqrt {\pi - \theta}$ near the strictly backward direction. This effect takes place for any number $n\geq 3$ of interactions of the rescattered particle, either elastic or inelastic (with resonance excitations in intermediate states), when the final particle is produced near the corresponding kinematical boundary. Such a behaviour of the cross section near the backward direction is in qualitative agreement with some of the available data.

  13. A Resource Cost Aware Cumulative

    Science.gov (United States)

    Simonis, Helmut; Hadzic, Tarik

    We motivate and introduce an extension of the well-known cumulative constraint which deals with time- and volume-dependent cost of resources. Our research is primarily concerned with scheduling problems under time- and volume-variable electricity costs, but the constraint applies equally to manpower scheduling when hourly rates differ over time and/or extra personnel incur higher hourly rates. We present a number of possible lower bounds on the cost, including a min-cost flow, different LP and MIP models, as well as greedy algorithms, and provide a theoretical and experimental comparison of the different methods.
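
    A hedged sketch of the simplest kind of bound mentioned above: relax the problem by ignoring task windows and buy the required resource-time in the cheapest periods first. Any feasible schedule pays at least this much, so the value is a valid lower bound (names and data are ours; the paper's flow-, LP- and MIP-based bounds are stronger).

    def greedy_cost_lower_bound(total_units, period_costs, capacity):
        # Fill the cheapest periods first, up to the resource capacity per period.
        cost, remaining = 0.0, total_units
        for unit_cost in sorted(period_costs):
            take = min(capacity, remaining)
            cost += take * unit_cost
            remaining -= take
            if remaining <= 0:
                break
        return cost

    # Tasks needing 4 + 6 + 5 = 15 units of resource-time; hourly prices vary.
    print(greedy_cost_lower_bound(15, [3.0, 1.0, 2.0, 5.0, 1.5], capacity=4))  # 27.0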

  14. Spline Histogram Method for Reconstruction of Probability Density Functions of Clusters of Galaxies

    Science.gov (United States)

    Docenko, Dmitrijs; Berzins, Karlis

    We describe the spline histogram algorithm, which is useful for visualization of the probability density function when setting up a statistical hypothesis for a test. The spline histogram is constructed from discrete data measurements using tensioned cubic spline interpolation of the cumulative distribution function, which is then differentiated and smoothed using the Savitzky-Golay filter. The optimal width of the filter is determined by minimization of the Integrated Square Error function. The current distribution of the TCSplin algorithm, written in f77 with IDL and Gnuplot visualization scripts, is available from www.virac.lv/en/soft.html.

  15. Spline histogram method for reconstruction of probability density function of clusters of galaxies

    CERN Document Server

    Docenko, D; Docenko, Dmitrijs; Berzins, Karlis

    2003-01-01

    We describe the spline histogram algorithm, which is useful for visualization of the probability density function when setting up a statistical hypothesis for a test. The spline histogram is constructed from discrete data measurements using tensioned cubic spline interpolation of the cumulative distribution function, which is then differentiated and smoothed using the Savitzky-Golay filter. The optimal width of the filter is determined by minimization of the Integrated Square Error function. The current distribution of the TCSplin algorithm, written in f77 with IDL and Gnuplot visualization scripts, is available from http://www.virac.lv/en/soft.html
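
    A simplified sketch of the pipeline described in both records, using an ordinary (untensioned) cubic spline and a fixed Savitzky-Golay window instead of the ISE-optimized width, so it approximates the method rather than ports TCSplin:

    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import savgol_filter

    def spline_histogram(data, grid_size=200, window=21, polyorder=3):
        # Interpolate the empirical CDF, differentiate it, then smooth the
        # resulting density estimate. Assumes distinct (continuous) data values.
        x = np.sort(np.asarray(data))
        cdf = (np.arange(1, len(x) + 1) - 0.5) / len(x)
        spline = CubicSpline(x, cdf)
        grid = np.linspace(x[0], x[-1], grid_size)
        pdf = savgol_filter(spline(grid, 1), window, polyorder)
        return grid, np.clip(pdf, 0, None)    # clip small negative wiggles

    rng = np.random.default_rng(2)
    grid, pdf = spline_histogram(rng.normal(size=500))
    print(np.trapz(pdf, grid))   # should be close to 1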

  16. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    Energy Technology Data Exchange (ETDEWEB)

    Shiinoki, T; Hanazawa, H; Park, S; Takahashi, T; Shibuya, K [Yamaguchi University, Ube, Yamaguchi (Japan); Kawamura, S; Uehara, T; Yuasa, Y; Koike, M [Yamaguchi University Hospital, Ube, Yamaguchi (Japan)

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients with fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and the tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated with the patient’s mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
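
    The reproducibility index R_n is a Kullback-Leibler divergence between binned motion PDFs; a generic sketch with synthetic data (our own binning and regularization, not the in-house software):

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # KL divergence between two motion PDFs given as histograms over the
        # same position bins; eps regularizes empty bins.
        p = np.asarray(p, float) + eps
        q = np.asarray(q, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    # Hypothetical SI-direction marker positions (mm), fraction 1 vs. fraction n.
    rng = np.random.default_rng(3)
    bins = np.linspace(-15, 15, 61)
    pdf1, _ = np.histogram(rng.normal(0, 4, 2000), bins=bins)
    pdfn, _ = np.histogram(rng.normal(1, 5, 2000), bins=bins)
    print(kl_divergence(pdf1, pdfn))   # R_n for this pair of fractions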

  17. The Art of Probability Assignment

    CERN Document Server

    Dimitrov, Vesselin I

    2012-01-01

    The problem of assigning probabilities when little is known is analyzed in the case where the quantities of interest are physical observables, i.e. can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Rényi distance between the original and the shifted distribution.

  18. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  19. The study of the (α, α'f) reaction at 120 MeV on 232Th and 238U (II) : Fission barrier properties deduced from fission probabilities and angular distributions

    NARCIS (Netherlands)

    Plicht, J. van der; Harakeh, M.N.; van der Woude, Adriaan; David, P.; Debrus, J.; Janszen, H.; Schulze, J.

    1981-01-01

    The fission probabilities and angular distributions of the fission fragments for the (α, α'f) reaction on 232Th and 238U at a bombarding energy of 120 MeV have been measured from about 4 to 14 MeV excitation energy. Evidence for sub-barrier resonances has been found, the negative parity ones occurri

  20. The study of the (α, α’f) reaction at 120 MeV on 232Th and 238U (I) : Fission probabilities and angular distributions in the region of the giant quadrupole resonances

    NARCIS (Netherlands)

    Plicht, J. van der; Harakeh, M.N.; van der Woude, Adriaan; David, P.; Debrus, J.; Janszen, H.; Schulze, J.

    1980-01-01

    The fission decay channel of 232Th and 238U has been investigated, using the (α, α’f) reaction at 120 MeV bombarding energy. The angular distributions of the fission fragments and the fission probabilities up to around 15 MeV excitation have been measured. No evidence for the fission decay of the gi

  2. A paradox of cumulative culture.

    Science.gov (United States)

    Kobayashi, Yutaka; Wakano, Joe Yuichiro; Ohtsuki, Hisashi

    2015-08-21

    Culture can grow cumulatively if socially learnt behaviors are improved by individual learning before being passed on to the next generation. Previous authors showed that this kind of learning strategy is unlikely to be evolutionarily stable in the presence of a trade-off between learning and reproduction, because in their model culture is a public good that is freely exploited by any member of the population (a cultural social dilemma). In this paper, we investigate the effect of vertical transmission (transmission from parents to offspring), which decreases the publicness of culture, on the evolution of cumulative culture in both infinite and finite population models. In the infinite population model, we confirm that culture accumulates to a large degree as long as transmission is purely vertical. It turns out, however, that the introduction of even slight oblique transmission drastically reduces the equilibrium level of culture. Even more surprisingly, if the population size is finite, culture hardly accumulates even under purely vertical transmission. This occurs because stochastic extinction due to random genetic drift prevents a learning strategy from accumulating enough culture. Overall, our theoretical results suggest that introducing vertical transmission alone does not really help solve the cultural social dilemma problem. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  6. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  7. Cumulant dynamics in a finite population linkage equilibrium theory

    CERN Document Server

    Rattray, M; Rattray, Magnus; Shapiro, Jonathan L.

    1999-01-01

    The evolution of a finite population at linkage equilibrium is described in terms of the dynamics of phenotype distribution cumulants. This provides a powerful method for describing evolutionary transients, and we elucidate the relationship between the cumulant dynamics and the diffusion approximation. A separation of time-scales between the first and higher cumulants for low mutation rates is demonstrated in the diffusion limit and provides a significant simplification of the dynamical system. However, the diffusion limit may not be appropriate for strong selection, as the standard Fisher-Wright model of genetic drift can break down in this case. Two novel examples of this effect are considered: we show that the dynamics may depend on the number of loci under strong directional selection, and that environmental variance results in a reduced effective population size. We also consider a simple model of a changing environment which cannot be described by a diffusion equation, and we derive the optimal mutation ra...

  8. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  9. Probability boxes on totally preordered spaces for multivariate modelling

    CERN Document Server

    Troffaes, Matthias C M; 10.1016/j.ijar.2011.02.001

    2011-01-01

    A pair of lower and upper cumulative distribution functions, also called a probability box or p-box, is among the most popular models used in imprecise probability theory. They arise naturally in expert elicitation, for instance in cases where bounds are specified on the quantiles of a random variable, or when quantiles are specified only at a finite number of points. Many practical and formal results concerning p-boxes already exist in the literature. In this paper, we provide new efficient tools to construct multivariate p-boxes and develop algorithms to draw inferences from them. For this purpose, we formalise and extend the theory of p-boxes using Walley's behavioural theory of imprecise probabilities, and rely heavily on its notion of natural extension and existing results about independence modeling. In particular, we allow p-boxes to be defined on arbitrary totally preordered spaces, thereby also admitting multivariate p-boxes via probability bounds over any collection of nested sets. We focus on t...
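
    A minimal sketch of the p-box object itself, leaving aside the paper's natural-extension machinery: store pointwise lower and upper CDF envelopes and read off probability bounds (the class, names, and elicitation example are ours):

    import numpy as np

    class PBox:
        # Pointwise lower and upper bounds on an unknown CDF over a fixed grid.
        def __init__(self, grid, lower_cdf, upper_cdf):
            self.grid = np.asarray(grid)
            self.lower = np.asarray(lower_cdf)
            self.upper = np.asarray(upper_cdf)
            assert np.all(self.lower <= self.upper)

        def prob_bounds(self, x):
            # Bounds on P(X <= x) valid for every distribution inside the box.
            return (np.interp(x, self.grid, self.lower),
                    np.interp(x, self.grid, self.upper))

    # Elicitation example: X lies in [0, 10] and its median lies in [2, 5].
    grid = np.linspace(0, 10, 101)
    lower = np.where(grid < 5, 0.0, 0.5); lower[-1] = 1.0
    upper = np.where(grid < 2, 0.5, 1.0)
    box = PBox(grid, lower, upper)
    print(box.prob_bounds(6.0))   # bounds on P(X <= 6), here (0.5, 1.0)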

  10. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  11. Cumulative risks of foster care placement for Danish children.

    Science.gov (United States)

    Fallesen, Peter; Emanuel, Natalia; Wildeman, Christopher

    2014-01-01

    Although recent research suggests that the cumulative risk of foster care placement is far higher for American children than originally suspected, little is known about the cumulative risk of foster care placement in other countries, which makes it difficult to gauge the degree to which foster care placement is salient in other contexts. In this article, we provide companion estimates to those in recent work on the US by using Danish registry data and synthetic cohort life tables to show how high and how unequally distributed the cumulative risk of foster care placement is for Danish children. Results suggest that at the beginning of the study period (in 1998) the cumulative risk of foster care placement for Danish children was roughly in line with the risk for American children. Yet, by the end of the study period (2010), the risk had declined to half the risk for American children. Our results also show some variation by parental ethnicity and sex, but these differences are small; indeed, they appear quite muted relative to racial/ethnic differences in these risks in the United States. Last, though cumulative risks are similar between Danish and American children (especially at the beginning of the study period), the age-specific risk profiles are markedly different, with higher risks for older Danish children than for older American children.

  12. Estimating probability curves of rock variables using orthogonal polynomials and sample moments

    Institute of Scientific and Technical Information of China (English)

    DENG Jian; BIAN Li

    2005-01-01

    A new algorithm using orthogonal polynomials and sample moments is presented for estimating probability curves directly from experimental or field data on rock variables. The moments estimated directly from a sample of observed values of a random variable can be either conventional moments (moments about the origin or central moments) or probability-weighted moments (PWMs). Probability curves derived from orthogonal polynomials and conventional moments are probability density functions (PDFs), while probability curves derived from orthogonal polynomials and PWMs are inverse cumulative distribution functions (CDFs) of random variables. The proposed approach is verified against the two most commonly used theoretical standard distributions, the normal and the exponential. Examples from observed data on the uniaxial compressive strength of a rock, and from concrete strength data, are presented for illustrative purposes. The results show that probability curves of rock variables can be accurately derived from orthogonal polynomials and sample moments. Orthogonal polynomials and PWMs enable more secure inferences to be made from relatively small samples about an underlying probability curve.
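
    One concrete instance of the orthogonal-polynomials-plus-moments idea is the Gram-Charlier A expansion in (probabilists') Hermite polynomials; the sketch below is our illustrative assumption, not the paper's exact polynomial system or its PWM-based inverse CDFs:

    import numpy as np

    def gram_charlier_pdf(sample, x):
        # Four-moment PDF estimate: a normal density corrected by Hermite
        # polynomial terms weighted by the sample skewness and excess kurtosis.
        mu, sigma = sample.mean(), sample.std()
        z = (x - mu) / sigma
        zs = (sample - mu) / sigma
        skew, exkurt = np.mean(zs**3), np.mean(zs**4) - 3.0
        he3 = z**3 - 3 * z
        he4 = z**4 - 6 * z**2 + 3
        phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)
        return phi * (1 + skew * he3 / 6 + exkurt * he4 / 24) / sigma

    # Hypothetical uniaxial compressive strength data (MPa).
    rng = np.random.default_rng(4)
    strengths = rng.lognormal(mean=4.0, sigma=0.15, size=300)
    x = np.linspace(strengths.min(), strengths.max(), 200)
    print(np.trapz(gram_charlier_pdf(strengths, x), x))   # roughly 1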

  13. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.

  14. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  15. Occurrence Probability of Large Solar Energetic Particle Events: Assessment from Data on Cosmogenic Radionuclides in Lunar Rocks

    CERN Document Server

    Kovaltsov, Gennady A

    2014-01-01

    We revisited assessments of the occurrence probability distribution of large solar energetic particle (SEP) events, based on measurements of cosmogenic radionuclides in lunar rocks. We present a combined cumulative occurrence probability distribution of SEP events based on three time scales: directly measured SEP fluences for the last 60 years; estimates based on the terrestrial cosmogenic radionuclides 10Be and 14C for the multi-millennial (Holocene) time scale; and cosmogenic radionuclides measured in lunar rocks on time scales of up to 1 Myr. All three time scales yield a consistent distribution. The data suggest a strong rollover of the occurrence probability, so that SEP events with a fluence of >30 MeV protons greater than 10^{11} protons/cm^2/yr are not expected on the Myr time scale.

  16. Probability distribution of intersymbol distances in random symbolic sequences: Applications to improving detection of keywords in texts and of amino acid clustering in proteins

    Science.gov (United States)

    Carpena, Pedro; Bernaola-Galván, Pedro A.; Carretero-Campos, Concepción; Coronado, Ana V.

    2016-11-01

    Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.
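
    The finite-size effect that the exact result captures is easy to see by simulation: place exactly n occurrences uniformly at random in a sequence of length N and compare the observed gap distribution with the naive geometric law (a sketch; the paper's closed form is not reproduced here):

    import numpy as np

    rng = np.random.default_rng(5)

    def gap_distribution(N, n, trials=20000):
        # Distances between consecutive occurrences when exactly n occurrences
        # are placed uniformly in a sequence of length N.
        gaps = [np.diff(np.sort(rng.choice(N, size=n, replace=False)))
                for _ in range(trials)]
        return np.concatenate(gaps)

    gaps = gap_distribution(N=50, n=10)
    k = np.arange(1, 11)
    geom = (1 - 10 / 50) ** (k - 1) * (10 / 50)            # geometric, p = n/N
    emp = np.array([(gaps == kk).mean() for kk in k])
    print(np.round(emp, 3))
    print(np.round(geom, 3))   # deviations appear even at small k and grow in the tail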

  18. Cumulative Advantage and Success-Breeds-Success: The Value of Time Pattern Analysis.

    Science.gov (United States)

    Huber, John C.

    1998-01-01

    For the case of the distribution of papers across authors, the Success-Breeds-Success or Cumulative Advantage model is a popular candidate for informetrics. The method of time pattern of publication for individual authors can be used to discriminate between Cumulative Advantage and non-uniform giftedness models. The non-uniform giftedness model is…

  19. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
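
    The headline effect is easy to reproduce for the normal (location-scale) case: if the threshold is set at a quantile of a distribution fitted to n past observations, the expected failure frequency exceeds the nominal level, independently of the true parameters (a sketch with our own parameter choices):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)

    def expected_failure_prob(n=20, nominal=0.01, trials=100_000):
        # Threshold = (1 - nominal) quantile of a normal fitted to n observations;
        # average the true exceedance probability over the estimation error.
        z = stats.norm.ppf(1 - nominal)
        data = rng.normal(0.0, 1.0, size=(trials, n))
        thresholds = data.mean(axis=1) + z * data.std(axis=1, ddof=1)
        return float((1 - stats.norm.cdf(thresholds)).mean())

    print(expected_failure_prob())   # about 0.017-0.018, well above the nominal 0.01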

  20. Measuring a fair and ambitious climate agreement using cumulative emissions

    Science.gov (United States)

    Peters, Glen P.; Andrew, Robbie M.; Solomon, Susan; Friedlingstein, Pierre

    2015-10-01

    Policy makers have called for a ‘fair and ambitious’ global climate agreement. Scientific constraints, such as the allowable carbon emissions to avoid exceeding a 2 °C global warming limit with 66% probability, can help define ambitious approaches to climate targets. However, fairly sharing the mitigation challenge to meet a global target involves human values rather than just scientific facts. We develop a framework based on cumulative emissions of carbon dioxide to compare the consistency of countries’ current emission pledges to the ambition of keeping global temperatures below 2 °C, and, further, compare two alternative methods of sharing the remaining emission allowance. We focus on the recent pledges and other official statements of the EU, USA, and China. The EU and US pledges are close to a 2 °C level of ambition only if the remaining emission allowance is distributed based on current emission shares, which is unlikely to be viewed as ‘fair and ambitious’ by others who presently emit less. China’s stated emissions target also differs from measures of global fairness, owing to emissions that continue to grow into the 2020s. We find that, combined, the EU, US, and Chinese pledges leave little room for other countries to emit CO2 if a 2 °C limit is the objective, essentially requiring all other countries to move towards per capita emissions 7 to 14 times lower than the EU, USA, or China by 2030. We argue that a fair and ambitious agreement for a 2 °C limit that would be globally inclusive and effective in the long term will require stronger mitigation than the goals currently proposed. Given such necessary and unprecedented mitigation and the current lack of availability of some key technologies, we suggest a new diplomatic effort directed at ensuring that the necessary technologies become available in the near future.