WorldWideScience

Sample records for national random probability

  1. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample surveys or statistical surveys), sampling sites are selected randomly.

  2. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  3. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app...

  4. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach. Random Phenomena, Variability, and Uncertainty: Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence. Random Variables and Distributions: Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions. Multidimensional Random Variables: Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables. Random Variable Transformations: Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations. Application Case Studies I: Probability: Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  5. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t...

  6. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  7. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  8. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logical organization, with writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  9. The extinction probability in systems randomly varying in time

    Directory of Open Access Journals (Sweden)

    Imre Pázsit

    2017-09-01

    The extinction probability of a branching process (a neutron chain in a multiplying medium) is calculated for a system randomly varying in time. The evolution of the first two moments of such a process was calculated previously by the authors in a system randomly shifting between two states of different multiplication properties. The same model is used here for the investigation of the extinction probability. It is seen that the determination of the extinction probability is significantly more complicated than that of the moments, and it can only be achieved by pure numerical methods. The numerical results indicate that for systems fluctuating between two subcritical or two supercritical states, the extinction probability behaves as expected, but for systems fluctuating between a supercritical and a subcritical state, there is a crucial and unexpected deviation from the predicted behaviour. The results bear some significance not only for neutron chains in a multiplying medium, but also for the evolution of biological populations in a time-varying environment.
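
    As a numerical aside (not the paper's randomly switching model, which the authors treat purely numerically): in a fixed environment, the extinction probability of a branching process is the smallest fixed point of the offspring probability generating function, found by simple iteration. The binary offspring law below, mimicking capture versus fission, is an illustrative assumption.

        # Minimal sketch, assuming a fixed-environment Galton-Watson process with
        # offspring PGF f(q) = (1 - p) + p*q**2 (0 or 2 descendants). Iterating
        # q <- f(q) from q = 0 converges to the smallest root, i.e. the
        # extinction probability.
        def extinction_probability(p_branch, tol=1e-12, max_iter=10_000):
            q = 0.0
            for _ in range(max_iter):
                q_next = (1.0 - p_branch) + p_branch * q * q
                if abs(q_next - q) < tol:
                    return q_next
                q = q_next
            return q

        for p in (0.4, 0.5, 0.6):  # mean offspring 2p: sub-, exactly, and supercritical
            print(f"p = {p}: extinction probability ~ {extinction_probability(p):.6f}")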

  10. Non-equilibrium random matrix theory. Transition probabilities

    International Nuclear Information System (INIS)

    Pedro, Francisco Gil; Westphal, Alexander

    2016-06-01

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  11. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  12. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    Science.gov (United States)

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized by three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for the potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, which are not substantively meaningful in themselves, to conditional effects on the predicted probabilities. The empirical illustration uses longitudinal data from the Asset and Health Dynamics among the Oldest Old study. Our analysis compared three sets of predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglecting to retransform random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
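
    The retransformation issue the authors address can be seen in a few lines: with a random effect in a multinomial logit, population-averaged probabilities are the average of the inverse (softmax) link over the random-effect distribution, not the link evaluated at the zero random effect. All numbers below are toy assumptions, not estimates from the cited study.

        import numpy as np

        rng = np.random.default_rng(42)

        def softmax(eta):
            e = np.exp(eta - eta.max(axis=-1, keepdims=True))
            return e / e.sum(axis=-1, keepdims=True)

        eta = np.array([0.0, 0.8, -0.5])   # linear predictors for 3 health states (toy)
        sigma = 1.2                        # SD of subject-level random intercept (toy)
        u = rng.normal(0.0, sigma, size=50_000)

        # Naive "plug-in": random effect fixed at its mean of zero.
        plug_in = softmax(eta)

        # Retransformed: average the softmax over the random-effect distribution.
        # Here u is assumed to enter the predictor of state 2 only, for illustration.
        etas = eta + np.outer(u, np.array([0.0, 1.0, 0.0]))
        averaged = softmax(etas).mean(axis=0)

        print("plug-in      :", np.round(plug_in, 4))
        print("retransformed:", np.round(averaged, 4))  # differs, by Jensen's inequality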

  13. Age replacement policy based on imperfect repair with random probability

    International Nuclear Information System (INIS)

    Lim, J.H.; Qu, Jian; Zuo, Ming J.

    2016-01-01

    In much of the literature on age replacement policies, failures before the planned replacement age can be either minimally repaired or perfectly repaired, depending on the type of failure, the cost of repairs, and so on. In this paper, we propose an age replacement policy based on imperfect repair with random probability. The proposed policy incorporates the case in which an intermittent failure can be either minimally repaired or perfectly repaired with random probabilities. Mathematical formulas for the expected cost rate per unit time are derived for both the infinite-horizon case and the one-replacement-cycle case. For each case, we show that the optimal replacement age exists and is finite. - Highlights: • We propose a new age replacement policy with random probability of perfect repair. • We develop the expected cost per unit time. • We discuss the optimal age for replacement minimizing the expected cost rate.

  14. People's Intuitions about Randomness and Probability: An Empirical Study

    Science.gov (United States)

    Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques

    2006-01-01

    What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…

  15. Stationary Probability and First-Passage Time of Biased Random Walk

    International Nuclear Information System (INIS)

    Li Jing-Wen; Tang Shen-Li; Xu Xin-Ping

    2016-01-01

    In this paper, we consider the stationary probability and first-passage time of a biased random walk on a 1D chain, where at each step the walker moves to the left and right with probabilities p and q, respectively (0 ⩽ p, q ⩽ 1, p + q = 1). We derive exact analytical results for the stationary probability and first-passage time as a function of p and q for the first time. Our results suggest that the first-passage time shows a double power-law F ∼ (N − 1)^γ, where the exponent γ = 2 for N < |p − q|^(−1) and γ = 1 for N > |p − q|^(−1). Our study sheds useful light on the biased random-walk process. (paper)
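
    A quick Monte Carlo companion to this record (an illustrative sketch; the reflecting left boundary is an assumption, not necessarily the paper's exact convention) that can be used to eyeball the crossover between the two power-law regimes:

        import random

        def first_passage_time(N, p_right, rng):
            """Steps until a walker starting at site 0 first reaches site N - 1."""
            pos, steps = 0, 0
            while pos < N - 1:
                pos += 1 if rng.random() < p_right else -1
                pos = max(pos, 0)  # reflecting left boundary (assumption)
                steps += 1
            return steps

        rng = random.Random(1)
        p_right = 0.55            # bias; |p - q|**-1 = 10, so the crossover sits near N = 10
        for N in (8, 16, 32, 64):
            trials = 1000
            mean_fpt = sum(first_passage_time(N, p_right, rng) for _ in range(trials)) / trials
            print(f"N = {N:3d}: mean first-passage time ~ {mean_fpt:.1f}")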

  16. Dynamic probability of reinforcement for cooperation: Random game termination in the centipede game.

    Science.gov (United States)

    Krockow, Eva M; Colman, Andrew M; Pulford, Briony D

    2018-03-01

    Experimental games have previously been used to study principles of human interaction. Many such games are characterized by iterated or repeated designs that model dynamic relationships, including reciprocal cooperation. To enable the study of infinite game repetitions and to avoid endgame effects of lower cooperation toward the final game round, investigators have introduced random termination rules. This study extends previous research that has focused narrowly on repeated Prisoner's Dilemma games by conducting a controlled experiment of two-player, random termination Centipede games involving probabilistic reinforcement and characterized by the longest decision sequences reported in the empirical literature to date (24 decision nodes). Specifically, we assessed mean exit points and cooperation rates, and compared the effects of four different termination rules: no random game termination, random game termination with constant termination probability, random game termination with increasing termination probability, and random game termination with decreasing termination probability. We found that although mean exit points were lower for games with shorter expected game lengths, the subjects' cooperativeness was significantly reduced only in the most extreme condition with decreasing computer termination probability and an expected game length of two decision nodes. © 2018 Society for the Experimental Analysis of Behavior.

  17. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
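
    The spreadsheet exercise described here translates directly into code; a minimal Python equivalent (n, p and the trial count are arbitrary choices) compares simulated frequencies with the exact binomial PMF:

        import random
        from math import comb

        n, p, trials = 10, 0.3, 100_000
        rng = random.Random(0)

        counts = [0] * (n + 1)
        for _ in range(trials):
            successes = sum(rng.random() < p for _ in range(n))  # one binomial draw
            counts[successes] += 1

        for k in range(n + 1):
            exact = comb(n, k) * p**k * (1 - p)**(n - k)
            print(f"k={k:2d}  simulated={counts[k] / trials:.4f}  exact={exact:.4f}")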

  18. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  19. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM

  20. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more. The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim...

  1. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    Science.gov (United States)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x, y, z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface, it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  2. Some Limit Properties of Random Transition Probability for Second-Order Nonhomogeneous Markov Chains Indexed by a Tree

    Directory of Open Access Journals (Sweden)

    Shi Zhiyan

    2009-01-01

    We study some limit properties of the harmonic mean of random transition probability for a second-order nonhomogeneous Markov chain and a nonhomogeneous Markov chain indexed by a tree. As a corollary, we obtain the property of the harmonic mean of random transition probability for a nonhomogeneous Markov chain.

  3. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.
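
    For readers wanting a baseline against which such integral-equation approximations can be checked, a first-passage probability can also be brute-forced. The sketch below (a white-noise-driven linear oscillator integrated by Euler-Maruyama; every parameter is an illustrative assumption, not the report's example) estimates the probability of crossing a barrier b within time T:

        import numpy as np

        rng = np.random.default_rng(7)
        omega0, zeta, D = 2.0 * np.pi, 0.05, 1.0   # natural frequency, damping, noise level
        b, T, dt, n_paths = 0.4, 10.0, 1e-3, 5000  # barrier roughly 2 standard deviations

        x = np.zeros(n_paths)
        v = np.zeros(n_paths)
        alive = np.ones(n_paths, dtype=bool)       # paths with no out-crossing so far
        for _ in range(int(T / dt)):
            dW = rng.normal(0.0, np.sqrt(dt), n_paths)
            x_new = x + v * dt
            v += (-2 * zeta * omega0 * v - omega0**2 * x) * dt + np.sqrt(2 * D) * dW
            x = x_new
            alive &= x <= b                        # a path "fails" on first out-crossing
        print(f"estimated first-passage probability ~ {1 - alive.mean():.3f}")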

  4. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas eMaziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and several ongoing projects aimed at its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
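
    For orientation, NumPy analogues of three of the generators the article describes (a sketch assuming standard constructions such as the QR-based Haar unitary; this is not the authors' Fortran code):

        import numpy as np

        rng = np.random.default_rng(0)

        def random_probability_vector(d):
            # Normalized exponentials yield a flat-Dirichlet probability vector.
            w = rng.exponential(1.0, d)
            return w / w.sum()

        def random_unitary(d):
            # QR of a complex Ginibre matrix, with the phase fix making it Haar.
            z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
            q, r = np.linalg.qr(z)
            return q * (np.diag(r) / np.abs(np.diag(r)))

        def random_pure_state(d):
            psi = rng.normal(size=d) + 1j * rng.normal(size=d)
            return psi / np.linalg.norm(psi)

        p = random_probability_vector(4)
        U = random_unitary(3)
        print("probability vector:", np.round(p, 3), "| sum =", p.sum())
        print("unitarity check   :", np.allclose(U @ U.conj().T, np.eye(3)))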

  5. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula can be effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured through just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach makes it possible to carry out dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulence wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted so as to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
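
    For context, the original spectral representation (OSR) that the paper reduces can be sketched in a few lines: a stationary Gaussian sample function is a cosine series with independent random phases. The target spectrum below is an arbitrary illustration, and the paper's key step, replacing the many phases with random functions of two elementary variables, is not reproduced.

        import numpy as np

        rng = np.random.default_rng(3)
        S = lambda w: 1.0 / (1.0 + w**4)          # one-sided PSD (illustrative choice)
        w_max, N = 8.0, 512
        dw = w_max / N
        w_k = (np.arange(N) + 0.5) * dw
        phi = rng.uniform(0.0, 2.0 * np.pi, N)    # the OSR's i.i.d. random phases

        t = np.linspace(0.0, 50.0, 2001)
        # X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k)
        X = np.sqrt(2.0 * S(w_k) * dw) @ np.cos(np.outer(w_k, t) + phi[:, None])
        print("sample variance ~", X.var().round(3))
        print("target variance =", (S(w_k) * dw).sum().round(3))  # integral of the PSD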

  6. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can achieve synchronization under partly unknown transition probabilities. Besides, a multiple integral approach is proposed to strengthen the analysis of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delay, and the impulsive synchronization criteria are then obtained as a set of solvable linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  8. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  9. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  10. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important for understanding the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of the vehicles ahead, and his decision-making process is related to this potential. Compared with the traditional cellular automata model, the proposed model better captures the driver's random decision-making process based on the vehicle and traffic situations ahead of him in actual traffic. From the improved model, the fundamental diagram (flow-density relationship) is obtained, and detailed high-density traffic phenomena are reproduced through numerical simulation. (paper)
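
    A minimal sketch of the idea, assuming a Nagel-Schreckenberg-style update in which the randomization probability depends on the gap to the vehicle ahead as a stand-in for the paper's interactional potential (whose exact form is not reproduced here):

        import random

        L, N, V_MAX, STEPS = 100, 20, 5, 200
        rng = random.Random(0)
        pos = sorted(rng.sample(range(L), N))   # vehicle cells on a ring road
        vel = [0] * N

        def p_random(gap):
            # Assumption: stronger interaction (smaller gap) -> more random braking.
            return 0.4 if gap <= 2 else 0.1

        flow = 0
        for _ in range(STEPS):
            gaps = [(pos[(i + 1) % N] - pos[i] - 1) % L for i in range(N)]
            for i in range(N):
                vel[i] = min(vel[i] + 1, V_MAX, gaps[i])   # accelerate, avoid collision
                if rng.random() < p_random(gaps[i]):       # variable randomization step
                    vel[i] = max(vel[i] - 1, 0)
            pos = [(pos[i] + vel[i]) % L for i in range(N)]
            flow += sum(vel)
        print(f"density = {N / L:.2f}, mean flow = {flow / STEPS / L:.3f}")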

  11. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector...

  12. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
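
    The authors point to R packages; as an assumed Python analogue (not the paper's code), scikit-learn's random forest exposes exactly this "probability machine" usage through predict_proba:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for a diagnosis data set (illustrative only).
        X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        rf = RandomForestClassifier(n_estimators=500, random_state=0)
        rf.fit(X_tr, y_tr)

        proba = rf.predict_proba(X_te)[:, 1]   # individual probabilities, not labels
        print("first five estimated risks:", np.round(proba[:5], 3))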

  13. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  14. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  15. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE. Sets, Events, and Probability: The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language. Finite Processes: The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities. Discrete Random Variables: The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle. General Random Variable...

  16. An extended car-following model considering random safety distance with different probabilities

    Science.gov (United States)

    Wang, Jufeng; Sun, Fengxin; Cheng, Rongjun; Ge, Hongxia; Wei, Qi

    2018-02-01

    Because of differences in vehicle type or driving skill, driving strategies are not exactly the same, and the driving speeds of different vehicles may differ for the same headway. Since the optimal velocity function is determined by the safety distance in addition to the maximum velocity and headway, an extended car-following model accounting for a random safety distance with different probabilities is proposed in this paper. The linear stability condition for this extended traffic model is obtained by using linear stability theory. Numerical simulations are carried out to explore the complex phenomena resulting from multiple safety distances in the optimal velocity function. The cases of multiple types of safety distances selected with different probabilities are presented. Numerical results show that traffic flow with multiple safety distances selected with different probabilities is more unstable than that with a single type of safety distance, and results in more stop-and-go phenomena.
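
    A sketch of the model's key ingredient, assuming the common tanh form of the optimal velocity function and arbitrary numbers (the paper's calibration may differ): the safety distance entering V(h) is drawn from a small set with prescribed probabilities.

        import numpy as np

        rng = np.random.default_rng(5)
        v_max = 30.0
        safety_choices = np.array([20.0, 25.0, 30.0])  # candidate safety distances (m)
        safety_probs = np.array([0.3, 0.5, 0.2])       # their selection probabilities

        def optimal_velocity(headway, h_safe):
            # V(h) = (v_max / 2) * [tanh(h - h_safe) + tanh(h_safe)]  (assumed form)
            return 0.5 * v_max * (np.tanh(headway - h_safe) + np.tanh(h_safe))

        h = 24.0                                       # one fixed headway (m)
        h_safe = rng.choice(safety_choices, p=safety_probs, size=10)
        print("sampled safety distances:", h_safe)
        print("resulting optimal speeds:", np.round(optimal_velocity(h, h_safe), 2))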

  17. Absolute transition probabilities for 559 strong lines of neutral cerium

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J, E-mail: jjcurry@nist.gov [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2009-07-07

    Absolute radiative transition probabilities are reported for 559 strong lines of neutral cerium covering the wavelength range 340-880 nm. These transition probabilities are obtained by scaling published relative line intensities (Meggers et al 1975 Tables of Spectral Line Intensities (National Bureau of Standards Monograph 145)) with a smaller set of published absolute transition probabilities (Bisson et al 1991 J. Opt. Soc. Am. B 8 1545). All 559 new values are for lines for which transition probabilities have not previously been available. The estimated relative random uncertainty of the new data is ±35% for nearly all lines.

  18. Analytic results for asymmetric random walk with exponential transition probabilities

    International Nuclear Information System (INIS)

    Gutkowicz-Krusin, D.; Procaccia, I.; Ross, J.

    1978-01-01

    We present here exact analytic results for a random walk on a one-dimensional lattice with asymmetric, exponentially distributed jump probabilities. We derive the generating functions of such a walk for a perfect lattice and for a lattice with absorbing boundaries. We obtain solutions for some interesting moment properties, such as mean first passage time, drift velocity, dispersion, and branching ratio for absorption. The symmetric exponential walk is solved as a special case. The scaling of the mean first passage time with the size of the system for the exponentially distributed walk is determined by the symmetry and is independent of the range.
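
    A Monte Carlo companion to these closed-form results (illustrative only; the paper's generating functions are not reproduced): jumps are exponential in length, rightward with probability pr, and the drift per step is estimated alongside its theoretical value.

        import numpy as np

        rng = np.random.default_rng(11)
        pr = 0.6                              # probability of a rightward jump
        n_walkers, n_steps = 5000, 400

        signs = np.where(rng.random((n_walkers, n_steps)) < pr, 1.0, -1.0)
        jumps = signs * rng.exponential(1.0, (n_walkers, n_steps))  # unit mean length
        x = jumps.sum(axis=1)                 # final positions on the line

        drift = x.mean() / n_steps
        dispersion = x.var() / n_steps
        print(f"drift per step ~ {drift:.4f} (theory: {pr - (1 - pr):.4f})")
        print(f"dispersion per step ~ {dispersion:.4f}")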

  19. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  20. Inverse probability weighting for covariate adjustment in randomized studies.

    Science.gov (United States)

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions that enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed so that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. An application of the proposed method to a real data example is presented. Copyright © 2013 John Wiley & Sons, Ltd.
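
    A compact sketch of the estimator on synthetic trial data (all data and coefficients are made up; the paper's outcome-blind two-stage design is only mimicked by fitting the weights before the outcome is used):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 2000
        x = rng.normal(size=(n, 3))              # baseline covariates
        t = rng.binomial(1, 0.5, n)              # randomized treatment assignment
        y = 1.0 * t + x @ np.array([0.8, -0.5, 0.3]) + rng.normal(size=n)

        # Stage 1 (no outcome involved): model P(T = 1 | X), although the true
        # randomization probability is 0.5.
        ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]

        # Stage 2: inverse-probability-weighted difference in means.
        w1, w0 = t / ps, (1 - t) / (1 - ps)
        ipw = np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)
        naive = y[t == 1].mean() - y[t == 0].mean()
        print(f"naive: {naive:.3f}  IPW-adjusted: {ipw:.3f}  truth: 1.000")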

  1. Contribution to the neutronic theory of random stacks (diffusion coefficient and first-flight collision probabilities) with a general theorem on collision probabilities

    International Nuclear Information System (INIS)

    Dixmier, Marc.

    1980-10-01

    A general expression of the diffusion coefficient (d.c.) of neutrons was given, with stress being put on symmetries. A system of first-flight collision probabilities was built for the case of a random stack of any number of types of one- and two-zoned spherical pebbles, with an albedo at the boundaries of the elements or, alternatively, consideration of the interstitial medium; to that end, the bases of collision probability theory were reviewed, and a wide generalisation of the reciprocity theorem for those probabilities was demonstrated. The migration area of neutrons was expressed for any random stack of convex, 'simple' and 'regular-contact' elements, taking into account the correlations between free paths; the average cosine of re-emission of neutrons by an element was expressed for the case of a homogeneous spherical pebble and the transport approximation; and the superiority of the result thus found over Behrens' theory, for the type of media under consideration, was established. The 'fine structure current term' of the d.c. was also expressed, and it was shown that its 'polarisation term' is negligible. Numerical applications showed that the global heterogeneity effect on the d.c. of pebble-bed reactors is comparable with that for graphite-moderated, carbon-gas-cooled, natural-uranium reactors. The code CARACOLE, which integrates all the results obtained here, was introduced. [fr]

  2. Discrete probability models and methods probability on graphs and trees, Markov chains and random fields, entropy and coding

    CERN Document Server

    Brémaud, Pierre

    2017-01-01

    The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to the possible applications, in the communication and computing sciences, in operations research and in physics, this book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability since the first chapters give in considerable detail the background necessary to understand the rest of the book.

  3. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  4. Probability for human intake of an atom randomly released into ground, rivers, oceans and air

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, B L

    1984-08-01

    Numerical estimates are developed for the probability of an atom randomly released in the top ground layers, in a river, or in the oceans to be ingested orally by a human, and for an atom emitted from an industrial source to be inhaled by a human. Estimates are obtained both for the probability per year and for the total eventual probability. Results vary considerably for different elements, but typical values for total probabilities are: ground, 3 × 10^-3; oceans, 3 × 10^-4; rivers, 1.7 × 10^-4; and air, 5 × 10^-6. Probabilities per year are typically 1 × 10^-7 for releases into the ground and 5 × 10^-8 for releases into the oceans. These results indicate that for material with very long-lasting toxicity, it is important to include the pathways from the ground and from the oceans.

  5. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  6. On the Generation of Random Ensembles of Qubits and Qutrits Computing Separability Probabilities for Fixed Rank States

    Directory of Open Access Journals (Sweden)

    Khvedelidze Arsen

    2018-01-01

    The generation of random mixed states is discussed, aiming at the computation of probabilistic characteristics of composite finite-dimensional quantum systems. In particular, we consider the generation of random Hilbert-Schmidt and Bures ensembles of qubit and qutrit pairs and compute the corresponding probabilities to find a separable state among the states of a fixed rank.

  7. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
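
    The two-step recipe of the abstract sketched in NumPy/SciPy (the power-law spectrum and the exponential target marginal are illustrative assumptions): color white Gaussian noise in Fourier space, then map the Gaussian marginal through an inverse CDF.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = 256
        white = rng.normal(size=(n, n))

        # Step 1: spectral shaping with an isotropic amplitude filter ~ 1/(k^2 + k0^2).
        k = np.fft.fftfreq(n)
        kx, ky = np.meshgrid(k, k)
        amplitude = 1.0 / np.sqrt(kx**2 + ky**2 + 0.01**2)
        colored = np.real(np.fft.ifft2(np.fft.fft2(white) * amplitude))
        colored = (colored - colored.mean()) / colored.std()  # zero mean, unit variance

        # Step 2: pointwise Gaussian -> exponential transform; rank order (and hence,
        # approximately, the spatial correlation structure) is preserved.
        field = stats.expon.ppf(stats.norm.cdf(colored))
        print("marginal mean ~", field.mean().round(3), "(exponential target: 1.0)")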

  8. Probability of failure prediction for step-stress fatigue under sine or random stress

    Science.gov (United States)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.

  9. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  10. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  11. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  12. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni; Nobile, Fabio; Tempone, Raul

    2015-01-01

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability...

  13. A probability measure for random surfaces of arbitrary genus and bosonic strings in 4 dimensions

    International Nuclear Information System (INIS)

    Albeverio, S.; Hoeegh-Krohn, R.; Paycha, S.; Scarlatti, S.

    1989-01-01

    We define a probability measure describing random surfaces in R^D, 3 ≤ D ≤ 13, parametrized by compact Riemann surfaces of arbitrary genus. The measure involves the path space measure for scalar fields with exponential interaction in two space-time dimensions. We show that it gives a mathematical realization of Polyakov's heuristic measure for bosonic strings. (orig.)

  14. Probability calculus of fractional order and fractional Taylor's series application to Fokker-Planck equation and information of non-random functions

    International Nuclear Information System (INIS)

    Jumarie, Guy

    2009-01-01

    A probability distribution of fractional (or fractal) order is defined by the measure μ{dx} = p(x)(dx)^α, 0 < α < 1. Combining this definition with the fractional Taylor's series f(x + h) = E_α(h^α D_x^α)f(x) provided by the modified Riemann-Liouville definition, one can expand a probability calculus parallel to the standard one. A Fourier transform of fractional order using the Mittag-Leffler function is introduced, together with its inversion formula, and it provides a suitable generalization of the characteristic function of fractal random variables. It appears that the state moments of fractional order are especially relevant. The main properties of this fractional probability calculus are outlined; it is shown that it provides a sound approach to Fokker-Planck equations which are fractional in both space and time, and it provides new results in the information theory of non-random functions.

  15. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    Science.gov (United States)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources causing train-bridge random vibration. A new random vibration theory for coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of the rail irregularity power spectrum density was adopted to determine the representative points of spatial frequencies and phases to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with a slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Finally, the Newmark-β integration method and the double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of the responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation of random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  16. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process...

  17. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  18. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. "Objects as eigen-solutions" is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  19. Identifying probable suicide clusters in wales using national mortality data.

    Directory of Open Access Journals (Sweden)

    Phillip Jones

    Full Text Available Up to 2% of suicides in young people may occur in clusters, i.e., close together in time and space. In early 2008 unprecedented attention was given by national and international news media to a suspected suicide cluster among young people living in Bridgend, Wales. This paper investigates the strength of statistical evidence for this apparent cluster, its size, and temporal and geographical limits. The analysis is based on official mortality statistics for Wales for 2000-2009 provided by the UK's Office for National Statistics (ONS). Temporo-spatial analysis was performed using Space Time Permutation Scan Statistics with SaTScan v9.1 for suicide deaths aged 15 and over, with a sub-group analysis focussing on cases aged 15-34 years. These analyses were conducted for deaths coded by ONS as: (i) suicide or of undetermined intent ('probable suicides') and (ii) a combination of suicide, undetermined, and accidental poisoning and hanging ('possible suicides'). The temporo-spatial analysis did not identify any clusters of suicide or undetermined intent deaths (probable suicides). However, analysis of all deaths by suicide, undetermined intent, accidental poisoning and accidental hanging (possible suicides) identified a temporo-spatial cluster (p = 0.029) involving 10 deaths amongst 15-34 year olds centred on the County Borough of Bridgend for the period 27th December 2007 to 19th February 2008. Less than 1% of possible suicides in younger people in Wales in the ten year period were identified as being cluster-related. There was a possible suicide cluster in young people in Bridgend between December 2007 and February 2008. This cluster was smaller, shorter in duration, and predominantly later than the phenomenon that was reported in national and international print media. Further investigation of factors leading to the onset and termination of this series of deaths, in particular the role of the media, is required.

  20. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  2. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    Science.gov (United States)

    Kollár, Martin

    2012-05-01

    In order to access the cell in all mobile communication technologies, a so-called random-access procedure is used. For example, in GSM this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS decodes some noise on the Random Access Channel (RACH) as a random access by mistake (a so-called 'phantom RACH'), then it is a matter of pure coincidence which 'establishment cause' the BTS thinks it has recognized. A typical invalid channel access request, or phantom RACH, is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) that is not followed by an ESTABLISH INDICATION from the MS to the BTS. In this paper, a mathematical model for evaluating the Power RACH Busy Threshold (RACHBT) so as to guarantee a predetermined outage probability on the RACH is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to the remaining mobile technologies (i.e., WCDMA and LTE).
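
    The flavor of such a threshold calculation can be sketched under the simplest possible assumption, namely that the per-slot noise power on the RACH is exponentially distributed; the paper's actual RACHBT model is not reproduced here.

```python
import math

def rach_busy_threshold_dbm(mean_noise_dbm: float, outage_prob: float) -> float:
    """Threshold T such that exponentially distributed noise power exceeds T
    with the target probability: P(P_noise > T) = exp(-T / mean) = outage."""
    mean_mw = 10.0 ** (mean_noise_dbm / 10.0)     # dBm -> mW
    t_mw = -mean_mw * math.log(outage_prob)
    return 10.0 * math.log10(t_mw)                # mW -> dBm

# Keep the phantom-RACH probability per access slot below 1e-4:
print(rach_busy_threshold_dbm(-110.0, 1e-4))      # about -100.4 dBm
```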

  3. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  5. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The fundamentals of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  6. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation, in which the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is then derived from this relation.

  7. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  8. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    Science.gov (United States)

    Tribelsky, Michael I.

    2002-07-01

    The exact explicit expression for the probability density p_N(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to the application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of p_N(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). A quantitative description of all three parts, as well as of the entire profile, is obtained. A number of particular examples are considered in detail.
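
    Numerically, the density of an N-fold sum is an N-fold convolution, conveniently computed through the characteristic function. A minimal sketch for a non-stable example (uniform summands, illustrative grid sizes):

```python
import numpy as np

n_grid, dx = 1 << 14, 1e-3
x = np.arange(n_grid) * dx
p1 = np.where(x <= 1.0, 1.0, 0.0)            # density of one U(0,1) summand
cf = np.fft.rfft(p1) * dx                    # characteristic function on a grid
N = 8
pN = np.fft.irfft(cf ** N, n=n_grid) / dx    # density of the N-fold sum
pN = np.clip(pN, 0.0, None)
print("total mass:", pN.sum() * dx)          # ~ 1
print("mode near N/2:", x[pN.argmax()])      # ~ 4 for N = 8
```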

  9. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but that it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. The model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining the systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
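
    The regressive effect is easy to reproduce: if each remembered instance is read correctly with probability 1 - d and flipped with probability d, the expected estimate is (1 - 2d)p + d, which is pulled toward 0.5. A minimal simulation with illustrative parameters, not the authors' fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_items, n_people = 0.15, 200, 5_000
for p in (0.1, 0.5, 0.9):
    events = rng.random((n_people, n_items)) < p    # true occurrences
    flips = rng.random((n_people, n_items)) < d     # random read errors
    est = np.mean(events ^ flips, axis=1)           # noisy estimates
    print(f"p={p:.1f}  mean estimate={est.mean():.3f}  "
          f"predicted={(1 - 2 * d) * p + d:.3f}")
```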

  10. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
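
    The role of single order statistics can be illustrated compactly: under the true CDF F, the transformed sorted sample F(x_(1)), ..., F(x_(n)) behaves like uniform order statistics, whose k-th value has mean k/(n+1), so a candidate distribution can be scored by how atypical its transformed sample looks. A minimal sketch in which a KS-type score stands in for the paper's universal scoring function:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
x = np.sort(rng.standard_normal(1_000))
n = len(x)
expected = np.arange(1, n + 1) / (n + 1)     # means of uniform order statistics
for name, cdf in [("normal", stats.norm(0, 1).cdf),
                  ("cauchy", stats.cauchy(0, 1).cdf)]:
    u = cdf(x)                               # ~uniform under the true model
    score = np.max(np.abs(u - expected))     # atypical-fluctuation measure
    print(name, round(score, 3))             # smaller for the true model
```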

  11. Characteristics of the probability function for three random-walk models of reaction--diffusion processes

    International Nuclear Information System (INIS)

    Musho, M.K.; Kozak, J.J.

    1984-01-01

    A method is presented for calculating exactly the relative width (σ²)^(1/2)/⟨n⟩, the skewness γ₁, and the kurtosis γ₂ characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P. A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group-theoretic arguments within the framework of the theory of finite Markov processes.

  12. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • The probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability of starting proliferation activities is very low. • This fact can decrease the detection probability of IS by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies, etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) on the basis of a fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of the current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and to more efficient utilization of IAEA resources from the viewpoint of the whole IS framework.

  13. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (2010) - Telecommunication, Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...

  14. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'Physics and Fundamental Questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics whose range of application spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: radioactivity and probability; statistical and quantum fluctuations; quantum mechanics as a generalized probability theory; probability and the irrational efficiency of mathematics; can we foresee the future of the universe?; chance, eventuality and necessity in biology; and how to manage weak risks? (A.C.)

  15. Random magnetism

    International Nuclear Information System (INIS)

    Tahir-Kheli, R.A.

    1975-01-01

    A few simple problems relating to random magnetic systems are presented. Translational symmetry is assumed for these systems only on the macroscopic scale. On the microscopic scale, a random set of parameters for the various regions of these systems is assumed, obeying a given probability distribution. Knowledge of the form of these probability distributions is assumed in all cases. [pt]

  16. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  17. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
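
    A minimal instance of the setting, with a uniform sampling measure, a Legendre polynomial space, and additive noise (all sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.cos(np.pi * x)                 # target function
dim, m = 6, 200                                 # approximation space, samples
x = rng.uniform(-1.0, 1.0, m)                   # uniform sampling measure
y = f(x) + 0.01 * rng.standard_normal(m)        # noisy point evaluations
V = np.polynomial.legendre.legvander(x, dim - 1)
coef, *_ = np.linalg.lstsq(V, y, rcond=None)    # discrete least squares
# Mean-square error with respect to the sampling measure, on fresh points:
xt = rng.uniform(-1.0, 1.0, 100_000)
err = np.sqrt(np.mean((np.polynomial.legendre.legval(xt, coef) - f(xt)) ** 2))
print(f"RMS error: {err:.2e}")
```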

  18. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating ⟨r_n⟩. Next, we describe two alternative means of deriving the exact expression of ⟨r_n⟩: we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to ⟨r_n⟩ from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results.
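
    The exact result in question, for points of a homogeneous Poisson process with density ρ in D dimensions, is ⟨r_n⟩ = [Γ(n + 1/D)/Γ(n)] (c_D ρ)^(-1/D), with c_D the volume of the unit D-ball. A quick Monte Carlo check in D = 2, using torus distances to suppress edge effects (a fixed point count approximates the Poisson case):

```python
import numpy as np
from math import gamma, pi

rng = np.random.default_rng(3)
D, rho, n = 2, 400, 3                   # density and neighbour order
c_d = pi ** (D / 2) / gamma(D / 2 + 1)  # volume of the unit D-ball
exact = gamma(n + 1 / D) / gamma(n) * (c_d * rho) ** (-1 / D)

est = []
for _ in range(2_000):
    pts = rng.random((rho, D))          # unit torus, density rho
    d = np.abs(pts - pts[0])            # offsets from a reference point
    d = np.minimum(d, 1.0 - d)          # wrap around the torus
    r = np.sort(np.hypot(d[1:, 0], d[1:, 1]))
    est.append(r[n - 1])                # distance to the n-th neighbour
print(f"Monte Carlo: {np.mean(est):.4f}   exact: {exact:.4f}")
```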

  19. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c...

  20. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  1. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  2. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, so as to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, and thereby to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.
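
    For context, the classical SPRT that is being modified uses Wald's thresholds A = (1 - β)/α and B = β/(1 - α) to control the two error probabilities whose sum the proposed procedure constrains. A minimal Bernoulli sketch with illustrative parameters:

```python
import numpy as np

def sprt(samples, p0=0.5, p1=0.7, alpha=0.025, beta=0.025):
    """Wald's sequential probability ratio test for Bernoulli(p0) vs (p1)."""
    up, lo = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
        if llr >= up:
            return "accept H1", n       # stop: evidence for p1
        if llr <= lo:
            return "accept H0", n       # stop: evidence for p0
    return "undecided", len(samples)

rng = np.random.default_rng(4)
print(sprt(rng.random(10_000) < 0.7))   # data from H1; usually stops early
```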

  3. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  4. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  5. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  7. Clear-Sky Probability for the August 21, 2017, Total Solar Eclipse Using the NREL National Solar Radiation Database

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Roberts, Billy J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Kutchenreiter, Mark C [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wilcox, Steve [Solar Resource Solutions, LLC, Lakewood, CO (United States); Stoffel, Tom [Solar Resource Solutions, LLC, Lakewood, CO (United States)

    2017-07-21

    The National Renewable Energy Laboratory (NREL) and collaborators have created a clear-sky probability analysis to help guide viewers of the August 21, 2017, total solar eclipse, the first continent-spanning eclipse in nearly 100 years in the United States. Using cloud and solar data from NREL's National Solar Radiation Database (NSRDB), the analysis provides cloudless sky probabilities specific to the date and time of the eclipse. Although this paper is not intended to be an eclipse weather forecast, the detailed maps can help guide eclipse enthusiasts to likely optimal viewing locations. Additionally, high-resolution data are presented for the centerline of the path of totality, representing the likelihood for cloudless skies and atmospheric clarity. The NSRDB provides industry, academia, and other stakeholders with high-resolution solar irradiance data to support feasibility analyses for photovoltaic and concentrating solar power generation projects.

  8. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — Probability calibration by the minimum and maximum probability scores in one-class Bayes learning for anomaly detection. Guichong Li, Nathalie Japkowicz, Ian Hoffman, ...

  9. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  10. USING THE WEB-SERVICES WOLFRAM|ALPHA TO SOLVE PROBLEMS IN PROBABILITY THEORY

    Directory of Open Access Journals (Sweden)

    Taras Kobylnyk

    2015-10-01

    Full Text Available The trend towards the use of remote network resources on the Internet is clearly delineated. Traditional training is increasingly combined with networked, remote technologies, and cloud computing has become popular. The methods of probability theory are used in various fields; of particular note is their use in psychological and educational research, in the statistical analysis of experimental data. Conducting such research is impossible without the use of modern information technology. Given the advantages of web-based software, the article describes the web service Wolfram|Alpha and analyzes in detail the possibilities of using it to solve problems of probability theory. Case studies present the results of queries for solving problems of probability theory, in particular from the sections on random events and random variables. The problem of the number of occurrences of event A in n independent trials is considered and analyzed using Wolfram|Alpha, and the possibilities of using the service for the study of a continuous random variable with a normal or uniform probability distribution are examined in detail, including calculating the probability that the value of the random variable falls in a given interval. Problems involving the binomial and hypergeometric distributions of a discrete random variable are also solved, demonstrating the applicability of the service to them.
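
    The same textbook computations that the article sends to Wolfram|Alpha can be mirrored locally; a minimal sketch with scipy.stats (illustrative numbers, not the article's examples):

```python
from scipy import stats

# Number of occurrences of event A in n independent trials (binomial):
print(stats.binom.pmf(3, n=10, p=0.4))             # P(X = 3), X ~ Bin(10, 0.4)
# Hypergeometric: 2 marked items in a sample of 5 from 50 items, 10 marked:
print(stats.hypergeom.pmf(2, M=50, n=10, N=5))
# Probability that a continuous variable falls in a given interval:
print(stats.norm.cdf(1.0) - stats.norm.cdf(-1.0))                   # ~0.6827
print(stats.uniform.cdf(0.7, 0, 1) - stats.uniform.cdf(0.2, 0, 1))  # 0.5
```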

  11. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  12. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Zero field reversal probability in thermally assisted magnetization reversal

    Science.gov (United States)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm³ is considered as a single-cell magnetic storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling the writing process from near the Curie point to room temperature, over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreases as the energy barrier increases. A zero-field switching probability of 55% was attained for an energy barrier of 60 k_BT, and the reversal probability was observed to become zero at an energy barrier of 2348 k_BT. The highest zero-field switching probability of 55%, attained for an energy barrier of 60 k_BT, corresponds to a switching magnetic field of 150 Oe.
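
    For orientation only: the standard Néel-Arrhenius estimate relates an energy barrier to a zero-field thermal switching probability over a wait time, P = 1 - exp(-f0·t·exp(-E_b/k_BT)). The sketch below assumes an attempt frequency and wait time and works at a fixed temperature, so unlike the paper's stochastic-LLG cooling protocol it yields tiny zero-field probabilities at barriers of tens of k_BT:

```python
import math

def switching_probability(barrier_kbt: float, attempt_hz: float = 1e9,
                          wait_s: float = 1.0) -> float:
    """Neel-Arrhenius switching probability; barrier in units of k_B*T."""
    rate = attempt_hz * math.exp(-barrier_kbt)
    return 1.0 - math.exp(-rate * wait_s)

for eb in (20, 40, 60):
    print(eb, f"{switching_probability(eb):.3e}")
```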

  14. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  15. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  16. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  17. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
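
    The cost of matching rather than maximizing is a two-line calculation: with P(event) = p, matching earns expected accuracy p² + (1 - p)², while always predicting the more likely event earns max(p, 1 - p). A quick simulation:

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 0.67, 100_000
outcomes = rng.random(n) < p
matcher = rng.random(n) < p               # predicts at the outcome rate
maximizer = np.ones(n, dtype=bool)        # always predicts the likelier event
print("matching  :", np.mean(matcher == outcomes), "~", p**2 + (1 - p)**2)
print("maximizing:", np.mean(maximizer == outcomes), "~", max(p, 1 - p))
```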

  18. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.

  19. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.

  20. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e., signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for calculating the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal", so that the logic probabilities can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
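
    The property the algorithm exploits is that orthogonal (mutually exclusive) terms have additive probabilities. A brute-force sketch that sums minterm probabilities directly; the orthogonal algorithm itself achieves the same result without exhaustive enumeration:

```python
from itertools import product

def signal_probability(f, input_probs):
    """Exact signal probability of a Boolean function with independent inputs,
    summed over its (mutually orthogonal) minterms."""
    total = 0.0
    for bits in product((0, 1), repeat=len(input_probs)):
        if f(*bits):
            pr = 1.0
            for b, p in zip(bits, input_probs):
                pr *= p if b else (1.0 - p)
            total += pr
    return total

# f = (a AND b) OR c with P(a) = P(b) = 0.5, P(c) = 0.2:
f = lambda a, b, c: (a and b) or c
print(signal_probability(f, [0.5, 0.5, 0.2]))   # 0.25 + 0.2 - 0.05 = 0.4
```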

  1. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for whom the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4

  2. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate the failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability theories such as the FORM (first order reliability method), the SORM (second order reliability method) and the MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with the increase of the design fatigue life and the applied minimum stress, and with the decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of the random variables, the slope of the Paris equation affects the failure probability most dominantly among the random variables in the Paris and the Walker models.
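
    A minimal sketch of the MCS ingredient, for an illustrative limit state g = R - S that stands in for the fatigue-life margins built from the Paris or Walker models (all distribution parameters assumed):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 2_000_000
R = rng.lognormal(mean=np.log(300.0), sigma=0.1, size=n)   # resistance
S = rng.normal(loc=200.0, scale=30.0, size=n)              # load effect
pf = np.mean(R < S)                     # failure probability P(g < 0)
print(f"P_f = {pf:.2e}  (MC standard error {np.sqrt(pf * (1 - pf) / n):.1e})")
```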

  3. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets, where each pixel is a random variable and has multiple samples which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.

  4. Classical probability model for Bell inequality

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2014-01-01

    We show that by taking into account the randomness of realization of experimental contexts it is possible to construct a common Kolmogorov space for data collected in these contexts, although they can be incompatible. We call such a construction 'Kolmogorovization' of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θ_i, θ'_j) fixing the orientations of polarization beam splitters. Contrary to common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only the randomness of the source (as Bell did), but also the randomness of context-realizations (in particular, realizations of pairs of angles (θ_i, θ'_j)). One may (but need not) say that the randomness of 'free will' has to be accounted for.

  5. Effect of drain current on appearance probability and amplitude of random telegraph noise in low-noise CMOS image sensors

    Science.gov (United States)

    Ichino, Shinya; Mawaki, Takezo; Teramoto, Akinobu; Kuroda, Rihito; Park, Hyeonwoo; Wakashima, Shunichi; Goto, Tetsuya; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Random telegraph noise (RTN), which occurs in in-pixel source follower (SF) transistors, has become one of the most critical problems in high-sensitivity CMOS image sensors (CIS) because it is a limiting factor of dark random noise. In this paper, the behaviors of RTN toward changes in SF drain current conditions were analyzed using a low-noise array test circuit measurement system with a floor noise of 35 µV rms. In addition to statistical analysis by measuring a large number of transistors (18048 transistors), we also analyzed the behaviors of RTN parameters such as amplitude and time constants in the individual transistors. It is demonstrated that the appearance probability of RTN becomes small under a small drain current condition, although large-amplitude RTN tends to appear in a very small number of cells.

  6. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    The inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
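
    The non-Kolmogorovian structure referred to here has a compact statement: in the two-slit setting, the law of total probability acquires an interference term. In the standard textbook form, with P(i) the probability that slit i is passed, P(x|i) the detection density with only slit i open, and θ(x) the interference phase:

```latex
\[
  P(x) \;=\; P(1)\,P(x\mid 1) + P(2)\,P(x\mid 2)
        \;+\; 2\sqrt{P(1)\,P(x\mid 1)\,P(2)\,P(x\mid 2)}\,\cos\theta(x),
\]
% whereas the classical law of total probability has no cosine term.
```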

  7. Continuous time random walk model with asymptotical probability density of waiting times via inverse Mittag-Leffler function

    Science.gov (United States)

    Liang, Yingjie; Chen, Wen

    2018-04-01

    The mean squared displacement (MSD) of the traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model has been employed to characterize this ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotical waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, a special case of which is the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function, and is observed to increase even more slowly than that of the logarithmic function model. Very long waiting times occur with the largest probability for the inverse Mittag-Leffler density, compared with the power-law and logarithmic-function models. Monte Carlo simulations of the one-dimensional sample path of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting the general ultraslow random motion.
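
    The logarithmic special case is simple to simulate: for the waiting-time density ψ(t) = 1/(t ln²t), t ≥ e, sampled by inversion, the CTRW mean squared displacement grows like ln t. A sketch under these assumptions; the paper's inverse Mittag-Leffler waiting-time density itself is harder to sample:

```python
import numpy as np

rng = np.random.default_rng(6)
walkers = 5_000
times = np.logspace(1, 8, 8)                # observation times
msd = np.zeros_like(times)
for _ in range(walkers):
    t, x, k = 0.0, 0.0, 0
    pos = np.empty_like(times)
    while k < len(times):
        # Inversion of the CDF F(t) = 1 - 1/ln(t): t = exp(1/(1-u)).
        # The exponent is clipped; exp(50) already exceeds every
        # observation time, so clipping does not bias the record.
        wait = np.exp(min(1.0 / (1.0 - rng.random()), 50.0))
        while k < len(times) and t + wait > times[k]:
            pos[k] = x                      # constant between renewals
            k += 1
        t += wait
        x += rng.choice((-1.0, 1.0))        # unit jump at each renewal
    msd += pos ** 2
msd /= walkers
for t_obs, m in zip(times, msd):
    print(f"t = {t_obs:9.1e}   msd = {m:6.2f}   ln t = {np.log(t_obs):5.2f}")
```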

  8. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods for evaluating the accident probability of nuclear power plants (NPPs) are proposed, since statistical evaluation of NPP safety is unreliable. The concept of subjective probability for the quantitative analysis of safety and hazard is described. The interpretation of probability as the actual degree of belief of an expert is assumed as the basis of this concept. It is suggested that event uncertainty be studied in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating the probability. These subjective expert evaluations affect, to a certain extent, the calculation of the usual mathematical event probability. The above technique is advantageous for the consideration of a separate experiment or random event.

  9. A Randomized Central Limit Theorem

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2010-01-01

    The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√(n)), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √(n). This Letter considers scaling schemes which are stochastic and non-uniform, and presents a 'Randomized Central Limit Theorem' (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Lévy laws.
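
    The deterministic baseline that the RCLT generalizes is easy to see numerically: sums of i.i.d. zero-mean, finite-variance summands, scaled by the common factor √n, look Gaussian. A minimal sketch with uniform summands and illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(7)
n, reps = 500, 20_000
x = rng.uniform(-1.0, 1.0, (reps, n))     # i.i.d., mean 0, variance 1/3
z = x.sum(axis=1) / np.sqrt(n)            # deterministic sqrt(n) scaling
print("variance:", z.var())               # ~ 1/3, the summand variance
print("kurtosis:", np.mean(z ** 4) / z.var() ** 2)   # ~ 3, the Gaussian value
```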

  10. Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables

    Directory of Open Access Journals (Sweden)

    Andrew Rosalsky

    2004-12-01

    Full Text Available Let {X, X_n; n ≥ 1} be a sequence of real-valued i.i.d. random variables and let S_n = ∑_{i=1}^{n} X_i, n ≥ 1. In this paper, we study the probabilities of large deviations of the form P(S_n > t n^{1/p}), P(S_n < -t n^{1/p}), and P(|S_n| > t n^{1/p}), where t > 0 and 0 < p < 2. For example, it is shown that if there exists a positive, nonincreasing function ϕ on [0, ∞), regularly varying with index α ≤ -1, such that lim_{x→∞} P(|X| > x^{1/p})/ϕ(x) = 1, then for every t > 0, limsup_{n→∞} P(|S_n| > t n^{1/p})/(n ϕ(n)) = t^{pα}.

  11. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  12. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
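
    The product-space structure invites a nested Monte Carlo sketch (with entirely hypothetical toy distributions, not the WIPP models): sample the subjective space in an outer loop, the stochastic space in an inner loop, and record one conditional CCDF per outer draw.

        import numpy as np

        rng = np.random.default_rng(1)
        n_su, n_st = 200, 1_000                      # outer (subjective), inner (stochastic)
        thresholds = np.logspace(-2, 1, 50)

        ccdfs = np.empty((n_su, thresholds.size))
        for i in range(n_su):
            theta = rng.lognormal(0.0, 0.5)          # draw from p_su (hypothetical)
            z = rng.exponential(1.0, n_st)           # draws from p_st (hypothetical)
            release = theta * z                      # random variable on the product space
            ccdfs[i] = (release[:, None] > thresholds).mean(axis=0)  # conditional CCDF

        # the spread of the family of conditional CCDFs displays subjective uncertainty
        print(ccdfs.mean(axis=0)[0], np.percentile(ccdfs, 95, axis=0)[0])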

  13. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  14. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Blanco Castañeda, Liliana; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  15. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy-level unfolding procedure we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical of integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that, due to the absence of level repulsion, the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
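
    The level-statistics recipe can be tried on a synthetic graph with NumPy (a rough sketch only: a random directed graph stands in for Twitter/Wikipedia, and a crude global rescaling stands in for the standard local unfolding). Poisson statistics predict unit-mean exponential spacings, so the second moment of the normalized spacings should be near 2.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 2_000
        A = (rng.random((n, n)) < 10.0 / n).astype(float)   # ~10 random out-links per node
        np.fill_diagonal(A, 0.0)
        out = A.sum(axis=1)
        out[out == 0] = 1.0                                 # crude guard for rare dangling nodes
        S = A / out[:, None]                                # row-stochastic link matrix

        p = np.full(n, 1.0 / n)                             # PageRank by power iteration
        for _ in range(100):
            p = 0.85 * (S.T @ p) + 0.15 / n

        e = np.sort(-np.log(p))                             # effective "energy levels"
        s = np.diff(e)
        s /= s.mean()                                       # global unfolding to unit mean
        print(s.mean(), (s ** 2).mean())                    # Poisson law predicts 1 and 2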

  16. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item, the probability distributions of the time to failure and of the duration of down time, and the expectations of these random variables, are determined. Moreover, it is shown that the same theory applies to randomly checked items with an exponential probability distribution of life, such as electronic items. The case of periodic renewals is treated as an example. (orig.)

  17. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty, as well as a means of describing random processes, has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.

  18. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [Review of: Probability and social science: methodological relationships between the two approaches]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans have engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and the social sciences emerged and evolved.

  19. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The asymptotic form implies that the relative probability of a random polygon of length n having prime knot type K over prime knot type L is determined by the amplitudes and entropic exponents; in the thermodynamic limit this probability ratio becomes an amplitude ratio, which should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  20. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062
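
    A small simulation along these lines (a sketch under an assumed beta-binomial cluster model; the group sizes and parameters are invented) reproduces the qualitative finding: the arcsine-transformed estimate acquires a bias roughly linear in ρ.

        import numpy as np

        rng = np.random.default_rng(3)
        p, n_clusters, m = 0.2, 100, 20              # true probability, clusters, cluster size

        def study_estimate(rho):
            # beta-binomial clusters with intracluster correlation rho = 1/(a + b + 1)
            a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
            pi = rng.beta(a, b, n_clusters)
            return rng.binomial(m, pi).sum() / (n_clusters * m)

        for rho in (0.01, 0.05, 0.10):
            est = np.array([study_estimate(rho) for _ in range(4_000)])
            bias = np.arcsin(np.sqrt(est)).mean() - np.arcsin(np.sqrt(p))
            print(rho, round(bias, 5))               # bias grows roughly linearly in rho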

  1. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals provide therefore an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
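
    A Monte Carlo sketch of the augmentation idea (note: the intervals below are pointwise 1-α intervals per order statistic, not the simultaneous intervals the paper calibrates):

        import numpy as np

        rng = np.random.default_rng(4)
        n, sims, alpha = 30, 10_000, 0.05

        ref = rng.standard_normal((sims, n))         # many normal samples of the same size
        ref = (ref - ref.mean(axis=1, keepdims=True)) / ref.std(axis=1, ddof=1, keepdims=True)
        ref = np.sort(ref, axis=1)
        lo = np.quantile(ref, alpha / 2, axis=0)     # pointwise envelope per order statistic
        hi = np.quantile(ref, 1 - alpha / 2, axis=0)

        x = rng.standard_normal(n)                   # the sample to assess
        s = np.sort((x - x.mean()) / x.std(ddof=1))
        print(bool(np.all((s >= lo) & (s <= hi))))   # all inside: consistent with normality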

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  4. Return probabilities for the reflected random walk on N_0

    NARCIS (Netherlands)

    Essifi, R.; Peigné, M.

    2015-01-01

    Let (Y_n) be a sequence of i.i.d. ℤ-valued random variables with law μ. The reflected random walk (X_n) is defined recursively by X_0 = x ∈ ℕ_0, X_{n+1} = |X_n + Y_{n+1}|. Under mild hypotheses on the law μ, it is proved that, for any y ∈
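
    The recursion is immediate to simulate. A sketch (with an arbitrary choice of μ, uniform on {−2, …, 2}) estimating the probability that the walk returns to its starting point within a fixed horizon:

        import numpy as np

        rng = np.random.default_rng(5)

        def return_probability(x0=0, steps=2_000, trials=20_000):
            # reflected walk: X_{n+1} = |X_n + Y_{n+1}|, with Y uniform on {-2, ..., 2}
            x = np.full(trials, x0)
            not_returned = np.ones(trials, dtype=bool)
            for _ in range(steps):
                x = np.abs(x + rng.integers(-2, 3, size=trials))
                not_returned &= (x != x0)
            return 1.0 - not_returned.mean()

        print(return_probability())   # close to 1: this zero-mean walk is recurrent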

  5. Some results on convergence rates for probabilities of moderate deviations for sums of random variables

    Directory of Open Access Journals (Sweden)

    Deli Li

    1992-01-01

    Full Text Available Let X, X_n, n ≥ 1 be a sequence of i.i.d. real random variables, and S_n = ∑_{k=1}^n X_k, n ≥ 1. Convergence rates of moderate deviations are derived, i.e., the rates of convergence to zero of certain tail probabilities of the partial sums are determined. For example, we obtain equivalent conditions for the convergence of the series ∑_{n≥1} (ψ²(n)/n) P(|S_n| ≥ √n φ(n)) only under the assumptions that EX = 0 and EX² = 1, where φ and ψ are taken from a broad class of functions. These results generalize and improve some recent results of Li (1991) and Gafurov (1982) and some previous work of Davis (1968). For b ∈ [0,1] and ε > 0, let λ_{ε,b} = ∑_{n≥3} ((log log n)^b / n) I(|S_n| ≥ √((2+ε) n log log n)). The behaviour of Eλ_{ε,b} as ε ↓ 0 is also studied.

  6. Lay understanding of forensic statistics: Evaluation of random match probabilities, likelihood ratios, and verbal equivalents.

    Science.gov (United States)

    Thompson, William C; Newman, Eryn J

    2015-08-01

    Forensic scientists have come under increasing pressure to quantify the strength of their evidence, but it is not clear which of several possible formats for presenting quantitative conclusions will be easiest for lay people, such as jurors, to understand. This experiment examined the way that people recruited from Amazon's Mechanical Turk (n = 541) responded to 2 types of forensic evidence, a DNA comparison and a shoeprint comparison, when an expert explained the strength of this evidence 3 different ways: using random match probabilities (RMPs), likelihood ratios (LRs), or verbal equivalents of likelihood ratios (VEs). We found that verdicts were sensitive to the strength of DNA evidence regardless of how the expert explained it, but verdicts were sensitive to the strength of shoeprint evidence only when the expert used RMPs. The weight given to DNA evidence was consistent with the predictions of a Bayesian network model that incorporated the perceived risk of a false match from 3 causes (coincidence, a laboratory error, and a frame-up), but shoeprint evidence was undervalued relative to the same Bayesian model. Fallacious interpretations of the expert's testimony (consistent with the source probability error and the defense attorney's fallacy) were common and were associated with the weight given to the evidence and verdicts. The findings indicate that perceptions of forensic science evidence are shaped by prior beliefs and expectations as well as expert testimony, and consequently that the best way to characterize and explain forensic evidence may vary across forensic disciplines. (c) 2015 APA, all rights reserved.

  7. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^{(d,N)}_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.
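
    The Cox probabilities are straightforward to estimate by direct simulation of the random point problem (a sketch; the parameter values are arbitrary):

        import numpy as np

        rng = np.random.default_rng(6)

        def cox_probability(d=2, N=200, m=1, n=1, trials=100):
            hits, total = 0, 0
            for _ in range(trials):
                pts = rng.random((N, d))                 # N random points in the d-cube
                D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
                np.fill_diagonal(D, np.inf)
                order = np.argsort(D, axis=1)            # neighbours of each point, by distance
                for i in range(N):
                    j = order[i, n - 1]                  # the n-th nearest neighbour of i
                    rank_of_i = int(np.where(order[j] == i)[0][0]) + 1
                    hits += rank_of_i == m
                    total += 1
            return hits / total

        print(cox_probability())   # P_{1,1}: chance of being mutual nearest neighbours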

  8. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
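
    The headline effect can be reproduced in a few lines (a sketch assuming log-normal losses and a threshold set at the estimated 99th percentile from a small sample): averaging the true exceedance probability over many hypothetical data sets gives an expected failure frequency above the nominal 1%.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)
        nominal, n, trials = 0.01, 50, 20_000
        z = norm.ppf(1 - nominal)                     # 99% standard normal quantile

        exceed = 0.0
        for _ in range(trials):
            logs = rng.normal(0.0, 1.0, n)            # log of lognormal(0, 1) losses
            threshold = logs.mean() + z * logs.std(ddof=1)
            exceed += norm.sf(threshold)              # true P(log-loss > threshold)

        print(exceed / trials, "vs nominal", nominal) # expected frequency exceeds 0.01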

  10. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
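
    A minimal sketch of the scheme (the arm labels and the menu of block sizes are arbitrary):

        import random

        def blocked_randomization(n, arms=("T", "C"), block_multiples=(1, 2, 3), seed=42):
            # each block is a balanced, shuffled sequence; the block size is drawn at
            # random from the menu, so block boundaries stay unpredictable
            rng = random.Random(seed)
            schedule = []
            while len(schedule) < n:
                block = list(arms) * rng.choice(block_multiples)   # size 2, 4 or 6 here
                rng.shuffle(block)
                schedule.extend(block)
            return schedule[:n]

        print(blocked_randomization(11))

    Because each block's size is drawn afresh, an unblinded investigator cannot count earlier assignments to predict the end of a block, which is the vulnerability of fixed block sizes described above.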

  11. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  12. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)

    1963-07-01

    This document gathers a set of lectures given in 1962. The first one proposes a mathematical introduction to the analysis of random phenomena. The second one presents an axiomatic approach to probability calculus. The third one gives an overview of one-dimensional random variables. The fourth one addresses random pairs and presents basic theorems regarding the algebra of mathematical expectations. The fifth lecture discusses some probability laws: the binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last one deals with the issues of stochastic convergence and asymptotic distributions.

  13. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) an MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
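
    Ingredients (i) and (ii) can be illustrated with SciPy on a toy dataset concentrated near a circle (a sketch only: scipy.stats.gaussian_kde stands in for the paper's estimator, and the diffusion-maps reduction, which is what keeps generated samples concentrated on the manifold, is omitted):

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(8)

        # dataset concentrated near a circle, a one-dimensional subset of R^2
        theta = rng.uniform(0.0, 2.0 * np.pi, 300)
        data = np.vstack([np.cos(theta), np.sin(theta)])
        data += 0.05 * rng.standard_normal(data.shape)

        kde = gaussian_kde(data)            # ingredient (i): multidimensional KDE
        new = kde.resample(1_000)           # ingredient (ii): new consistent realizations

        # generated points scatter around the circle; the paper's diffusion-maps step
        # is what would keep them from smearing off the manifold
        print(np.abs(np.hypot(new[0], new[1]) - 1.0).mean())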

  14. Perceptions of randomized security schedules.

    Science.gov (United States)

    Scurich, Nicholas; John, Richard S

    2014-04-01

    Security of infrastructure is a major concern. Traditional security schedules are unable to provide omnipresent coverage; consequently, adversaries can exploit predictable vulnerabilities to their advantage. Randomized security schedules, which randomly deploy security measures, overcome these limitations, but public perceptions of such schedules have not been examined. In this experiment, participants were asked to make a choice between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The absolute probability of detecting contraband was manipulated (i.e., 1/10, 1/4, 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer. There were no differences between traditional and random schedule in terms of perceived effectiveness or deterrence. Policy implications for the implementation and utilization of randomized schedules are discussed. © 2013 Society for Risk Analysis.

  15. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  16. Event Discrimination Using Seismoacoustic Catalog Probabilities

    Science.gov (United States)

    Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.

    2017-12-01

    Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  17. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  18. Probability Judgements in Multi-Stage Problems : Experimental Evidence of Systematic Biases

    NARCIS (Netherlands)

    Gneezy, U.

    1996-01-01

    We report empirical evidence that in problems of random walk with positive drift, bounded rationality leads individuals to under-estimate the probability of success in the long run. In particular, individuals who were given the stage by stage probability distribution failed to aggregate this

  19. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    Science.gov (United States)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters or the size of the identified group; it is different for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation; therefore, it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample, we have assessed the probability of random pairing among several orbital populations of different sizes. We have found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. Also, for the user's convenience, we have obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small value of the probability of coincidental similarity between two orbits.

  20. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... g based on different trials to get an estimate of the experimental error. ... research interests lie in the .... if e is indeed the true value of the proportion of defectives in the.

  1. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  2. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    Science.gov (United States)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted for transforming prestack inversion to the maximum posterior probability. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity using the conjugate gradient method to minimize the objective function. Antinoise and imaging ability of the method were tested using synthetic and real data.

  3. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  4. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  5. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability: (i) the RGG is connected, or (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
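
    A direct simulation of the push protocol on an RGG (a sketch; n and the radius r are arbitrary, with r chosen above the usual connectivity threshold so the graph is connected with high probability):

        import numpy as np

        rng = np.random.default_rng(9)
        n, r = 500, 0.1
        pts = rng.random((n, 2))
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        nbrs = [np.flatnonzero((dist[i] < r) & (np.arange(n) != i)) for i in range(n)]

        informed = np.zeros(n, dtype=bool)
        informed[0] = True
        rounds = 0
        while not informed.all() and rounds < 20 * n:
            pushed = informed.copy()
            for i in np.flatnonzero(informed):       # every informed node pushes once
                if nbrs[i].size:
                    pushed[nbrs[i][rng.integers(nbrs[i].size)]] = True
            informed, rounds = pushed, rounds + 1

        print(rounds, informed.mean())               # broadcast time, fraction reached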

  6. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T.; Samorodnitsky, G.

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin

  7. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  8. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
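
    The sample-reuse idea can be sketched with an assumed lognormal crack-size distribution and a lognormal POD curve (both invented for illustration, not the paper's fracture-mechanics models): the POF estimator weights each failing sample by its non-detection probability, so differentiating that weight with respect to a POD parameter yields the sensitivity from the very same samples.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(10)
        N = 200_000
        a = rng.lognormal(mean=1.0, sigma=0.6, size=N)   # crack sizes (hypothetical)
        a_crit = 8.0                                     # failure if undetected and a > a_crit

        mu, sigma = 1.5, 0.4                             # POD(a) = Phi((ln a - mu)/sigma)
        z = (np.log(a) - mu) / sigma
        fail = a > a_crit
        pof = np.mean(fail * (1.0 - norm.cdf(z)))        # POF from weighted samples

        # sensitivity dPOF/dmu from the same samples: d(1 - POD)/dmu = pdf(z)/sigma
        dpof_dmu = np.mean(fail * norm.pdf(z) / sigma)

        # finite-difference check, reusing the identical samples
        h = 1e-4
        pof_h = np.mean(fail * (1.0 - norm.cdf((np.log(a) - (mu + h)) / sigma)))
        print(dpof_dmu, (pof_h - pof) / h)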

  9. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-mega-watt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits…

  10. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kunter of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kunter. (author)

  11. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
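
    A toy version of a probability table (a sketch: equal-probability bands built from samples, each band represented by its conditional mean; the moment-based tables of the paper additionally preserve higher band moments):

        import numpy as np

        rng = np.random.default_rng(11)

        def probability_table(samples, bands=20):
            # equal-probability bands: weight 1/bands each, conditional mean per band
            s = np.sort(samples)
            values = s.reshape(bands, -1).mean(axis=1)
            weights = np.full(bands, 1.0 / bands)
            return values, weights

        x = rng.lognormal(0.0, 0.5, 100_000)        # uncertain input
        v, w = probability_table(x)

        g = lambda t: 1.0 / (1.0 + t)               # response of interest
        print(np.sum(w * g(v)), g(x).mean())        # table propagation vs direct MC

    The 20-entry table reproduces the direct Monte Carlo estimate of E[g(X)] closely while requiring only 20 evaluations of the response, which is the appeal of propagating uncertainties through tables.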

  12. A Monte Carlo study of adsorption of random copolymers on random surfaces

    CERN Document Server

    Moghaddam, M S

    2003-01-01

    We study the adsorption problem of a random copolymer on a random surface in which a self-avoiding walk in three dimensions interacts with a plane defining a half-space to which the walk is confined. Each vertex of the walk is randomly labelled A with probability p sub p or B with probability 1 - p sub p , and only vertices labelled A are attracted to the surface plane. Each lattice site on the plane is also labelled either A with probability p sub s or B with probability 1 - p sub s , and only lattice sites labelled A interact with the walk. We study two variations of this model: in the first case the A-vertices of the walk interact only with the A-sites on the surface. In the second case the constraint of selective binding is removed; that is, any contact between the walk and the surface that involves an A-labelling, either from the surface or from the walk, is counted as a visit to the surface. The system is quenched in both cases, i.e. the labellings of the walk and of the surface are fixed as thermodynam...

  13. On the Determinants of the Conjunction Fallacy: Probability versus Inductive Confirmation

    Science.gov (United States)

    Tentori, Katya; Crupi, Vincenzo; Russo, Selena

    2013-01-01

    Major recent interpretations of the conjunction fallacy postulate that people assess the probability of a conjunction according to (non-normative) averaging rules as applied to the constituents' probabilities or represent the conjunction fallacy as an effect of random error in the judgment process. In the present contribution, we contrast such…

  14. Fixation probability in a two-locus intersexual selection model.

    Science.gov (United States)

    Durand, Guillermo; Lessard, Sabin

    2016-06-01

    We study a two-locus model of intersexual selection in a finite haploid population reproducing according to a discrete-time Moran model with a trait locus expressed in males and a preference locus expressed in females. We show that the probability of ultimate fixation of a single mutant allele for a male ornament introduced at random at the trait locus given any initial frequency state at the preference locus is increased by weak intersexual selection and recombination, weak or strong. Moreover, this probability exceeds the initial frequency of the mutant allele even in the case of a costly male ornament if intersexual selection is not too weak. On the other hand, the probability of ultimate fixation of a single mutant allele for a female preference towards a male ornament introduced at random at the preference locus is increased by weak intersexual selection and weak recombination if the female preference is not costly, and is strong enough in the case of a costly male ornament. The analysis relies on an extension of the ancestral recombination-selection graph for samples of haplotypes to take into account events of intersexual selection, while the symbolic calculation of the fixation probabilities is made possible in a reasonable time by an optimizing algorithm. Copyright © 2016 Elsevier Inc. All rights reserved.
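
    For intuition about fixation probabilities, a much-reduced single-locus Moran simulation (a sketch only; it has none of the two-locus intersexual-selection structure analyzed in the paper), checked against the standard Moran prediction (1 − 1/r)/(1 − r^−N):

        import numpy as np

        rng = np.random.default_rng(12)

        def fixation_probability(N=30, r=1.1, runs=2_000):
            # Moran model: birth chosen proportionally to fitness, death uniform
            fixed = 0
            for _ in range(runs):
                k = 1                               # a single mutant of relative fitness r
                while 0 < k < N:
                    birth_mut = rng.random() < r * k / (r * k + (N - k))
                    death_mut = rng.random() < k / N
                    k += birth_mut - death_mut
                fixed += k == N
            return fixed / runs

        print(fixation_probability())               # ~0.096 for N=30, r=1.1
        print((1 - 1 / 1.1) / (1 - 1.1 ** -30))     # Moran prediction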

  15. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
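
    A self-contained one-dimensional sketch of the approach (assumed radial-basis-function kernel and upper-confidence-bound acquisition, with a cheap toy log-density standing in for the computationally extensive posterior):

        import numpy as np

        rng = np.random.default_rng(13)

        def log_density(x):
            # cheap stand-in for an expensive-to-evaluate posterior (hypothetical)
            return -0.5 * ((x - 2.0) / 0.3) ** 2 + 0.3 * np.sin(5.0 * x)

        def rbf(a, b, ell=0.3):
            return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

        grid = np.linspace(-1.0, 5.0, 400)
        X = list(rng.uniform(-1.0, 5.0, 3))           # small initial design
        y = [log_density(x) for x in X]

        for _ in range(20):
            Xa, ya = np.array(X), np.array(y)
            K = rbf(Xa, Xa) + 1e-6 * np.eye(len(Xa))
            Ks = rbf(grid, Xa)
            mu = Ks @ np.linalg.solve(K, ya)          # GP posterior mean on the grid
            var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
            ucb = mu + 2.0 * np.sqrt(np.clip(var, 1e-12, None))
            x_next = grid[np.argmax(ucb)]             # acquisition: upper confidence bound
            X.append(x_next)
            y.append(log_density(x_next))

        print(X[int(np.argmax(y))])                   # best maximizer found (near 2)

    Each iteration spends one expensive evaluation at the extreme value of the acquisition function, which is how the method concentrates sampling near a maximum of the distribution.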

  16. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    International Nuclear Information System (INIS)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss-of-load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake, and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure.

  17. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads … propagation stage. The consequences of this behaviour on the fatigue reliability are discussed…

  18. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.

  19. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  20. Predicting redox-sensitive contaminant concentrations in groundwater using random forest classification

    Science.gov (United States)

    Tesoriero, Anthony J.; Gronberg, Jo Ann; Juckem, Paul F.; Miller, Matthew P.; Austin, Brian P.

    2017-08-01

    Machine learning techniques were applied to a large (n > 10,000) compliance monitoring database to predict the occurrence of several redox-active constituents in groundwater across a large watershed. Specifically, random forest classification was used to determine the probabilities of detecting elevated concentrations of nitrate, iron, and arsenic in the Fox, Wolf, Peshtigo, and surrounding watersheds in northeastern Wisconsin. Random forest classification is well suited to describe the nonlinear relationships observed among several explanatory variables and the predicted probabilities of elevated concentrations of nitrate, iron, and arsenic. Maps of the probability of elevated nitrate, iron, and arsenic can be used to assess groundwater vulnerability and the vulnerability of streams to contaminants derived from groundwater. Processes responsible for elevated concentrations are elucidated using partial dependence plots. For example, an increase in the probability of elevated iron and arsenic occurred when well depths coincided with the glacial/bedrock interface, suggesting a bedrock source for these constituents. Furthermore, groundwater in contact with Ordovician bedrock has a higher likelihood of elevated iron concentrations, which supports the hypothesis that groundwater liberates iron from a sulfide-bearing secondary cement horizon of Ordovician age. Application of machine learning techniques to existing compliance monitoring data offers an opportunity to broadly assess aquifer and stream vulnerability at regional and national scales and to better understand geochemical processes responsible for observed conditions.
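
    The prediction step reduces to a few lines with scikit-learn (a sketch on synthetic data; the two predictors and the interface rule are invented stand-ins for the paper's explanatory variables):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(14)
        n = 5_000
        well_depth = rng.uniform(5.0, 150.0, n)       # hypothetical predictors
        bedrock_depth = rng.uniform(10.0, 120.0, n)

        # synthetic rule: elevated iron is more likely when the well ends near the
        # glacial/bedrock interface, mimicking the relationship described above
        p = 1.0 / (1.0 + np.exp(0.15 * np.abs(well_depth - bedrock_depth) - 1.0))
        elevated = rng.random(n) < p

        X = np.column_stack([well_depth, bedrock_depth])
        clf = RandomForestClassifier(n_estimators=200, min_samples_leaf=20,
                                     random_state=0).fit(X, elevated)

        # predicted probabilities of elevated concentrations for new wells
        probs = clf.predict_proba([[60.0, 62.0], [20.0, 110.0]])[:, 1]
        print(probs)   # high near the interface, low far from it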

  1. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared

  2. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  3. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  4. SCRAED - Simple and Complex Random Assignment in Experimental Designs

    OpenAIRE

    Alferes, Valentim R.

    2009-01-01

    SCRAED is a package of 37 self-contained SPSS syntax files that performs simple and complex random assignment in experimental designs. For between-subjects designs, SCRAED includes simple random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities), block random assignment (simple and generalized blocks), and stratified random assignment (no restrictions, forced equal sizes, forced unequal sizes, and unequal probabilities). For within-subject...

  5. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the

  6. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to challenge
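
    A simulation of the standard resolution of the subway puzzle, assuming both lines run every 10 minutes with the uptown train arriving one minute after the downtown train; this schedule offset is the textbook explanation, assumed here rather than stated in the record.

```python
# Simulation of the assumed resolution: both lines run every 10 minutes, but
# the uptown train arrives one minute after the downtown one, so the first
# train to arrive is uptown only within a one-minute window.
import numpy as np

rng = np.random.default_rng(2)
arrival = rng.uniform(0, 10, 100_000)  # Marvin's arrival within a 10-min cycle
# Downtown trains at t = 0, 10, ...; uptown trains at t = 1, 11, ...
uptown_first = arrival < 1.0           # otherwise a downtown train comes first
print(uptown_first.mean())             # ≈ 0.1, i.e. dinner with mother ~2 days in 20
```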

  7. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, S; Streit, R D; Chou, C K

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  8. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    International Nuclear Information System (INIS)

    Lu, S.; Streit, R.D.; Chou, C.K.

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  9. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  10. Linking of uniform random polygons in confined spaces

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Karadayi, E; Saito, M

    2007-01-01

    In this paper, we study the topological entanglement of uniform random polygons in a confined space. We derive the formula for the mean squared linking number of such polygons. For a fixed simple closed curve in the confined space, we rigorously show that the linking probability between this curve and a uniform random polygon of n vertices is at least 1-O(1/√n). Our numerical study also indicates that the linking probability between two uniform random polygons (in a confined space), of m and n vertices respectively, is bounded below by 1-O(1/√(mn)). In particular, the linking probability between two uniform random polygons, both of n vertices, is bounded below by 1-O(1/n)

  11. Quantum Zeno and anti-Zeno effects measured by transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenxian, E-mail: wxzhang@whu.edu.cn [School of Physics and Technology, Wuhan University, Wuhan, Hubei 430072 (China); Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Kavli Institute for Theoretical Physics China, CAS, Beijing 100190 (China); Kofman, A.G. [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States); Zhuang, Jun [Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); You, J.Q. [Beijing Computational Science Research Center, Beijing 10084 (China); Department of Physics, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Nori, Franco [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States)

    2013-10-30

    Using numerical calculations, we compare the transition probabilities of many spins in random magnetic fields, subject to either frequent projective measurements, frequent phase modulations, or a mix of modulations and measurements. For various distribution functions, we find the transition probability under frequent modulations is suppressed most if the pulse delay is short and the evolution time is larger than a critical value. Furthermore, decay freezing occurs only under frequent modulations as the pulse delay approaches zero. In the large pulse-delay region, however, the transition probabilities under frequent modulations are highest among the three control methods.

  12. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (A_L) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of A_L and an exponential rollover for small values of A_L; the rollover (maximum probability) occurs at about A_L = 400 m². This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (N_L) selected for each triggered event iteration was chosen to have an average density of 1 landslide km⁻², i.e. N_L = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road 1 x 4000 cells (5 m x 20 km) in a 'T' formation with another road 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (A_BL) and number of road blockages (N_BL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km² region, the number of road blocks per iteration, N_BL, ranges from 0 to 7. The average blockage area for the 500 iterations (Ā_BL) is about 3000 m
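
    A toy version of this stochastic model is sketched below, with landslides idealised as circles, the road reduced to a single straight line, and inverse-gamma parameters chosen only to put the mode near 400 m²; none of these choices reproduce the authors' exact setup.

```python
# Toy version of the stochastic model: inverse-gamma-distributed landslide
# areas dropped uniformly on a 20 km x 20 km region, tested for intersection
# with a straight road along y = 10 km.  Landslides are idealised as circles
# and the distribution parameters (mode near 400 m^2) are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
region = 20_000.0                      # region side length (m)
n_iter, n_slides = 500, 400            # Monte Carlo iterations, landslides/event
areas = stats.invgamma(a=1.4, scale=400 * 2.4)   # area distribution (m^2)

blocks = []
for _ in range(n_iter):
    a = areas.rvs(n_slides, random_state=rng)
    r = np.sqrt(a / np.pi)             # equivalent circular radius
    y = rng.uniform(0, region, n_slides)
    blocks.append(int((np.abs(y - 10_000.0) < r).sum()))  # circles crossing the road

print("blockages per event: min %d, mean %.2f, max %d"
      % (min(blocks), np.mean(blocks), max(blocks)))
```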

  13. Random walk generated by random permutations of {1, 2, 3, ..., n + 1}

    International Nuclear Information System (INIS)

    Oshanin, G; Voituriez, R

    2004-01-01

    We study properties of a non-Markovian random walk X_l^(n), l = 0, 1, 2, ..., n, evolving in discrete time l on a one-dimensional lattice of integers, whose moves to the right or to the left are prescribed by the rise-and-descent sequences characterizing random permutations π of [n + 1] = {1, 2, 3, ..., n + 1}. We determine exactly the probability of finding the end-point X_n = X_n^(n) of the trajectory of such a permutation-generated random walk (PGRW) at site X, and show that in the limit n → ∞ it converges to a normal distribution with a smaller diffusion coefficient than the conventional Polya random walk. We formulate, as well, an auxiliary stochastic process whose distribution is identical to the distribution of the intermediate points X_l^(n), l < n, which enables us to obtain the probability measure of different excursions and to define the asymptotic distribution of the number of 'turns' of the PGRW trajectories

  14. Probability theory plus noise: Replies to Crupi and Tentori (2016) and to Nilsson, Juslin, and Winman (2016).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-01-01

    A standard assumption in much of current psychology is that people do not reason about probability using the rules of probability theory but instead use various heuristics or "rules of thumb," which can produce systematic reasoning biases. In Costello and Watts (2014), we showed that a number of these biases can be explained by a model where people reason according to probability theory but are subject to random noise. More importantly, that model also predicted agreement with probability theory for certain expressions that cancel the effects of random noise: Experimental results strongly confirmed this prediction, showing that probabilistic reasoning is simultaneously systematically biased and "surprisingly rational." In their commentaries on that paper, both Crupi and Tentori (2016) and Nilsson, Juslin, and Winman (2016) point to various experimental results that, they suggest, our model cannot explain. In this reply, we show that our probability theory plus noise model can in fact explain every one of the results identified by these authors. This gives a degree of additional support to the view that people's probability judgments embody the rational rules of probability theory and that biases in those judgments can be explained as simply effects of random noise. (c) 2015 APA, all rights reserved.
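
    The model lends itself to a short simulation. In the sketch below each judgment is the proportion of N memory samples read with error probability d, so E[judged p] = (1 - 2d)p + d, and the addition-law combination p(A) + p(B) - p(A∧B) - p(A∨B) has expectation zero because the d terms cancel; N and d are illustrative values, not estimates from the paper.

```python
# Sketch of the probability-theory-plus-noise account: a judgment is the
# proportion of N memory samples, each read with error probability d, so
# E[judged p] = (1 - 2d) p + d.  In p(A) + p(B) - p(A and B) - p(A or B)
# the d terms cancel, giving expectation 0.  N and d are illustrative.
import numpy as np

rng = np.random.default_rng(4)
N, d = 100, 0.15
pA, pB, pAB = 0.6, 0.5, 0.35          # true probabilities, consistent with theory
pAorB = pA + pB - pAB                 # addition law

def judge(p, trials=20_000):
    return rng.binomial(N, (1 - 2 * d) * p + d, size=trials) / N

jA, jB, jAB, jAorB = (judge(p) for p in (pA, pB, pAB, pAorB))
print("mean judged P(A):", jA.mean(), "biased prediction:", (1 - 2 * d) * pA + d)
print("mean combined expression:", (jA + jB - jAB - jAorB).mean())  # ≈ 0
```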

  15. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  16. Making Heads or Tails of Probability: An Experiment with Random Generators

    Science.gov (United States)

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  17. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  18. Generation of pseudo-random numbers

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
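
    One of the standard practical methods such reports survey is inverse-transform sampling; the sketch below turns uniform pseudo-random numbers into exponential variates (the rate parameter is arbitrary).

```python
# Inverse-transform sampling: exponential variates from uniform
# pseudo-random numbers.  The rate parameter is arbitrary.
import numpy as np

rng = np.random.default_rng(5)
lam = 2.0
u = rng.random(1_000_000)            # U(0,1) pseudo-random numbers
x = -np.log(1.0 - u) / lam           # F^{-1}(u) for F(x) = 1 - exp(-lam*x)
print(x.mean(), "vs theoretical mean", 1 / lam)
```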

  19. Linking of uniform random polygons in confined spaces

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Karadayi, E.; Saito, M.

    2007-03-01

    In this paper, we study the topological entanglement of uniform random polygons in a confined space. We derive the formula for the mean squared linking number of such polygons. For a fixed simple closed curve in the confined space, we rigorously show that the linking probability between this curve and a uniform random polygon of n vertices is at least 1-O(1/√n). Our numerical study also indicates that the linking probability between two uniform random polygons (in a confined space), of m and n vertices respectively, is bounded below by 1-O(1/√(mn)). In particular, the linking probability between two uniform random polygons, both of n vertices, is bounded below by 1-O(1/n).

  20. Probability that a specific cancer and a specified radiation exposure are causally related

    International Nuclear Information System (INIS)

    Breitenstein, B.D.

    1988-01-01

    It is fundamental that a given cancer case cannot be attributed with absolute certainty to a prior ionizing radiation exposure, whatever the level of exposure. It is possible to estimate the probability of a causal relationship based on data and models that have been inferred from group statistics. Two types of information are needed to make these probability calculations: natural cancer incidence rates and risks of cancer induction from ionizing radiation. Cancer incidence rates for the United States are available in the report of the Surveillance, Epidemiology and End Results (SEER) program of the National Cancer Institute. Estimates of the risk of cancer induction from ionizing radiation have been published by the Advisory Committee on the Biological Effects of Ionizing Radiation (BEIR) of the National Academy of Sciences, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), and the International Commission on Radiological Protection (ICRP). Using the parameters discussed above, the probability of causation formulation estimates the probability that a person who develops a particular cancer after a known quantifiable radiation exposure has the cancer as a result of the exposure. In 1985, the National Institutes of Health, responding to a U.S. Congressional mandate, published radioepidemiologic tables using the probability-of-causation method
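
    In its simplest form, the probability-of-causation calculation divides the radiation-induced excess risk by the total (baseline plus excess) risk; the rates below are invented for illustration and are not values from SEER or the BEIR/UNSCEAR/ICRP reports.

```python
# The probability-of-causation calculation in its simplest form: the excess
# risk attributable to the dose divided by the total (baseline + excess) risk.
# The rates are invented, not SEER or BEIR/UNSCEAR/ICRP values.
baseline_rate = 50.0e-5   # hypothetical baseline cancer incidence
excess_rate = 10.0e-5     # hypothetical radiation-induced excess for the dose

prob_causation = excess_rate / (baseline_rate + excess_rate)
print(f"probability of causation: {prob_causation:.1%}")   # ~16.7%
```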

  1. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  2. Free random variables

    CERN Document Server

    Voiculescu, Dan; Nica, Alexandru

    1992-01-01

    This book presents the first comprehensive introduction to free probability theory, a highly noncommutative probability theory with independence based on free products instead of tensor products. Basic examples of this kind of theory are provided by convolution operators on free groups and by the asymptotic behavior of large Gaussian random matrices. The probabilistic approach to free products has led to a recent surge of new results on the von Neumann algebras of free groups. The book is ideally suited as a textbook for an advanced graduate course and could also provide material for a seminar. In addition to researchers and graduate students in mathematics, this book will be of interest to physicists and others who use random matrices.

  3. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  4. Survival probability in a one-dimensional quantum walk on a trapped lattice

    International Nuclear Information System (INIS)

    Goenuelol, Meltem; Aydiner, Ekrem; Shikano, Yutaka; Muestecaplioglu, Oezguer E

    2011-01-01

    The dynamics of the survival probability of quantum walkers on a one-dimensional lattice with random distribution of absorbing immobile traps is investigated. The survival probability of quantum walkers is compared with that of classical walkers. It is shown that the time dependence of the survival probability of quantum walkers has a piecewise stretched exponential character depending on the density of traps in numerical and analytical observations. The crossover between the quantum analogues of the Rosenstock and Donsker-Varadhan behavior is identified.

  5. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided
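
    A quick Monte Carlo check of the effect analysed here: letting the detection probability fluctuate between measurements (a Lexis scheme) makes binomial counts overdispersed; the beta distribution for p is an illustrative choice.

```python
# Monte Carlo check of overdispersion: if the detection probability
# fluctuates between measurements (Lexis scheme), counts have variance
# larger than the binomial variance.  The beta law for p is illustrative.
import numpy as np

rng = np.random.default_rng(6)
n_trials, n_meas, p_mean = 1000, 100_000, 0.2
binom_var = n_trials * p_mean * (1 - p_mean)

fixed = rng.binomial(n_trials, p_mean, n_meas)   # classical: p constant
p = rng.beta(2.0, 8.0, n_meas)                   # fluctuating p with mean 0.2
fluct = rng.binomial(n_trials, p)                # Lexis-type counts

print("fixed p:       var / binomial var = %.2f" % (fixed.var() / binom_var))
print("fluctuating p: var / binomial var = %.2f" % (fluct.var() / binom_var))
```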

  6. Randomized central limit theorems: A unified theory.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic: the ensemble components are scaled by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  7. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  8. Sensitivity analysis of limit state functions for probability-based plastic design

    Science.gov (United States)

    Frangopol, D. M.

    1984-01-01

    The evaluation of the total probability of a plastic collapse failure P_f for a highly redundant structure of random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds to this probability requires the use of second-moment algebra, which involves many statistical parameters. A computer program which selects the best strategy for minimizing the interval between upper and lower bounds of P_f is now in its final stage of development. The sensitivity of the resulting bounds of P_f to the various uncertainties involved in the computational process is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  10. Pseudo-random number generator based on asymptotic deterministic randomness

    Science.gov (United States)

    Wang, Kai; Pei, Wenjiang; Xia, Haishan; Cheung, Yiu-ming

    2008-06-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks.

  11. Pseudo-random number generator based on asymptotic deterministic randomness

    International Nuclear Information System (INIS)

    Wang Kai; Pei Wenjiang; Xia Haishan; Cheung Yiuming

    2008-01-01

    A novel approach to generate the pseudorandom-bit sequence from the asymptotic deterministic randomness system is proposed in this Letter. We study the characteristic of multi-value correspondence of the asymptotic deterministic randomness constructed by the piecewise linear map and the noninvertible nonlinearity transform, and then give the discretized systems in the finite digitized state space. The statistic characteristics of the asymptotic deterministic randomness are investigated numerically, such as stationary probability density function and random-like behavior. Furthermore, we analyze the dynamics of the symbolic sequence. Both theoretical and experimental results show that the symbolic sequence of the asymptotic deterministic randomness possesses very good cryptographic properties, which improve the security of chaos based PRBGs and increase the resistance against entropy attacks and symbolic dynamics attacks

  12. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  13. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  14. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = {u(t,x)} that solves a non-linear system of stochastic wave equations in spatial dimensions k ∈ {1, 2, 3}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent β. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of R^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap

  15. Multipartite nonlocality and random measurements

    Science.gov (United States)

    de Rosier, Anna; Gruca, Jacek; Parisio, Fernando; Vértesi, Tamás; Laskowski, Wiesław

    2017-07-01

    We present an exhaustive numerical analysis of violations of local realism by families of multipartite quantum states. As an indicator of nonclassicality we employ the probability of violation for randomly sampled observables. Surprisingly, it rapidly increases with the number of parties or settings and even for relatively small values local realism is violated for almost all observables. We have observed this effect to be typical in the sense that it emerged for all investigated states including some with randomly drawn coefficients. We also present the probability of violation as a witness of genuine multipartite entanglement.

  16. A procedure for estimation of pipe break probabilities due to IGSCC

    International Nuclear Information System (INIS)

    Bergman, M.; Brickstad, B.; Nilsson, F.

    1998-06-01

    A procedure has been developed for estimation of the failure probability of weld joints in nuclear piping susceptible to intergranular stress corrosion cracking. The procedure aims at a robust and rapid estimate of the failure probability for a specific weld with a known stress state. The random properties of the crack initiation rate, the initial crack length, the in-service inspection efficiency and the leak rate are taken into account. A computer realization of the procedure has been developed for user-friendly application by design engineers. Some examples are considered to investigate the sensitivity of the failure probability to different input quantities. (au)

  17. Sexual behaviors, relationships, and perceived health status among adult women in the United States: results from a national probability sample.

    Science.gov (United States)

    Herbenick, Debby; Reece, Michael; Schick, Vanessa; Sanders, Stephanie A; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    Past surveys of sexual behavior have demonstrated that female sexual behavior is influenced by medical and sociocultural changes. To be most attentive to women and their sexual lives, it is important to have an understanding of the continually evolving sexual behaviors of contemporary women in the United States. The purpose of this study, the National Survey of Sexual Health and Behavior (NSSHB), was to, in a national probability survey of women ages 18-92, assess the proportion of women in various age cohorts who had engaged in solo and partnered sexual activities in the past 90 days and to explore associations with participants' sexual behavior and their relationship and perceived health status. Past year frequencies of masturbation, vaginal intercourse, and anal intercourse were also assessed. A national probability sample of 2,523 women ages 18 to 92 completed a cross-sectional internet based survey about their sexual behavior. Relationship status; perceived health status; experience of solo masturbation, partnered masturbation, giving oral sex, receiving oral sex, vaginal intercourse, anal intercourse, in the past 90 days; frequency of solo masturbation, vaginal intercourse, and anal intercourse in the past year. Recent solo masturbation, oral sex, and vaginal intercourse were prevalent among women, decreased with age, and varied in their associations with relationship and perceived health status. Recent anal sex and same-sex oral sex were uncommonly reported. Solo masturbation was most frequent among women ages 18 to 39, vaginal intercourse was most frequent among women ages 18 to 29 and anal sex was infrequently reported. Contemporary women in the United States engage in a diverse range of solo and partnered sexual activities, though sexual behavior is less common and more infrequent among older age cohorts. © 2010 International Society for Sexual Medicine.

  18. Probability of inadvertent operation of electrical components in harsh environments

    International Nuclear Information System (INIS)

    Knoll, A.

    1989-01-01

    Harsh environment, which means humidity and high temperature, may and will affect unsealed electrical components by causing leakage ground currents in ungrounded direct current systems. The concern in a nuclear power plant is that such harsh environment conditions could cause inadvertent operation of normally deenergized components, which may have a safety-related isolation function. Harsh environment is a common cause failure, and one way to approach the problem is to assume that all the unsealed electrical components will simultaneously and inadvertently energize as a result of the environmental common cause failure. This assumption is unrealistically conservative. Test results indicated that the insulation resistance of any terminal block in harsh environments has a random distribution in the range of 1 to 270 kΩ, with a mean value of ∼59 kΩ. The objective of this paper is to evaluate a realistic conditional failure probability for inadvertent operation of electrical components in harsh environments. This value will be used thereafter in probabilistic safety evaluations of harsh environment events and will replace both the overconservative common cause probability of 1 and the random failure probability used for mild environments.

  19. On the pertinence to Physics of random walks induced by random dynamical systems: a survey

    International Nuclear Information System (INIS)

    Petritis, Dimitri

    2016-01-01

    Let X be an abstract space and A a denumerable (finite or infinite) alphabet. Suppose that (p_a)_{a∈A} is a family of functions p_a: X → [0, 1] such that for all x ∈ X we have Σ_a p_a(x) = 1, and (S_a)_{a∈A} a family of transformations S_a: X → X. The pair ((S_a)_a, (p_a)_a) is termed an iterated function system with place-dependent probabilities. Such systems can be thought of as generalisations of random dynamical systems. As a matter of fact, suppose we start from a given x; we then pick randomly, with probability p_a(x), the transformation S_a and evolve to S_a(x). We are interested in the behaviour of the system when the iteration continues indefinitely. Random walks of the above type are omnipresent in both classical and quantum Physics. To give a small sample of occurrences we mention: random walks on the affine group, random walks on Penrose lattices, random walks on partially directed lattices, evolution of density matrices induced by repeated quantum measurements, quantum channels, quantum random walks, etc. In this article, we review some basic properties of such systems and provide a pathfinder in the extensive bibliography (both on mathematical and physical sides) where the main results have been originally published. (paper)
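
    A minimal concrete instance of such an iterated function system with place-dependent probabilities, with two affine contractions on [0, 1] and an invented p_0(x).

```python
# Minimal iterated function system with place-dependent probabilities: two
# affine contractions on [0, 1]; the maps and p_0(x) are invented examples.
import numpy as np

rng = np.random.default_rng(7)
S = [lambda x: 0.5 * x,           # S_0: contract toward 0
     lambda x: 0.5 * x + 0.5]     # S_1: contract toward 1

def p0(x):
    """Probability of picking S_0 at position x (p_1 = 1 - p_0)."""
    return 0.2 + 0.6 * x

x, orbit = 0.3, []
for _ in range(100_000):
    a = 0 if rng.random() < p0(x) else 1   # pick a map with place-dependent prob.
    x = S[a](x)
    orbit.append(x)

print("long-run mean position:", np.mean(orbit[1000:]))
```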

  20. k-Means: Random Sampling Procedure

    Indian Academy of Sciences (India)

    k-Means: Random Sampling Procedure. Optimal 1-Mean: approximation by the centroid of a random sample (Inaba et al). S = random sample of size O(1/ε); the centroid of S is a (1+ε)-approx centroid of P with constant probability.
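
    The sampling result restated in code, with constants omitted from the O(1/ε) bound: the centroid of a small uniform random sample approximates the optimal 1-mean (the centroid) of the full point set.

```python
# Centroid of a small uniform sample approximates the optimal 1-mean of P
# (constants omitted from the O(1/eps) sample-size bound).
import numpy as np

rng = np.random.default_rng(8)
P = rng.normal(size=(100_000, 2)) * [3.0, 1.0] + [10.0, -5.0]

eps = 0.01
S = P[rng.choice(len(P), size=int(1 / eps), replace=False)]

cost = lambda c: ((P - c) ** 2).sum()               # 1-mean cost of centre c
print(cost(S.mean(axis=0)) / cost(P.mean(axis=0)))  # ≈ 1 + eps with constant prob.
```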

  1. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  2. Determination of bounds on failure probability in the presence of ...

    Indian Academy of Sciences (India)

    In particular, fuzzy set theory provides a more rational framework for ... indicating that the random variations in T and O2 do not affect failure probability significantly. ... The upper bound for PF shown in figure 6 can be used in decision-making.

  3. Sexual diversity in the United States: Results from a nationally representative probability sample of adult women and men.

    Directory of Open Access Journals (Sweden)

    Debby Herbenick

    In 2015, we conducted a cross-sectional, Internet-based, U.S. nationally representative probability survey of 2,021 adults (975 men, 1,046 women) focused on a broad range of sexual behaviors. Individuals invited to participate were from the GfK KnowledgePanel®. The survey was titled the 2015 Sexual Exploration in America Study and survey completion took about 12 to 15 minutes. The survey was confidential and the researchers never had access to respondents' identifiers. Respondents reported on demographic items, lifetime and recent sexual behaviors, and the appeal of 50+ sexual behaviors. Most (>80%) reported lifetime masturbation, vaginal sex, and oral sex. Lifetime anal sex was reported by 43% of men (insertive) and 37% of women (receptive). Common lifetime sexual behaviors included wearing sexy lingerie/underwear (75% women, 26% men), sending/receiving digital nude/semi-nude photos (54% women, 65% men), reading erotic stories (57% of participants), public sex (≥43%), role-playing (≥22%), tying/being tied up (≥20%), spanking (≥30%), and watching sexually explicit videos/DVDs (60% women, 82% men). Having engaged in threesomes (10% women, 18% men) and playful whipping (≥13%) were less common. Lifetime group sex, sex parties, taking a sexuality class/workshop, and going to BDSM parties were uncommon (each <8%). More Americans identified behaviors as "appealing" than had engaged in them. Romantic/affectionate behaviors were among those most commonly identified as appealing for both men and women. The appeal of particular behaviors was associated with greater odds that the individual had ever engaged in the behavior. This study contributes to our understanding of more diverse adult sexual behaviors than has previously been captured in U.S. nationally representative probability surveys. Implications for sexuality educators, clinicians, and individuals in the general population are discussed.

  4. Sexual diversity in the United States: Results from a nationally representative probability sample of adult women and men

    Science.gov (United States)

    Herbenick, Debby; Bowling, Jessamyn; Fu, Tsung-Chieh (Jane); Guerra-Reyes, Lucia; Sanders, Stephanie

    2017-01-01

    In 2015, we conducted a cross-sectional, Internet-based, U.S. nationally representative probability survey of 2,021 adults (975 men, 1,046 women) focused on a broad range of sexual behaviors. Individuals invited to participate were from the GfK KnowledgePanel®. The survey was titled the 2015 Sexual Exploration in America Study and survey completion took about 12 to 15 minutes. The survey was confidential and the researchers never had access to respondents’ identifiers. Respondents reported on demographic items, lifetime and recent sexual behaviors, and the appeal of 50+ sexual behaviors. Most (>80%) reported lifetime masturbation, vaginal sex, and oral sex. Lifetime anal sex was reported by 43% of men (insertive) and 37% of women (receptive). Common lifetime sexual behaviors included wearing sexy lingerie/underwear (75% women, 26% men), sending/receiving digital nude/semi-nude photos (54% women, 65% men), reading erotic stories (57% of participants), public sex (≥43%), role-playing (≥22%), tying/being tied up (≥20%), spanking (≥30%), and watching sexually explicit videos/DVDs (60% women, 82% men). Having engaged in threesomes (10% women, 18% men) and playful whipping (≥13%) were less common. Lifetime group sex, sex parties, taking a sexuality class/workshop, and going to BDSM parties were uncommon (each <8%). More Americans identified behaviors as “appealing” than had engaged in them. Romantic/affectionate behaviors were among those most commonly identified as appealing for both men and women. The appeal of particular behaviors was associated with greater odds that the individual had ever engaged in the behavior. This study contributes to our understanding of more diverse adult sexual behaviors than has previously been captured in U.S. nationally representative probability surveys. Implications for sexuality educators, clinicians, and individuals in the general population are discussed. PMID:28727762

  5. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  6. On the Distribution of Random Geometric Graphs

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Coon, Justin P.

    2018-01-01

    Random geometric graphs (RGGs) are commonly used to model networked systems that depend on the underlying spatial embedding. We concern ourselves with the probability distribution of an RGG, which is crucial for studying its random topology, properties (e.g., connectedness), or Shannon entropy as a measure of the graph's topological uncertainty (or information content). Moreover, the distribution is also relevant for determining average network performance or designing protocols. However, a major impediment in deducing the graph distribution is that it requires the joint probability distribution of the n(n − 1)/2 distances between n nodes randomly distributed in a bounded domain. As no such result exists in the literature, we make progress by obtaining the joint distribution of the distances between three nodes confined in a disk in R². This enables the calculation of the probability distribution...

  7. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combination of the input distributions is defined by the user by introducing the appropriate FORTRAN statements into the appropriate subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined.
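
    The STADIC idea is easily sketched outside FORTRAN: sample each input distribution, push the samples through a user-supplied combination function, and summarise the result; the three input distributions and the function below are illustrative.

```python
# STADIC-style Monte Carlo combination of distributions: sample each input,
# apply a user-supplied combination function, summarise the result.  The
# three inputs and the function are illustrative.
import numpy as np

rng = np.random.default_rng(9)
n = 100_000

def combine(a, b, c):
    """User-supplied combination function (an arbitrary example)."""
    return a * b + c

samples = combine(rng.lognormal(0.0, 0.5, n),   # input distribution 1
                  rng.uniform(0.8, 1.2, n),     # input distribution 2
                  rng.normal(0.0, 0.1, n))      # input distribution 3

lo, hi = np.percentile(samples, [5, 95])
print(f"mean={samples.mean():.3f} sd={samples.std():.3f} 90% limits=({lo:.3f}, {hi:.3f})")
```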

  8. Left passage probability of Schramm-Loewner Evolution

    Science.gov (United States)

    Najafi, M. N.

    2013-06-01

    SLE(κ,ρ⃗) is a variant of Schramm-Loewner Evolution (SLE) which describes curves that are not conformally invariant, but are self-similar due to the presence of some other preferred points on the boundary. In this paper we study the left passage probability (LPP) of SLE(κ,ρ⃗) within a field-theoretical framework and find the differential equation governing this probability. This equation is numerically solved for the special case κ = 2 and h_ρ = 0, in which h_ρ is the conformal weight of the boundary condition changing (bcc) operator. This case corresponds to the loop-erased random walk (LERW) and the Abelian sandpile model (ASM) with a sink on its boundary. For the curve which starts from ξ₀ and is conditioned by a change of boundary conditions at x₀, we find that this probability depends significantly on the factor x₀ - ξ₀. We also present the perturbative general solution for large x₀. As a prototype, we apply this formalism to SLE(κ, κ-6), which governs the curves that start from and end on the real axis.

  9. Simulation study on characteristics of long-range interaction in randomly asymmetric exclusion process

    Science.gov (United States)

    Zhao, Shi-Bo; Liu, Ming-Zhe; Yang, Lan-Ying

    2015-04-01

    In this paper we investigate the dynamics of an asymmetric exclusion process on a one-dimensional lattice with long-range hopping and random update by means of Monte Carlo simulations. Particles in the model first try to hop over successive unoccupied sites with a probability q, which differs from previous exclusion-process models. The probability q may represent the random access of particles. Numerical simulations for stationary particle currents, density profiles, and phase diagrams are obtained. There are three possible stationary phases: the low density (LD) phase, high density (HD) phase, and maximal current (MC) phase, respectively. Interestingly, the bulk density in the LD phase tends to zero, while the MC phase is governed by α, β, and q. The HD phase is nearly the same as in the normal TASEP, determined by the exit rate β. Theoretical analysis is in good agreement with simulation results. The proposed model may provide a better understanding of random interaction dynamics in complex systems. Project supported by the National Natural Science Foundation of China (Grant Nos. 41274109 and 11104022), the Fund for Sichuan Youth Science and Technology Innovation Research Team (Grant No. 2011JTD0013), and the Creative Team Program of Chengdu University of Technology.
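
    A toy random-update TASEP with such long-range hops is sketched below; the boundary handling and parameter values are simplified assumptions, not the paper's exact update scheme.

```python
# Toy random-update TASEP with long-range hops: a chosen particle advances to
# the next empty site and keeps taking further empty sites, each with
# probability q.  Boundary handling and parameters are simplified assumptions.
import numpy as np

rng = np.random.default_rng(10)
L, alpha, beta, q = 200, 0.6, 0.6, 0.5
lattice = np.zeros(L, dtype=int)

for _ in range(1_000_000):
    i = rng.integers(-1, L)                    # -1 selects the entrance
    if i == -1:
        if lattice[0] == 0 and rng.random() < alpha:
            lattice[0] = 1                     # particle enters with rate alpha
    elif lattice[i] == 1:
        if i == L - 1:
            if rng.random() < beta:
                lattice[i] = 0                 # particle exits with rate beta
        elif lattice[i + 1] == 0:
            j = i + 1                          # hop to the next empty site...
            while j + 1 < L and lattice[j + 1] == 0 and rng.random() < q:
                j += 1                         # ...and over further empty sites
            lattice[i], lattice[j] = 0, 1

print("bulk density:", lattice[L // 4: 3 * L // 4].mean())
```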

  10. Global Grid of Probabilities of Urban Expansion to 2030

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Grid of Probabilities of Urban Expansion to 2030 presents spatially explicit probabilistic forecasts of global urban land cover change from 2000 to 2030...

  11. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  12. Exclusion probabilities and likelihood ratios with applications to mixtures.

    Science.gov (United States)

    Slooten, Klaas-Jan; Egeland, Thore

    2016-01-01

    The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model.
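
    The textbook RMNE computation (no dropout or drop-in, Hardy-Weinberg assumed): a random person is not excluded at a locus when both of their alleles fall among the mixture's alleles, so the per-locus probability is the squared sum of those allele frequencies, and loci multiply; the loci and frequencies below are invented.

```python
# Textbook RMNE (no dropout/drop-in, Hardy-Weinberg assumed): a random person
# is not excluded at a locus when both alleles are among the mixture's
# alleles, so P = (sum of those allele frequencies)^2; loci multiply.
# The loci and frequencies below are invented.
from math import prod

mixture = [
    {"12": 0.18, "14": 0.22, "15": 0.10},   # locus 1: observed alleles and freqs
    {"8": 0.30, "9.3": 0.25},               # locus 2
    {"16": 0.05, "17": 0.15, "18": 0.20},   # locus 3
]

rmne = prod(sum(freqs.values()) ** 2 for freqs in mixture)
print(f"P(random man not excluded) = {rmne:.4f}")
```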

  13. Scaling Argument of Anisotropic Random Walk

    International Nuclear Information System (INIS)

    Xu Bingzhen; Jin Guojun; Wang Feifeng

    2005-01-01

    In this paper, we analytically discuss the scaling properties of the average square end-to-end distance ⟨R²⟩ for anisotropic random walk in D-dimensional space (D ≥ 2), and the returning probability P_n(r₀) for the walker into a certain neighborhood of the origin. We will not only give the calculating formula for ⟨R²⟩ and P_n(r₀), but also point out that if there is a symmetric axis for the distribution of the probability density of a single step displacement, we always obtain ⟨R_⊥,n²⟩ ∼ n, where ⊥ refers to the projections of the displacement perpendicular to each symmetric axis of the walk; in D-dimensional space with D symmetric axes perpendicular to each other, we always have ⟨R_n²⟩ ∼ n and the random walk will be like a purely random motion; if the number of mutually perpendicular symmetric axes is smaller than the dimension of the space, we must have ⟨R_n²⟩ ∼ n² for very large n and the walk will be like a ballistic motion. It is worthwhile to point out that unlike the isotropic random walk in one and two dimensions, which is certain to return to the neighborhood of the origin, generally there is only a nonzero probability for the anisotropic random walker in two dimensions to return to the neighborhood.

  14. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  15. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  16. Physical Activity Improves Verbal and Spatial Memory in Older Adults with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Lindsay S. Nagamatsu

    2013-01-01

    We report secondary findings from a randomized controlled trial on the effects of exercise on memory in older adults with probable MCI. We randomized 86 women aged 70-80 years with subjective memory complaints into one of three groups: resistance training, aerobic training, or balance and tone (control). All participants exercised twice per week for six months. We measured verbal memory and learning using the Rey Auditory Verbal Learning Test (RAVLT) and spatial memory using a computerized test, before and after trial completion. We found that the aerobic training group remembered significantly more items in the loss-after-interference condition of the RAVLT compared with the control group after six months of training. In addition, both experimental groups showed improved spatial memory performance in the most difficult condition, where they were required to memorize the spatial locations of three items, compared with the control group. Lastly, we found a significant correlation between spatial memory performance and overall physical capacity after intervention in the aerobic training group. Taken together, our results provide support for the prevailing notion that exercise can positively impact cognitive functioning and may represent an effective strategy to improve memory in those who have begun to experience cognitive decline.

  17. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set", and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  18. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as Twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning...... algorithm is to keep up with a continuous stream of tweets using a small amount of time and memory. Our contribution is a number of randomized approximation algorithms, categorized according to the available space (superlinear, linear, and sublinear in the number of nodes n) and according to different models...

  19. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  20. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  1. Dynamic Output Feedback Control for Nonlinear Networked Control Systems with Random Packet Dropout and Random Delay

    Directory of Open Access Journals (Sweden)

    Shuiqing Yu

    2013-01-01

    This paper investigates the dynamic output feedback control for nonlinear networked control systems with both random packet dropout and random delay. Random packet dropout and random delay are modeled as two independent random variables. An observer-based dynamic output feedback controller is designed based on Lyapunov theory. The quantitative relationship among the dropout rate, transition probability matrix, and nonlinearity level is derived by solving a set of linear matrix inequalities. Finally, an example is presented to illustrate the effectiveness of the proposed method.

  2. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge, and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects...... to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting...... are thoroughly discussed in the case of rectangular representation of uncertainty by the uniform probability distribution and the interval, respectively. Also triangular representations are dealt with and compared. Calculation of monotonic as well as non-monotonic functions of variables represented...

  3. Data envelopment analysis of randomized ranks

    Directory of Open Access Journals (Sweden)

    Sant'Anna Annibal P.

    2002-01-01

    Probabilities and odds, derived from vectors of ranks, are here compared as measures of the efficiency of decision-making units (DMUs). These measures are computed with the goal of providing preliminary information before starting a Data Envelopment Analysis (DEA) or applying any other evaluation or preference-composition methodology. Preference, quality, and productivity evaluations are usually measured with errors or subject to the influence of other random disturbances. Reducing evaluations to ranks and treating the ranks as estimates of location parameters of random variables, we are able to compute the probability of each DMU being classified as the best according to the consumption of each input and the production of each output. Employing the probabilities of being the best as efficiency measures, we stretch distances between the most efficient units. We combine these partial probabilities into a global efficiency score determined in terms of proximity to the efficiency frontier.

  4. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
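
    The core idea of these tests lends itself to a small simulation sketch (a toy illustration of the principle that low-density draws are evidence against the null, with an assumed standard normal null density and a minimum-density statistic; the paper's actual test statistics differ):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        f = stats.norm(0, 1)          # specified null density

        def density_test_pvalue(draws, n_sim=10000):
            """P-value for T = min_i f(x_i): a very small observed density
            is unlikely if the draws really come from f."""
            t_obs = f.pdf(draws).min()
            n = len(draws)
            sims = f.pdf(f.rvs((n_sim, n), random_state=rng)).min(axis=1)
            return np.mean(sims <= t_obs)

        print(density_test_pvalue(f.rvs(100, random_state=rng)))                 # large p
        print(density_test_pvalue(stats.norm(0, 3).rvs(100, random_state=rng)))  # tiny p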

  5. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  6. Alzheimer random walk

    Science.gov (United States)

    Odagaki, Takashi; Kasuya, Keisuke

    2017-09-01

    Using Monte Carlo simulation, we investigate a memory-impaired self-avoiding walk on a square lattice in which a random walker marks each visited site with a given probability p and makes a random walk avoiding the marked sites. Namely, p = 0 and p = 1 correspond to the simple random walk and the self-avoiding walk, respectively. When p > 0, there is a finite probability that the walker is trapped. We show that the trap time distribution can well be fitted by Stacy's Weibull distribution b(a/b)^((a+1)/b) [Γ((a+1)/b)]⁻¹ x^a exp(−(a/b)x^b), where a and b are fitting parameters depending on p. We also find that the mean trap time diverges at p = 0 as p^(−α) with α = 1.89. In order to produce a sufficient number of long walks, we exploit the pivot algorithm and obtain the mean square displacement and its Flory exponent ν(p) as functions of p. We find that the exponent determined for 1000-step walks interpolates both limits, ν(0) for the simple random walk and ν(1) for the self-avoiding walk, as [ν(p) − ν(0)] / [ν(1) − ν(0)] = p^β with β = 0.388 when p ≪ 0.1 and β = 0.0822 when p ≫ 0.1. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
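
    A small simulation sketch of the walk described above (marking rule and trapping criterion as stated in the abstract; the step budget and the choice p = 0.5 are illustrative):

        import numpy as np

        rng = np.random.default_rng(2)
        MOVES = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

        def trap_time(p, max_steps=100000):
            """Steps until the walker is trapped (no unmarked neighbour left).
            Each visited site is marked with probability p; marked sites are
            avoided. p=0 is the simple random walk, p=1 the self-avoiding walk."""
            pos = (0, 0)
            marked = set()
            for t in range(max_steps):
                if rng.random() < p:
                    marked.add(pos)
                options = [m for m in map(tuple, MOVES + pos) if m not in marked]
                if not options:
                    return t                    # walker is trapped
                pos = options[rng.integers(len(options))]
            return max_steps                    # not trapped within the budget

        times = [trap_time(0.5) for _ in range(200)]
        print(np.mean(times), np.median(times))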

  7. WIENER-HOPF SOLVER WITH SMOOTH PROBABILITY DISTRIBUTIONS OF ITS COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mr. Vladimir A. Smagin

    2016-12-01

    The Wiener-Hopf solver with smooth probability distributions of its components is presented. The method is based on hyper-delta approximations of the initial distributions. The use of the Fourier series transformation and the characteristic function allows working with random variables concentrated on the transversal axis of abscissas.

  8. Probability distribution of long-run indiscriminate felling of trees in ...

    African Journals Online (AJOL)

    The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...

  9. Critical behavior in inhomogeneous random graphs

    NARCIS (Netherlands)

    Hofstad, van der R.W.

    2013-01-01

    We study the critical behavior of inhomogeneous random graphs in the so-called rank-1 case, where edges are present independently but with unequal edge occupation probabilities. The edge occupation probabilities are moderated by vertex weights, and are such that the degree of vertex i is close in

  10. Sexual behaviors, relationships, and perceived health among adult men in the United States: results from a national probability sample.

    Science.gov (United States)

    Reece, Michael; Herbenick, Debby; Schick, Vanessa; Sanders, Stephanie A; Dodge, Brian; Fortenberry, J Dennis

    2010-10-01

    To provide a foundation for those who deliver sexual health services and programs to men in the United States, population-based data describing men's sexual behaviors and their correlates are needed. The purpose of this study was to assess, in a national probability survey of men aged 18-94 years, the occurrence and frequency of sexual behaviors and their associations with relationship status and health status. A national probability sample of 2,522 men aged 18 to 94 completed a cross-sectional survey about their sexual behaviors, relationship status, and health. Measures included relationship status; health status; experience of solo masturbation, partnered masturbation, giving oral sex, receiving oral sex, vaginal intercourse and anal intercourse in the past 90 days; and frequency of solo masturbation, vaginal intercourse and anal intercourse in the past year. Masturbation, oral intercourse, and vaginal intercourse are prevalent among men throughout most of their adult life, with both occurrence and frequency varying with age and as functions of relationship type and physical health status. Masturbation is prevalent and frequent across various stages of life and for both those with and without a relational partner, with fewer men in fair to poor health reporting recent masturbation. Patterns of giving oral sex to a female partner were similar to those for receiving oral sex. Vaginal intercourse in the past 90 days was more prevalent among men in their late 20s and 30s than in the other age groups, although it was reported by approximately 50% of men in the sixth and seventh decades of life. Anal intercourse and sexual interactions with other men were less common than all other sexual behaviors. Contemporary men in the United States engage in diverse solo and partnered sexual activities; however, sexual behavior is less common and more infrequent among older age cohorts. © 2010 International Society for Sexual Medicine.

  11. Attitudes toward Bisexual Men and Women among a Nationally Representative Probability Sample of Adults in the United States.

    Science.gov (United States)

    Dodge, Brian; Herbenick, Debby; Friedman, M Reuel; Schick, Vanessa; Fu, Tsung-Chieh Jane; Bostwick, Wendy; Bartelt, Elizabeth; Muñoz-Laboy, Miguel; Pletta, David; Reece, Michael; Sandfort, Theo G M

    2016-01-01

    As bisexual individuals in the United States (U.S.) face significant health disparities, researchers have posited that these differences may be fueled, at least in part, by negative attitudes, prejudice, stigma, and discrimination toward bisexual individuals from heterosexual and gay/lesbian individuals. Previous studies of individual and social attitudes toward bisexual men and women have been conducted almost exclusively with convenience samples, with limited generalizability to the broader U.S. population. Our study provides an assessment of attitudes toward bisexual men and women among a nationally representative probability sample of heterosexual, gay, lesbian, and other-identified adults in the U.S. Data were collected from the 2015 National Survey of Sexual Health and Behavior (NSSHB), via an online questionnaire with a probability sample of adults (18 years and over) from throughout the U.S. We included two modified 5-item versions of the Bisexualities: Indiana Attitudes Scale (BIAS), validated sub-scales that were developed to measure attitudes toward bisexual men and women. Data were analyzed using descriptive statistics, gamma regression, and paired t-tests. Gender, sexual identity, age, race/ethnicity, income, and educational attainment were all significantly associated with participants' attitudes toward bisexual individuals. In terms of responses to individual scale items, participants were most likely to "neither agree nor disagree" with all attitudinal statements. Across sexual identities, self-identified other participants reported the most positive attitudes, while heterosexual male participants reported the least positive attitudes. As in previous research on convenience samples, we found a wide range of demographic characteristics were related with attitudes toward bisexual individuals in our nationally-representative study of heterosexual, gay/lesbian, and other-identified adults in the U.S. In particular, gender emerged as a significant characteristic

  12. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to the problem of simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, where the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms for the optimum receiver for signals corrupted by both additive and multiplicative noise.

  13. On reflexivity of random walks in a random environment on a metric space

    International Nuclear Information System (INIS)

    Rozikov, U.A.

    2002-11-01

    In this paper, we consider random walks in random environments on a countable metric space when the jumps of the walks are finite. The transition probabilities of the random walk from x ∈ G (where G is the metric space under consideration) are defined by a vector p(x) ∈ R^k, k > 1, where {p(x), x ∈ G} is a set of independent and identically distributed random vectors. For the random walk, a sufficient condition of nonreflexivity is obtained. Examples for the metric spaces Z^d, free groups, free products of a finite number of cyclic groups of second order, and some other metric spaces are considered. (author)

  14. On the probability distribution of the stochastic saturation scale in QCD

    International Nuclear Information System (INIS)

    Marquet, C.; Soyez, G.; Xiao Bowen

    2006-01-01

    It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify and our results are confirmed by numerical simulations

  15. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  16. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p), 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
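
    The sample size quoted above can be reproduced with a standard formula (assuming Cochran's formula for a proportion with finite-population correction, which is consistent with the numbers reported; the abstract itself does not name the formula):

        import math

        def sample_size(N, d=0.05, p=0.5, z=1.96):
            """Simple-random-sampling size for estimating a proportion p
            with sampling error d at 95% confidence, corrected for a
            finite population of N units."""
            n0 = z**2 * p * (1 - p) / d**2      # infinite-population size
            return math.ceil(n0 / (1 + (n0 - 1) / N))

        print(sample_size(1179))                # -> 290 families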

  17. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...

  18. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)

  19. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters

  20. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
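
    A sketch of the two-moment fit in the simplest case (a two-phase hypoexponential, which covers 0.5 ≤ CV² < 1; more phases, as in the paper's general multi-exponential scheme, are needed to reach smaller coefficients of variation, and this construction is an assumption consistent with the abstract rather than its exact formulas):

        import numpy as np

        rng = np.random.default_rng(3)

        def hypoexp_rates(mean, cv):
            """Rates of a two-phase hypoexponential matching the first two
            moments; the phase means x, y satisfy x + y = mean and
            x*y = (mean**2 - variance) / 2, i.e. they are quadratic roots."""
            var = (cv * mean) ** 2
            prod = (mean**2 - var) / 2.0
            disc = mean**2 - 4.0 * prod
            if disc < 0:
                raise ValueError("two phases require cv**2 >= 0.5")
            x = (mean + disc**0.5) / 2.0
            y = (mean - disc**0.5) / 2.0
            return 1.0 / x, 1.0 / y

        l1, l2 = hypoexp_rates(mean=1.0, cv=0.8)
        s = rng.exponential(1.0 / l1, 200000) + rng.exponential(1.0 / l2, 200000)
        print(s.mean(), s.std() / s.mean())     # ~1.0 and ~0.8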

  1. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^PT|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^PT), was applied with k = 0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k = 1, 2, …) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α = 1/2), (standard) two-qubit (α = 1), and two-quater[nionic]bit (α = 2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.

  2. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  3. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Free word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  4. The probability that a pair of group elements is autoconjugate

    Indian Academy of Sciences (India)

    Let g and h be arbitrary elements of a given finite group G. Then g and h are said to be autoconjugate if there exists some automorphism α of G such that h = g^α. In this article, we construct some sharp bounds for the probability that two random elements of G are autoconjugate, denoted by P_a(G). It is also shown that P ...

  5. Predicting non-square 2D dice probabilities

    Science.gov (United States)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by [(√(k² + l²) − k)/(√(k² + l²) − l)] · [arctan(l/k)/arctan(k/l)], where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory and the experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or ‘grippy’ surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
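
    The quoted ratio is easy to evaluate numerically (a direct transcription of the formula above; which face's probability sits in the numerator follows the paper's convention, which this abstract does not restate):

        import math

        def side_probability_ratio(k, l):
            """(sqrt(k^2+l^2) - k)/(sqrt(k^2+l^2) - l) * arctan(l/k)/arctan(k/l)."""
            h = math.hypot(k, l)
            return (h - k) / (h - l) * math.atan2(l, k) / math.atan2(k, l)

        print(side_probability_ratio(1.0, 1.0))   # square cross-section -> 1.0
        print(side_probability_ratio(1.0, 2.0))   # elongated die: strongly biased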

  6. Efficient search by optimized intermittent random walks

    International Nuclear Information System (INIS)

    Oshanin, Gleb; Lindenberg, Katja; Wio, Horacio S; Burlatsky, Sergei

    2009-01-01

    We study the kinetics for the search of an immobile target by randomly moving searchers that detect it only upon encounter. The searchers perform intermittent random walks on a one-dimensional lattice. Each searcher can step on a nearest neighbor site with probability α or go off lattice with probability 1 − α to move in a random direction until it lands back on the lattice at a fixed distance L away from the departure point. Considering α and L as optimization parameters, we seek to enhance the chances of successful detection by minimizing the probability P_N that the target remains undetected up to the maximal search time N. We show that even in this simple model, a number of very efficient search strategies can lead to a decrease of P_N by orders of magnitude upon appropriate choices of α and L. We demonstrate that, in general, such optimal intermittent strategies are much more efficient than Brownian searches and are as efficient as search algorithms based on random walks with heavy-tailed Cauchy jump-length distributions. In addition, such intermittent strategies appear to be more advantageous than Lévy-based ones in that they lead to more thorough exploration of visited regions in space and thus lend themselves to parallelization of the search processes.

  7. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique for estimating critical probabilities with better accuracy than Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than from the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which has to be able to simulate more rare random events. The optimization of this auxiliary distribution is often very difficult in practice. In this article, we propose to approach the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of a spatial launcher impact position, since this has become an increasingly important issue in the field of aeronautics.
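
    A toy sketch of the adaptive idea (estimating P(X > 4) for a standard normal by iteratively refitting a weighted kernel density as the auxiliary sampler; the kernel estimator, the number of iterations, and the test case are illustrative choices, not the paper's algorithm or application):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        target = stats.norm(0, 1)
        threshold = 4.0                       # rare event: X > 4
        n = 5000

        aux = stats.norm(0, 3)                # broad initial auxiliary density
        x = aux.rvs(n, random_state=rng)
        for _ in range(4):
            w = np.where(x > threshold, target.pdf(x) / aux.pdf(x), 0.0)
            aux = stats.gaussian_kde(x, weights=w + 1e-300)   # weighted refit
            x = aux.resample(n, seed=rng)[0]

        w = np.where(x > threshold, target.pdf(x) / aux.pdf(x), 0.0)
        print(w.mean(), 1 - target.cdf(threshold))   # estimate vs exact 3.17e-05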

  8. Investigation of Probability Distributions Using Dice Rolling Simulation

    Science.gov (United States)

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…

  9. Intermittent random walks for an optimal search strategy: one-dimensional case

    International Nuclear Information System (INIS)

    Oshanin, G; Wio, H S; Lindenberg, K; Burlatsky, S F

    2007-01-01

    We study the search kinetics of an immobile target by a concentration of randomly moving searchers. The object of the study is to optimize the probability of detection within the constraints of our model. The target is hidden on a one-dimensional lattice in the sense that searchers have no a priori information about where it is, and may detect it only upon encounter. The searchers perform random walks in discrete time n = 0,1,2,...,N, where N is the maximal time the search process is allowed to run. With probability α the searchers step on a nearest-neighbour site, and with probability (1−α) they leave the lattice and stay off until they land back on the lattice at a fixed distance L away from the departure point. The random walk is thus intermittent. We calculate the probability P_N that the target remains undetected up to the maximal search time N, and seek to minimize this probability. We find that P_N is a non-monotonic function of α, and show that there is an optimal choice α_opt(N) of α well within the intermittent regime, 0 < α_opt(N) < 1, for which P_N can be orders of magnitude smaller compared to the 'pure' random walk cases α = 0 and α = 1
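
    A compact Monte Carlo sketch of a single-searcher version (target at the origin, start site drawn uniformly from 1..50; these choices, and counting each relocation as one time step, are simplifying assumptions rather than the paper's exact setup):

        import numpy as np

        rng = np.random.default_rng(5)

        def undetected_prob(alpha, L=10, N=400, n_walkers=5000):
            """Monte Carlo estimate of P_N, the probability that the target
            at site 0 is still undetected after N steps."""
            pos = rng.integers(1, 51, n_walkers)
            alive = np.ones(n_walkers, dtype=bool)
            for _ in range(N):
                nn = rng.random(n_walkers) < alpha
                step = np.where(nn,
                                rng.choice([-1, 1], n_walkers),   # lattice step
                                rng.choice([-L, L], n_walkers))   # relocation
                pos = np.where(alive, pos + step, pos)
                alive &= pos != 0
            return alive.mean()

        for a in (0.0, 0.25, 0.5, 0.75, 1.0):
            print(a, undetected_prob(a))      # compare P_N across alpha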

  10. 134Cs emission probabilities determination by gamma spectrometry

    Science.gov (United States)

    de Almeida, M. C. M.; Poledna, R.; Delgado, J. U.; Silva, R. L.; Araujo, M. T. F.; da Silva, C. J.

    2018-03-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro has performed primary and secondary standardizations of different radionuclides, reaching satisfactory uncertainties. A solution of the radionuclide 134Cs was purchased from a commercial supplier for the determination of the emission probabilities of some of its energies. 134Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water and food control, and it is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined for some energies of 134Cs by the efficiency curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1).

  11. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
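
    A minimal sketch of the translation method this record describes (FFT synthesis of a Gaussian field with an assumed power spectrum, followed by a pointwise inverse-CDF mapping to an exponential marginal; both the spectrum shape and the marginal are illustrative, and the nonlinear mapping distorts the spectrum somewhat, which is why the record calls this an engineering approach):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        n = 256
        kx = np.fft.fftfreq(n)[:, None]
        ky = np.fft.fftfreq(n)[None, :]
        psd = 1.0 / (0.01 + kx**2 + ky**2)          # assumed power spectrum

        white = rng.normal(size=(n, n))
        field = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(psd)).real
        field /= field.std()                        # colored Gaussian field

        u = stats.norm.cdf(field)                   # Gaussian -> uniform
        target = stats.expon.ppf(u)                 # uniform -> desired marginal

        print(target.mean(), target.std())          # both ~1 for Expon(1)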

  12. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  13. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model, by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
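
    An illustrative sketch of the two ingredients (a logistic breakdown probability rising with flow rate and a Weibull hazard model for duration; the functional forms and every parameter are assumptions, not the calibrated models from the study):

        import numpy as np

        rng = np.random.default_rng(7)

        def breakdown_probability(flow_vph, q50=1800.0, slope=0.004):
            """Assumed logistic probability of breakdown per interval."""
            return 1.0 / (1.0 + np.exp(-slope * (flow_vph - q50)))

        def mean_breakdown_duration(shape=1.5, scale_min=12.0, n=10000):
            """Mean of an assumed Weibull duration model, in minutes."""
            return (scale_min * rng.weibull(shape, n)).mean()

        for q in (1200.0, 1700.0, 2100.0):
            print(f"flow {q:.0f} veh/h: P(breakdown) = "
                  f"{breakdown_probability(q):.2f}, "
                  f"mean duration ~ {mean_breakdown_duration():.1f} min")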

  14. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the rupture process of earthquakes, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability between strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture based on the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane was not saturated within the assumed width of the seismogenic layer, the fault plane depth was randomly assigned within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, and this result coincides with previous similar studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, and this trend is similar to the conditional probability of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The probability of the combined simulated results of thrust and reverse faults also shows a low probability. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, there is a possibility that the conditional probability of reverse faults with fewer low-dip-angle earthquakes shows a low probability and indicates a probability similar to that of strike faults (i.e., Takao et al., 2013). In the future, numerical simulation by considering the failure condition of the surface by the source

  15. Attitudes toward Bisexual Men and Women among a Nationally Representative Probability Sample of Adults in the United States

    Science.gov (United States)

    Herbenick, Debby; Friedman, M. Reuel; Schick, Vanessa; Fu, Tsung-Chieh (Jane); Bostwick, Wendy; Bartelt, Elizabeth; Muñoz-Laboy, Miguel; Pletta, David; Reece, Michael; Sandfort, Theo G. M.

    2016-01-01

    As bisexual individuals in the United States (U.S.) face significant health disparities, researchers have posited that these differences may be fueled, at least in part, by negative attitudes, prejudice, stigma, and discrimination toward bisexual individuals from heterosexual and gay/lesbian individuals. Previous studies of individual and social attitudes toward bisexual men and women have been conducted almost exclusively with convenience samples, with limited generalizability to the broader U.S. population. Our study provides an assessment of attitudes toward bisexual men and women among a nationally representative probability sample of heterosexual, gay, lesbian, and other-identified adults in the U.S. Data were collected from the 2015 National Survey of Sexual Health and Behavior (NSSHB), via an online questionnaire with a probability sample of adults (18 years and over) from throughout the U.S. We included two modified 5-item versions of the Bisexualities: Indiana Attitudes Scale (BIAS), validated sub-scales that were developed to measure attitudes toward bisexual men and women. Data were analyzed using descriptive statistics, gamma regression, and paired t-tests. Gender, sexual identity, age, race/ethnicity, income, and educational attainment were all significantly associated with participants' attitudes toward bisexual individuals. In terms of responses to individual scale items, participants were most likely to “neither agree nor disagree” with all attitudinal statements. Across sexual identities, self-identified other participants reported the most positive attitudes, while heterosexual male participants reported the least positive attitudes. As in previous research on convenience samples, we found a wide range of demographic characteristics were related with attitudes toward bisexual individuals in our nationally-representative study of heterosexual, gay/lesbian, and other-identified adults in the U.S. In particular, gender emerged as a significant

  16. Gap probabilities for edge intervals in finite Gaussian and Jacobi unitary matrix ensembles

    International Nuclear Information System (INIS)

    Witte, N.S.; Forrester, P.J.

    1999-01-01

    The probabilities for gaps in the eigenvalue spectrum of the finite-dimensional N × N random matrix Hermite and Jacobi unitary ensembles on some single and disconnected double intervals are found. These are cases where a reflection symmetry exists and the probability factors into two other related probabilities, defined on single intervals. Our investigation uses the system of partial differential equations arising from the Fredholm determinant expression for the gap probability and the differential-recurrence equations satisfied by Hermite and Jacobi orthogonal polynomials. In our study we find second and third order nonlinear ordinary differential equations defining the probabilities in the general N case, specific explicit solutions for N = 1 and N = 2, asymptotic expansions, scaling at the edge of the Hermite spectrum as N → ∞, and the Jacobi to Hermite limit, both of which make correspondence to other cases reported here or known previously. (authors)

  17. Knotting probability of self-avoiding polygons under a topological constraint

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-01

    We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability, particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.

  18. Knotting probability of self-avoiding polygons under a topological constraint.

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-07

    We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability, particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius r_ex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius r_ex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius r_ex corresponds to the screening length.

  19. Path probability distribution of stochastic motion of non dissipative systems: a classical analog of Feynman factor of path integral

    International Nuclear Information System (INIS)

    Lin, T.L.; Wang, R.; Bi, W.P.; El Kaabouchi, A.; Pujos, C.; Calvayrac, F.; Wang, Q.A.

    2013-01-01

    We investigate, by numerical simulation, the path probability of non dissipative mechanical systems undergoing stochastic motion. The aim is to search for the relationship between this probability and the usual mechanical action. The model of simulation is a one-dimensional particle subject to conservative force and Gaussian random displacement. The probability that a sample path between two fixed points is taken is computed from the number of particles moving along this path, an output of the simulation, divided by the total number of particles arriving at the final point. It is found that the path probability decays exponentially with increasing action of the sample paths. The decay rate increases with decreasing randomness. This result supports the existence of a classical analog of the Feynman factor in the path integral formulation of quantum mechanics for Hamiltonian systems

  20. A method for generating skewed random numbers using two overlapping uniform distributions

    International Nuclear Information System (INIS)

    Ermak, D.L.; Nasstrom, J.S.

    1995-02-01

    The objective of this work was to implement and evaluate a method for generating skewed random numbers using a combination of uniform random numbers. The method provides a simple and accurate way of generating skewed random numbers from the specified first three moments without an a priori specification of the probability density function. We describe the procedure for generating skewed random numbers from uniform random numbers, and show that it accurately produces random numbers with the desired first three moments over a range of skewness values. We also show that in the limit of zero skewness, the distribution of random numbers is an accurate approximation to the Gaussian probability density function. Future work will use this method to provide skewed random numbers for a Langevin equation model for diffusion in skewed turbulence.

  1. Random Walks on Homeo(S¹)

    Science.gov (United States)

    Malicet, Dominique

    2017-12-01

    In this paper, we study random walks g_n = f_{n−1} ∘ … ∘ f_0 on the group Homeo(S¹) of the homeomorphisms of the circle, where the homeomorphisms f_k are chosen randomly, independently, with respect to a same probability measure ν. We prove that under the only condition that there is no probability measure invariant by ν-almost every homeomorphism, the random walk almost surely contracts small intervals. It generalizes what has been known on this subject until now, since various conditions on ν were imposed in order to get the phenomenon of contractions. Moreover, we obtain the surprising fact that the rate of contraction is exponential, even in the lack of assumptions of smoothness on the f_k's. We deduce various dynamical consequences on the random walk (g_n): finiteness of ergodic stationary measures, distribution of the trajectories, asymptotic law of the evaluations, etc. The proof of the main result is based on a modification of the Ávila-Viana invariance principle, working for continuous cocycles on a space fibred in circles.

  2. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities
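
    A small sketch of the estimation problem described above (a lognormal with a point mass at zero, observed with a detection limit L; the likelihood combines the censored-cell probability p0 + (1 − p0)F(L) with the density of the uncensored values, while the parameter values and optimizer choice are illustrative assumptions):

        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(8)
        L = 0.5                                        # detection limit
        zero = rng.random(2000) < 0.3                  # true zeros (p0 = 0.3)
        x = np.where(zero, 0.0, rng.lognormal(0.2, 1.0, 2000))
        obs = x[x >= L]                                # measured values
        n_cens = np.sum(x < L)                         # zeros + non-detects

        def negloglik(theta):
            p0, mu, sigma = theta
            f_cens = p0 + (1 - p0) * stats.lognorm.cdf(L, sigma, scale=np.exp(mu))
            return -(n_cens * np.log(f_cens)
                     + obs.size * np.log(1 - p0)
                     + stats.lognorm.logpdf(obs, sigma, scale=np.exp(mu)).sum())

        res = optimize.minimize(negloglik, x0=[0.5, 0.0, 0.5],
                                bounds=[(1e-6, 1 - 1e-6), (-5, 5), (1e-3, 10)])
        print(res.x)                                   # ~(0.3, 0.2, 1.0)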

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to the assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassifying ecological status is the subject of this paper. Four Monte Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by the M-C models, were used to identify cases in which the value of any of the four biological indices lay outside the "true" water body class, i.e., outside the class assigned from the actual physical measurements. The fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to the WFD. The method can also be readily applied in the risk assessment of water management decisions before adopting status-dependent corrective actions.
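
    A minimal sketch of the Monte Carlo idea (perturb an index with random error, average the runs, and count class changes; the five-class boundaries, error size, and number of runs are illustrative assumptions, not the WFD metrics used in the paper):

        import numpy as np

        rng = np.random.default_rng(9)
        bounds = [0.2, 0.4, 0.6, 0.8]          # assumed 5-class boundaries

        def status_class(x):
            return np.searchsorted(bounds, x)

        def misclassification_probability(true_index, sigma,
                                          n_runs=4, n_sim=100000):
            """P(the mean of n_runs noisy measurements changes class)."""
            noisy = true_index + rng.normal(0, sigma, (n_sim, n_runs)).mean(axis=1)
            return np.mean(status_class(noisy) != status_class(true_index))

        for idx in (0.45, 0.58):               # mid-class vs near a boundary
            print(idx, misclassification_probability(idx, sigma=0.08))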

  5. Image Watermarking Scheme for Specifying False Positive Probability and Bit-pattern Embedding

    Science.gov (United States)

    Sayama, Kohei; Nakamoto, Masayoshi; Muneyasu, Mitsuji; Ohno, Shuichi

    This paper treats a discrete wavelet transform (DWT)-based image watermarking scheme that takes into account the false positive probability and bit-pattern embedding. We propose an iterative embedding algorithm for the watermark signals, which are K sets of pseudo-random numbers generated by a secret key. In the detection, K correlations between the watermarked DWT coefficients and the watermark signals are computed using the secret key. L correlations are made available for judging the watermark presence with a specified false positive probability, and the other K-L correlations correspond to the bit-pattern signal. In the experiment, we show the detection results with the specified false positive probability and the bit-pattern recovery, as well as the comparison of the proposed method against JPEG compression, scaling down, and cropping.

  6. Probabilities for gravitational lensing by point masses in a locally inhomogeneous universe

    International Nuclear Information System (INIS)

    Isaacson, J.A.; Canizares, C.R.

    1989-01-01

    Probability functions for gravitational lensing by point masses that incorporate Poisson statistics and flux conservation are formulated in the Dyer-Roeder construction. Optical depths to lensing for distant sources are calculated using both the method of Press and Gunn (1973), which counts lenses in an otherwise empty cone, and the method of Ehlers and Schneider (1986), which projects lensing cross sections onto the source sphere. These are then used as parameters of the probability density for lensing in the case of a critical (q0 = 1/2) Friedmann universe. A comparison of the probability functions indicates that the effects of angle-averaging can be well approximated by adjusting the average magnification along a random line of sight so as to conserve flux. 17 references

  7. Random walks, random fields, and disordered systems

    CERN Document Server

    Černý, Jiří; Kotecký, Roman

    2015-01-01

    Focusing on the mathematics that lies at the intersection of probability theory, statistical physics, combinatorics and computer science, this volume collects together lecture notes on recent developments in the area. The common ground of these subjects is perhaps best described by the three terms in the title: Random Walks, Random Fields and Disordered Systems. The specific topics covered include a study of Branching Brownian Motion from the perspective of disordered (spin-glass) systems, a detailed analysis of weakly self-avoiding random walks in four spatial dimensions via methods of field theory and the renormalization group, a study of phase transitions in disordered discrete structures using a rigorous version of the cavity method, a survey of recent work on interacting polymers in the ballisticity regime and, finally, a treatise on two-dimensional loop-soup models and their connection to conformally invariant systems and the Gaussian Free Field. The notes are aimed at early graduate students with a mod...

  8. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or by a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate the probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics such as representativeness and anchoring with adjustment to estimate probabilities. Potential reasons for this attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in clinical practice rely on gist traces rather than precise probability estimates when diagnosing.
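
    For reference, the Bayesian calculation the residents were expected to apply is a one-liner in odds form. The sketch below uses made-up sensitivity and specificity values; the point is that a non-discriminating feature (likelihood ratio near 1) should leave the post-test probability unchanged, which is exactly what heuristic reasoning violates.

```python
def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
    """Bayesian post-test probability via the likelihood ratio;
    the numbers used below are purely illustrative."""
    lr = sensitivity / (1 - specificity) if positive else (1 - sensitivity) / specificity
    pre_odds = pre_test_p / (1 - pre_test_p)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# A discriminating test raises a 20% pre-test probability substantially...
print(post_test_probability(0.20, sensitivity=0.90, specificity=0.80))  # ~0.53
# ...while a non-discriminating feature (LR = 1) should leave it at 0.20.
print(post_test_probability(0.20, sensitivity=0.50, specificity=0.50))  # 0.20
```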

  9. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  10. On the spectral properties of random finite difference operators

    International Nuclear Information System (INIS)

    Kunz, H.; Souillard, B.

    1980-01-01

    We study a class of random finite difference operators, a typical example of which is the finite difference Schroedinger operator with a random potential, which arises in solid state physics in the tight binding approximation. We obtain, with probability one, in various situations, the exact location of the spectrum, as well as criteria for a given part of the spectrum to be pure point or purely continuous, or for the static electric conductivity to vanish. A general formalism is developed which transforms the study of these random operators into that of the asymptotics of a multiple integral constructed from a given recipe. Finally we apply our criteria and formalism to prove that, with probability one, the one-dimensional finite difference Schroedinger operator with a random potential has pure point spectrum and develops no static conductivity. (orig.)

  11. Gossip in Random Networks

    Science.gov (United States)

    Malarz, K.; Szvetelszky, Z.; Szekf, B.; Kulakowski, K.

    2006-11-01

    We consider the average probability X of being informed of a gossip in a given social network. The network is modeled within the random graph theory of Erdős and Rényi, in which a network is characterized by two parameters: the size N and the link probability p. Our experimental data suggest three levels of social inclusion of friendship. The critical value p_c, for which half of the agents are informed, scales with the system size as N^(-γ) with γ ≈ 0.68. Computer simulations show that the probability X varies with p as a sigmoidal curve. The influence of correlations between neighbors is also evaluated: with increasing clustering coefficient C, X decreases.
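
    The qualitative behavior is easy to reproduce. The sketch below uses a deliberately simplified flooding model (the gossip reaches the whole connected component of its source) on Erdős-Rényi graphs via networkx; the paper's three-level friendship model is more restrictive, so treat this only as an illustration of the sigmoidal dependence on p.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

def informed_fraction(n, p, trials=50):
    """Average fraction X of agents reached by a gossip started at a random
    node, under a simplified flooding model: the gossip informs the whole
    connected component of the source (an assumption of this sketch)."""
    total = 0.0
    for _ in range(trials):
        g = nx.erdos_renyi_graph(n, p, seed=int(rng.integers(1 << 30)))
        source = int(rng.integers(n))
        total += len(nx.node_connected_component(g, source)) / n
    return total / trials

# X rises as a sigmoidal curve in p around the percolation threshold p ~ 1/N.
for p in (0.001, 0.005, 0.01, 0.02):
    print(f"p={p:.3f}  X ~ {informed_fraction(300, p):.2f}")
```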

  12. Efficient decoding of random errors for quantum expander codes

    OpenAIRE

    Fawzi , Omar; Grospellier , Antoine; Leverrier , Anthony

    2017-01-01

    We show that quantum expander codes, a constant-rate family of quantum LDPC codes, with the quasi-linear time decoding algorithm of Leverrier, Tillich and Zémor, can correct a constant fraction of random errors with very high probability. This is the first construction of a constant-rate quantum LDPC code with an efficient decoding algorithm that can correct a linear number of random errors with a negligible failure probability. Finding codes with these properties is also motivated by Gottes...

  13. Probability of causation tables and their possible implications for the practice of diagnostic radiology

    International Nuclear Information System (INIS)

    Gur, D.; Wald, N.

    1986-01-01

    In compliance with requirements of the Orphan Drug Act (97-414) of 1983, tables were recently constructed by an ad hoc committee of the National Institutes of Health (NIH) in which the probabilities that certain specific cancers are caused by previous radiation exposure are estimated. The reports of the NIH committee and a National Academy of Sciences oversight committee may have broad implications for the future practice of diagnostic radiology. The basis on which the probability of causation tables were established and some of the possible implications for diagnostic radiology are discussed

  14. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  15. Asymptotic Properties of Multistate Random Walks. I. Theory

    NARCIS (Netherlands)

    Roerdink, J.B.T.M.; Shuler, K.E.

    1985-01-01

    A calculation is presented of the long-time behavior of various random walk properties (moments, probability of return to the origin, expected number of distinct sites visited) for multistate random walks on periodic lattices. In particular, we consider inhomogeneous periodic lattices, consisting of

  16. Limit Shapes and Fluctuations of Bounded Random Partitions

    DEFF Research Database (Denmark)

    Beltoft, Dan

    Random partitions of integers, bounded both in the number of summands and the size of each summand are considered, subject to the probability measure which assigns a probability proportional to some fixed positive number to the power of the number being partitioned. This corresponds to considering...

  17. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of the line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions, which are fitted to theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution, which can address a wide range of skewness and kurtosis values (in addition to matching the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and performance (receiver operating characteristic curves) in clutter- and noise-limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
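
    Moment-matching a Johnson SU distribution, as described, can be done numerically with scipy, whose johnsonsu distribution exposes the first four moments. The target moments below are invented for the demo; the paper computes them from the radiative transfer model.

```python
import numpy as np
from scipy.stats import johnsonsu
from scipy.optimize import fsolve

def fit_johnson_su(mean, var, skew, ex_kurt):
    """Fit Johnson SU parameters (a, b, loc, scale) by matching the first
    four moments (mean, variance, skewness, excess kurtosis)."""
    def residuals(params):
        a, b, loc, scale = params
        if b <= 0 or scale <= 0:          # keep the solver in the valid region
            return [1e6] * 4
        m, v, s, k = johnsonsu.stats(a, b, loc=loc, scale=scale, moments="mvsk")
        return [m - mean, v - var, s - skew, k - ex_kurt]
    return fsolve(residuals, x0=[0.0, 2.0, mean, np.sqrt(var)])

# Hypothetical target moments, not values from the paper.
a, b, loc, scale = fit_johnson_su(mean=1.0, var=0.25, skew=-0.5, ex_kurt=1.0)
print("fitted params:", a, b, loc, scale)
# e.g., the probability that the modeled radiance exceeds some threshold:
print("P(X > 1.8) =", johnsonsu.sf(1.8, a, b, loc=loc, scale=scale))
```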

  18. Front Probability, NOAA GOES Imager, 0.05 degrees, Western Hemisphere, EXPERIMENTAL

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data indicate the probability of oceanic sea surface temperature fronts off the California coast. They were created using remote sensing sea surface temperature...

  19. Fracture fragility of HFIR vessel caused by random crack size or random toughness

    International Nuclear Information System (INIS)

    Chang, Shih-Jung; Proctor, L.D.

    1993-01-01

    This report discusses the probability of fracture (fracture fragility) over a range of applied hoop stresses along the HFIR vessel, which is obtained as an estimate of its fracture capacity. Both the crack size and the fracture toughness are assumed to be random variables that follow given distribution functions. The possible hoop stress is based on the numerical solution of the vessel response to a point pressure-pulse applied at the center of the fluid volume within the vessel. Both the fluid-structure interaction and radiation embrittlement are taken into consideration. Elastic fracture mechanics is used throughout the analysis. The probability of vessel fracture for a single crack, caused by either a variable crack depth or a variable toughness, is first derived. Then the probability of fracture with multiple cracks is obtained. The probability of fracture is further extended to include different levels of confidence and variability. It therefore enables one to estimate the high-confidence, low-probability capacity under accident load

  20. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...

  1. Random sets and random fuzzy sets as ill-perceived random variables an introduction for Ph.D. students and practitioners

    CERN Document Server

    Couso, Inés; Sánchez, Luciano

    2014-01-01

    This short book provides a unified view of the history and theory of random sets and fuzzy random variables, with special emphasis on its use for representing higher-order non-statistical uncertainty about statistical experiments. The authors lay bare the existence of two streams of works using the same mathematical ground, but differing in their use of sets, according to whether they represent objects of interest naturally taking the form of sets, or imprecise knowledge about such objects. Random (fuzzy) sets can be used in many fields ranging from mathematical morphology, economics, artificial intelligence, information processing and statistics per se, especially in areas where the outcomes of random experiments cannot be observed with full precision. This book also emphasizes the link between random sets and fuzzy sets with some techniques related to the theory of imprecise probabilities. This small book is intended for graduate and doctoral students in mathematics or engineering, but also provides an i...

  2. Determination of probability density functions for parameters in the Munson-Dawson model for creep behavior of salt

    International Nuclear Information System (INIS)

    Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.

    1992-10-01

    The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
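
    The fit-and-test workflow described here maps directly onto scipy: maximum-likelihood fits of candidate skewed distributions followed by a Kolmogorov-Smirnov comparison. A sketch on synthetic stand-in data (the report used BMDP and STATGRAPHICS on actual WIPP creep estimates):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Stand-in for M-D parameter estimates obtained from creep-test fits;
# synthetic here, unlike the report's experimentally derived values.
estimates = rng.lognormal(mean=0.0, sigma=0.4, size=40)

for name, dist in [("lognormal", stats.lognorm), ("Weibull", stats.weibull_min)]:
    params = dist.fit(estimates, floc=0.0)          # fix location at zero
    d, p = stats.kstest(estimates, dist.cdf, args=params)
    print(f"{name:10s}  KS statistic={d:.3f}  p-value={p:.3f}")
```

    Note that reusing the same data for fitting and for the KS test inflates the p-value; the sketch mirrors the screening use described in the abstract rather than a formal goodness-of-fit test.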

  3. Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning.

    Science.gov (United States)

    Lechuga, Julia; Wiebe, John S

    2011-08-01

    A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed.

  4. A scaling law for random walks on networks

    Science.gov (United States)

    Perkins, Theodore J.; Foxall, Eric; Glass, Leon; Edwards, Roderick

    2014-10-01

    The dynamics of many natural and artificial systems are well described as random walks on a network: the stochastic behaviour of molecules, traffic patterns on the internet, fluctuations in stock prices and so on. The vast literature on random walks provides many tools for computing properties such as steady-state probabilities or expected hitting times. Previously, however, there has been no general theory describing the distribution of possible paths followed by a random walk. Here, we show that for any random walk on a finite network, there are precisely three mutually exclusive possibilities for the form of the path distribution: finite, stretched exponential and power law. The form of the distribution depends only on the structure of the network, while the stepping probabilities control the parameters of the distribution. We use our theory to explain path distributions in domains such as sports, music, nonlinear dynamics and stochastic chemical kinetics.

  5. Inverse problems for random differential equations using the collage method for random contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2009-01-01

    In this paper we are concerned with differential equations with random coefficients, which will be considered as random fixed point equations of the form T(ω, x(ω)) = x(ω), ω ∈ Ω. Here T: Ω × X → X is a random integral operator, (Ω, F, P) is a probability space, and X is a complete metric space. We consider the following inverse problem for such equations: Given a set of realizations of the fixed point of T (possibly the interpolations of different observational data sets), determine the operator T or the mean value of its random components, as appropriate. We solve the inverse problem for this class of equations by using the collage theorem for contraction mappings.

  6. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  7. Recurrence and Polya Number of General One-Dimensional Random Walks

    International Nuclear Information System (INIS)

    Zhang Xiaokun; Wan Jing; Lu Jingju; Xu Xinping

    2011-01-01

    The recurrence properties of random walks can be characterized by the Polya number, i.e., the probability that the walker returns to the origin at least once. In this paper, we consider recurrence properties for a general 1D random walk on a line, in which at each time step the walker can move to the left or right with probabilities l and r, or remain at the same position with probability o (l + r + o = 1). We calculate the Polya number P of this model and find the simple expression P = 1 - Δ, where Δ is the absolute difference of l and r (Δ = |l - r|). We prove this expression rigorously by the method of creative telescoping, and our result shows that the walk is recurrent if and only if the left-moving probability l equals the right-moving probability r. (general)
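
    The closed form P = 1 - |l - r| is easy to check by simulation. A minimal sketch (walks are truncated at a finite number of steps, so the recurrent case l = r will come out slightly below its exact value of 1):

```python
import numpy as np

rng = np.random.default_rng(3)

def polya_number(l, r, walks=5_000, max_steps=2_000):
    """Estimate the probability of at least one return to the origin for a
    lazy 1D walk (left l, right r, stay o = 1 - l - r), truncating each
    walk at max_steps."""
    returns = 0
    for _ in range(walks):
        steps = rng.choice([-1, 1, 0], size=max_steps, p=[l, r, 1.0 - l - r])
        if np.any(np.cumsum(steps) == 0):
            returns += 1
    return returns / walks

for l, r in [(0.3, 0.3), (0.4, 0.2), (0.5, 0.1)]:
    print(f"l={l}, r={r}:  simulated ~ {polya_number(l, r):.3f}"
          f"  exact = {1 - abs(l - r):.3f}")
```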

  8. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

    Full Text Available Spectral shaping of a randomized PWM DC-DC converter while maintaining constraints is investigated. A probability distribution whose aim is to ensure maximal harmonic spreading while still maintaining constraints is presented. The PDFs are determined from a direct application of the method of Maximum...

  9. New family of probability distributions with applications to Monte Carlo studies

    International Nuclear Information System (INIS)

    Johnson, M.E.; Tietjen, G.L.; Beckman, R.J.

    1980-01-01

    A new probability distribution is presented that offers considerable potential for providing stochastic inputs to Monte Carlo simulation studies. The distribution includes the exponential power family as a special case. An efficient computational strategy is proposed for random variate generation. An example for testing the hypothesis of unit variance illustrates the advantages of the proposed distribution

  10. Snow-melt flood frequency analysis by means of copula based 2D probability distributions for the Narew River in Poland

    Directory of Open Access Journals (Sweden)

    Bogdan Ozga-Zielinski

    2016-06-01

    New hydrological insights for the region: The results indicated that the 2D normal probability distribution model gives a better probabilistic description of snowmelt floods characterized by the 2-dimensional random variable (Qmax,f, Vf) than the elliptical Gaussian copula and the Archimedean 1-parameter Gumbel–Hougaard copula models, in particular from the viewpoint of the probability of exceedance as well as complexity and computation time. Nevertheless, the copula approach offers a new perspective for estimating the 2D probability distribution of multidimensional random variables. The results showed that the 2D model for snowmelt floods built using the Gumbel–Hougaard copula is much better than the model built using the Gaussian copula.
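
    The Gumbel–Hougaard copula itself is one line, and joint exceedance probabilities of the kind used in flood frequency analysis follow by inclusion-exclusion. The marginal levels and theta below are assumed values for illustration, not the paper's fitted parameters.

```python
import numpy as np

def gumbel_hougaard_cdf(u, v, theta):
    """Gumbel-Hougaard copula C(u, v; theta), theta >= 1
    (theta = 1 reduces to independence, C = u*v)."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) from the copula via inclusion-exclusion."""
    return 1.0 - u - v + gumbel_hougaard_cdf(u, v, theta)

# Illustrative: both margins at their 100-year level (u = v = 0.99);
# stronger dependence (larger theta) makes joint exceedance more likely.
u = v = 0.99
for theta in (1.0, 2.0, 5.0):
    print(f"theta={theta}:  P(Qmax > q100, V > v100) = {joint_exceedance(u, v, theta):.5f}")
```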

  11. The groupies of random multipartite graphs

    OpenAIRE

    Portmann, Marius; Wang, Hongyun

    2012-01-01

    If a vertex $v$ in a graph $G$ has degree larger than the average of the degrees of its neighbors, we call it a groupie in $G$. In the current work, we study the behavior of groupies in random multipartite graphs with the link probability between sets of nodes fixed. Our results extend previous ones on random (bipartite) graphs.

  12. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Francisco, Alexandre S.; Duran, Jorge Alberto R., E-mail: afrancisco@metal.eeimvr.uff.br, E-mail: duran@metal.eeimvr.uff.br [Universidade Federal Fluminense (UFF), Volta Redonda, RJ (Brazil). Dept. de Engenharia Mecanica

    2013-07-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by relating, through Bayes' theorem, the probability distributions that model crack incidence and nondestructive inspection efficiency. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by randomizing empirical equations. From the solution of these equations, a distribution function related to crack growth is derived. The fracture probability computed using both probability distribution functions is in agreement with theory and presents realistic values for pressure vessels. (author)

  13. A stochastic-bayesian model for the fracture probability of PWR pressure vessels

    International Nuclear Information System (INIS)

    Francisco, Alexandre S.; Duran, Jorge Alberto R.

    2013-01-01

    Fracture probability of pressure vessels containing cracks can be obtained by easily understood methodologies that require a deterministic treatment complemented by statistical methods. However, when more accurate results are required, the methodologies need to be better formulated. This paper presents a new methodology to address this problem. First, a more rigorous methodology is obtained by relating, through Bayes' theorem, the probability distributions that model crack incidence and nondestructive inspection efficiency. The result is an updated crack incidence distribution. Further, the accuracy of the methodology is improved by using a stochastic model for crack growth. The stochastic model incorporates the statistical variability of the crack growth process, combining stochastic theory with experimental data. Stochastic differential equations are derived by randomizing empirical equations. From the solution of these equations, a distribution function related to crack growth is derived. The fracture probability computed using both probability distribution functions is in agreement with theory and presents realistic values for pressure vessels. (author)

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  15. Device-independent randomness generation from several Bell estimators

    Science.gov (United States)

    Nieto-Silleras, Olmo; Bamps, Cédric; Silman, Jonathan; Pironio, Stefano

    2018-02-01

    Device-independent randomness generation and quantum key distribution protocols rely on a fundamental relation between the non-locality of quantum theory and its random character. This relation is usually expressed in terms of a trade-off between the probability of guessing correctly the outcomes of measurements performed on quantum systems and the amount of violation of a given Bell inequality. However, a more accurate assessment of the randomness produced in Bell experiments can be obtained if the value of several Bell expressions is simultaneously taken into account, or if the full set of probabilities characterizing the behavior of the device is considered. We introduce protocols for device-independent randomness generation secure against classical side information, that rely on the estimation of an arbitrary number of Bell expressions or even directly on the experimental frequencies of measurement outcomes. Asymptotically, this results in an optimal generation of randomness from experimental data (as measured by the min-entropy), without having to assume beforehand that the devices violate a specific Bell inequality.

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. A New Random Walk for Replica Detection in WSNs

    Science.gov (United States)

    Aalsalem, Mohammed Y.; Saad, N. M.; Hossain, Md. Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to node replication attacks, or clone attacks. Among the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND outperformed RAWL by combining network division with SRW. Both RAND and RAWL use SRW for the random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and higher energy expenditure, with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely the Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability high. Intensive simulations verify that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs such as military and medical applications. PMID:27409082

  18. A New Random Walk for Replica Detection in WSNs.

    Science.gov (United States)

    Aalsalem, Mohammed Y; Khan, Wazir Zada; Saad, N M; Hossain, Md Shohrab; Atiquzzaman, Mohammed; Khan, Muhammad Khurram

    2016-01-01

    Wireless Sensor Networks (WSNs) are vulnerable to node replication attacks, or clone attacks. Among the existing clone detection protocols in WSNs, RAWL shows the most promising results by employing a Simple Random Walk (SRW). More recently, RAND outperformed RAWL by combining network division with SRW. Both RAND and RAWL use SRW for the random selection of witness nodes, which is problematic because the walk frequently revisits previously passed nodes, leading to longer delays and higher energy expenditure, with a lower probability that witness nodes intersect. To circumvent this problem, we propose to employ a new kind of constrained random walk, namely the Single Stage Memory Random Walk, and present a distributed technique called SSRWND (Single Stage Memory Random Walk with Network Division). In SSRWND, the single stage memory random walk is combined with network division, aiming to decrease the communication and memory costs while keeping the detection probability high. Intensive simulations verify that SSRWND guarantees higher witness node security with moderate communication and memory overheads. SSRWND is expedient for security-oriented application fields of WSNs such as military and medical applications.
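
    The core idea, forbidding an immediate return to the previous node, is simple to prototype. The sketch below is a generic single-stage-memory (non-backtracking) walk on a random geometric graph standing in for a WSN; it illustrates why the memory improves coverage per step, but it is not the SSRWND protocol itself (witness selection, network division, and security checks are omitted).

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(11)

def walk_distinct_nodes(g, start, steps, memory=True):
    """Number of distinct nodes visited by a fixed-length walk; with
    memory=True the walker avoids its previous node whenever it can."""
    prev, cur = None, start
    visited = {start}
    for _ in range(steps):
        nbrs = list(g.neighbors(cur))
        if memory and prev is not None and len(nbrs) > 1 and prev in nbrs:
            nbrs.remove(prev)              # forbid immediate backtracking
        prev, cur = cur, nbrs[int(rng.integers(len(nbrs)))]
        visited.add(cur)
    return len(visited)

g = nx.random_geometric_graph(400, radius=0.12, seed=5)   # WSN-like topology
g = g.subgraph(max(nx.connected_components(g), key=len)).copy()
start = next(iter(g.nodes))
for mem in (False, True):
    cov = np.mean([walk_distinct_nodes(g, start, 200, mem) for _ in range(200)])
    print(f"memory={mem}:  distinct nodes visited ~ {cov:.1f}")
```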

  19. Probability of Finding Marrow Unrelated Donor (MUD) for an Indian patient in a Multi-national Human Leukocyte Antigen (HLA) Registry.

    Science.gov (United States)

    Tiwari, Aseem K; Bhati-Kushwaha, Himakshi; Kukreja, Pooja; Mishra, Vikash C; Tyagi, Neetu; Sharma, Ashish; Raina, Vimarsh

    2015-06-01

    With the global increase in the number of transplants, hematopoietic stem cell (HSC) transplantation from matched unrelated donors (MUD) has begun. The increasing trend of MUD transplants across countries has been largely facilitated by the conspicuous growth in volunteer HSC donors over the last decade, from 8 million registered donors in 2002 to more than 22 million in 2013 across the 71 member registries of Bone Marrow Donors Worldwide (BMDW). Some populations of the world are still very poorly represented in these registries. Since the chances of successful engraftment and disease-free survival are directly proportional to the HLA compatibility between the recipient and the prospective donor, the diversity of the HLA system at the antigenic and allelic level and the heterogeneity of the HLA data of registered donors have a bearing on the probability of finding a volunteer unrelated HSC donor for patients from such populations. In the present study, 126 patients suffering from hematological diseases requiring a MUD transplant were identified. Their HLA typing was performed and a search was conducted using the BMDW database. The search results for these Indian patients in the multinational registries as well as in the Indian registries were analyzed using the mean, range, and standard deviation, and finally evaluated in terms of the probability of finding a matched unrelated donor (MUD). Asians constitute only 11 % of the donors in BMDW, making it difficult to find a MUD for an Asian patient. The current study supports this experimentally, revealing that the probability of finding an allele match for an Indian patient in the multinational Human Leukocyte Antigen (HLA) registries is 16 % and a dismal 0.008 % in the Indian registries (which hold just 33,678 donors, compared to 22.5 million in BMDW). This strongly emphasizes the need to enhance the number of Indian donors in Indian and multinational registries.

  20. Random walks on reductive groups

    CERN Document Server

    Benoist, Yves

    2016-01-01

    The classical theory of Random Walks describes the asymptotic behavior of sums of independent identically distributed random real variables. This book explains the generalization of this theory to products of independent identically distributed random matrices with real coefficients. Under the assumption that the action of the matrices is semisimple – or, equivalently, that the Zariski closure of the group generated by these matrices is reductive - and under suitable moment assumptions, it is shown that the norm of the products of such random matrices satisfies a number of classical probabilistic laws. This book includes necessary background on the theory of reductive algebraic groups, probability theory and operator theory, thereby providing a modern introduction to the topic.

  1. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  2. A Novel Probability Model for Suppressing Multipath Ghosts in GPR and TWI Imaging: A Numerical Study

    Directory of Open Access Journals (Sweden)

    Tan Yun-hua

    2015-10-01

    Full Text Available A novel concept for suppressing the problem of multipath ghosts in Ground Penetrating Radar (GPR) and Through-Wall Imaging (TWI) is presented. Ghosts (i.e., false targets) mainly arise from the use of the Born or single-scattering approximations that lead to linearized imaging algorithms; these approximations neglect the effect of multiple scattering (or multipath) between the electromagnetic wavefield and the object under investigation. In contrast to existing methods of suppressing multipath ghosts, the proposed method models, for the first time, the reflectivity of the probed objects as a probability function up to a normalization factor and introduces the concept of the random subaperture, formed by randomly picking measurement locations from the entire aperture. Thus, the final radar image is a joint probability distribution corresponding to radar images derived from multiple random subapertures. Finally, numerical experiments are used to demonstrate the performance of the proposed methodology in GPR and TWI imaging.

  3. Joint genome-wide prediction in several populations accounting for randomness of genotypes: A hierarchical Bayes approach. I: Multivariate Gaussian priors for marker effects and derivation of the joint probability mass function of genotypes.

    Science.gov (United States)

    Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A

    2017-03-21

    It is important to consider heterogeneity of marker effects and allelic frequencies in across-population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook the randomness of genotypes. In this study, a family of hierarchical Bayesian models to perform across-population genome-wide prediction, modeling genotypes as random variables and allowing population-specific effects for each marker, was developed. The models shared a common structure and differed in the priors used and in the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that permitted not only accounting for heterogeneity of allelic frequencies, but also including individuals with missing genotypes at some or all loci without the need for prior imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes, was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction; the use of the information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies to assess the performance of the models proposed here, and to compare them with conventional models used in genome-wide prediction, are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Reducing the Probability of Incidents Through Behavior-Based Safety -- An Anomaly or Not?

    International Nuclear Information System (INIS)

    Turek, John A

    2002-01-01

    Reducing the probability of incidents through Behavior-Based Safety -- an anomaly or not? Can a Behavior-Based Safety (BBS) process reduce the probability of an employee sustaining a work-related injury or illness? This presentation describes the actions taken to implement a sustainable BBS process and evaluates its effectiveness. The BBS process at the Stanford Linear Accelerator Center used a pilot population of national laboratory employees to: achieve employee and management support; reduce the probability of employees sustaining work-related injuries and illnesses; and provide support for additional funding to expand within the laboratory

  5. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  6. Objective Lightning Probability Forecasts for East-Central Florida Airports

    Science.gov (United States)

    Crawford, Winfred C.

    2013-01-01

    The forecasters at the National Weather Service in Melbourne, FL, (NWS MLB) identified a need to make more accurate lightning forecasts to help alleviate delays due to thunderstorms in the vicinity of several commercial airports in central Florida at which they are responsible for issuing terminal aerodrome forecasts. Such forecasts would also provide safer ground operations around terminals, and would be of value to Center Weather Service Units serving air traffic controllers in Florida. To improve the forecast, the AMU was tasked to develop an objective lightning probability forecast tool for the airports using data from the National Lightning Detection Network (NLDN). The resulting forecast tool is similar to that developed by the AMU to support space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) for use by the 45th Weather Squadron (45 WS) in previous tasks (Lambert and Wheeler 2005, Lambert 2007). The lightning probability forecasts are valid for the time periods and areas needed by the NWS MLB forecasters in the warm season months, defined in this task as May-September.

  7. Flux-probability distributions from the master equation for radiation transport in stochastic media

    International Nuclear Information System (INIS)

    Franke, Brian C.; Prinja, Anil K.

    2011-01-01

    We present numerical investigations into the accuracy of approximations in the master equation for radiation transport in discrete binary random media. Our solutions of the master equation yield probability distributions of particle flux at each element of phase space. We employ the Levermore-Pomraning interface closure and evaluate the effectiveness of closures for the joint conditional flux distribution for estimating scattering integrals. We propose a parameterized model for this joint-pdf closure, varying between correlation neglect and a full-correlation model. The closure is evaluated for a variety of parameter settings. Comparisons are made with benchmark results obtained through suites of fixed-geometry realizations of random media in rod problems. All calculations are performed using Monte Carlo techniques. Accuracy of the approximations in the master equation is assessed by examining the probability distributions for reflection and transmission and by evaluating the moments of the pdfs. The results suggest the correlation-neglect setting in our model performs best and shows improved agreement in the atomic-mix limit. (author)

  8. Universal critical wrapping probabilities in the canonical ensemble

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2015-09-01

    Full Text Available Universal dimensionless quantities, such as Binder ratios and wrapping probabilities, play an important role in the study of critical phenomena. We study the finite-size scaling behavior of the wrapping probability for the Potts model in the random-cluster representation, under the constraint that the total number of occupied bonds is fixed, so that the canonical ensemble applies. We derive that, in the limit L→∞, the critical values of the wrapping probability are different from those of the unconstrained model, i.e. the model in the grand-canonical ensemble, but still universal, for systems with 2yt−d>0, where yt=1/ν is the thermal renormalization exponent and d is the spatial dimension. Similar modifications apply to other dimensionless quantities, such as Binder ratios. For systems with 2yt−d≤0, these quantities share the same universal critical values in the two ensembles. It is also derived that new finite-size corrections are induced. These findings apply more generally to systems in the canonical ensemble, e.g. the dilute Potts model with a fixed total number of vacancies. Finally, we formulate an efficient cluster-type algorithm for the canonical ensemble, and confirm these predictions by extensive simulations.

  9. Misclassification probability as obese or lean in hypercaloric and normocaloric diet

    Directory of Open Access Journals (Sweden)

    ANDRÉ F NASCIMENTO

    2008-01-01

    Full Text Available The aim of the present study was to determine the classification error probabilities, as lean or obese, in hypercaloric diet-induced obesity, which depend on the variable used to characterize animal obesity. In addition, the misclassification probabilities in animals submitted to a normocaloric diet were also evaluated. Male Wistar rats were randomly distributed into two groups: normal diet (ND; n=31; 3.5 kcal/g) and hypercaloric diet (HD; n=31; 4.6 kcal/g). The ND group received commercial Labina rat feed and the HD animals a cycle of five hypercaloric diets over a 14-week period. The variables analysed were body weight, body composition, body weight to length ratio, Lee index, body mass index and misclassification probability. A 5% significance level was used. The hypercaloric pellet-diet cycle promoted increases in body weight, carcass fat, body weight to length ratio and Lee index. The total misclassification probabilities ranged from 19.21% to 40.91%. In conclusion, the results of this experiment show that misclassification occurs when dietary manipulation is used to promote obesity in animals. This misjudgement ranges from 19.49% to 40.52% with the hypercaloric diet and from 18.94% to 41.30% with the normocaloric diet.

  10. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  11. Continuous-Time Classical and Quantum Random Walk on Direct Product of Cayley Graphs

    International Nuclear Information System (INIS)

    Salimi, S.; Jafarizadeh, M. A.

    2009-01-01

    In this paper we define the direct product of graphs and give a recipe for obtaining the probability of observing the particle on the vertices in continuous-time classical and quantum random walks. In the recipe, the probability of observing the particle on the direct product of graphs is obtained by multiplying the probabilities on the corresponding sub-graphs, a method that is useful for determining the probability of a walk on complicated graphs. Using this method, we calculate the probability for continuous-time classical and quantum random walks on many finite direct-product Cayley graphs (complete cycle, complete graph K_n, charter and n-cube). Also, we find that in the classical case the stationary uniform distribution is reached as t → ∞, whereas for the quantum case this is not always so. (general)
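
    Both walks reduce to small matrix-exponential computations. A minimal sketch for the complete cycle C_8, using the Laplacian as the classical generator and the adjacency matrix as the quantum Hamiltonian (common conventions; the paper's normalization may differ):

```python
import numpy as np
from scipy.linalg import expm

def cycle_adjacency(n):
    a = np.zeros((n, n))
    for i in range(n):
        a[i, (i + 1) % n] = a[i, (i - 1) % n] = 1.0
    return a

n, t = 8, 3.0
A = cycle_adjacency(n)
L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian

p_classical = expm(-L * t)[0]                  # start at vertex 0
amp_quantum = expm(-1j * A * t)[0]             # unitary evolution amplitudes
p_quantum = np.abs(amp_quantum) ** 2

print("classical:", np.round(p_classical, 3))  # tends to uniform 1/n as t grows
print("quantum:  ", np.round(p_quantum, 3))    # no stationary uniform limit
```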

  12. Outage probability of dual-hop partial relay selection with feedback delay in the presence of interference

    KAUST Repository

    Al-Qahtani, Fawaz S.

    2011-09-01

    In this paper, we investigate the outage performance of a dual-hop relaying system with partial relay selection and feedback delay. The analysis considers the case of Rayleigh fading channels where the relaying station as well as the destination undergo mutually independent interfering signals. In particular, we derive the cumulative distribution function (c.d.f.) of a new type of random variable involving the sum of multiple independent exponential random variables, based on which we present closed-form expressions for the exact outage probability of fixed amplify-and-forward (AF) and decode-and-forward (DF) relaying protocols. Numerical results are provided to illustrate the joint effect of the delayed feedback and co-channel interference on the outage probability. © 2011 IEEE.

  13. Blind Students' Learning of Probability through the Use of a Tactile Model

    Science.gov (United States)

    Vita, Aida Carvalho; Kataoka, Verônica Yumi

    2014-01-01

    The objective of this paper is to discuss how blind students learn basic concepts of probability using the tactile model proposed by Vita (2012). Among the activities were part of the teaching sequence "Jefferson's Random Walk", in which students built a tree diagram (using plastic trays, foam cards, and toys), and pictograms in 3D…

  14. Determination of the probability for radioactive materials on properties in Monticello, Utah

    International Nuclear Information System (INIS)

    Wilson, M.J.; Crutcher, J.W.; Halford, D.K.

    1991-01-01

    The former uranium mill site at Monticello, Utah, is a surplus facility subject to cleanup under the Surplus Facilities Management Program (SFMP). Surrounding properties contaminated with mill site material are also subject to cleanup and are referred to as Monticello Vicinity Properties (MVP). The Pollutant Assessments Group (PAG) of Oak Ridge National Laboratory (ORNL), Grand Junction, Colorado (GJ), was directed by the US Department of Energy (DOE) in July 1988 to assess the radiological condition of properties in Monticello, Utah. Since the Monticello activities are on the National Priorities List, extra measures to identify potentially contaminated properties were undertaken. Thus, the likelihood that a random property could contain radioactive materials became a concern to the DOE. The objective of this study was to determine the probability that a vicinity property not addressed under the MVP project could contain Monticello mill-related residual radioactive material in excess of the DOE guidelines. Results suggest that approximately 20% of the properties in the Monticello area contain Monticello mill-related residual radioactive material in excess of the DOE guidelines. This suggested that further designation measures be taken prior to the close of the designation phase. A public relations effort that included a property-owner mailing, public postings, and newspaper advertisements was one measure taken to ensure that most properties were assessed. As a consequence of this study, DOE directed that radiological screening surveys be conducted on the entirety of the Monticello area

  15. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  16. Joint probability safety assessment for NPP defense infrastructure against extreme external natural hazards

    International Nuclear Information System (INIS)

    Guilin, L.; Defu, L.; Huajun, L.; Fengqing, W.; Tao, Z.

    2012-01-01

    With the increasing incidence of natural hazards, typhoon-, hurricane- and tropical-cyclone-induced surge, wave, precipitation, flood and wind act as extreme external loads menacing nuclear power plants (NPP) in the coastal and inland provinces of China. For all planned, designed and constructed NPPs, the National Nuclear Safety Administration of China and the IAEA recommend the Probable Maximum Hurricane/Typhoon (PMH/T), Probable Maximum Storm Surge (PMSS), Probable Maximum Flood (PMF) and Design Basis Flood (DBF) as safety regulations for NPP defense infrastructures. This paper discusses the joint probability analysis of simultaneously occurring typhoon-induced extreme external hazards and compares the results with the IAEA 2006-2009 recommended safety regulation design criteria for some NPP defense infrastructures along the China coast. (authors)

  17. Evaluation of Bearing Capacity of Strip Footing Using Random Layers Concept

    Directory of Open Access Journals (Sweden)

    Kawa Marek

    2015-09-01

    Full Text Available The paper deals with the evaluation of the bearing capacity of a strip foundation on random, purely cohesive soil. The proposed approach combines random field theory, in the form of random layers, with classical limit analysis and Monte Carlo simulation. For a given realization of the random field, the bearing capacity of the strip footing is evaluated by employing the kinematic approach of yield design theory. The results, in the form of histograms for both the bearing capacity of the footing and the optimal depth of the failure mechanism, are obtained for different thicknesses of the random layers. For zero and infinite thickness of the random layers, the depth of the failure mechanism as well as the bearing capacity assessment are derived in closed form. Finally, based on a sequence of Monte Carlo simulations, the bearing capacity of the strip footing corresponding to a certain probability of failure is estimated. While the mean value of the foundation bearing capacity increases with the thickness of the random layers, the ultimate load corresponding to a certain probability of failure appears to be a decreasing function of the random layer thickness.

  18. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.
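
    The tile-scoring stage reduces to a standard ensemble classifier emitting per-tile probabilities. A sketch with sklearn on synthetic stand-in features (the study used CIEL*a*b* co-occurrence textures and expert-annotated tiles; everything below is illustrative):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-ins for per-tile color texture features and expert labels.
n_tiles, n_features = 600, 12
X = rng.normal(size=(n_tiles, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.8, n_tiles) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X[:400], y[:400])

# Per-tile cancer-probability scores; reshaping them onto the tile grid
# would give the whole-slide probability heat map described in the abstract.
scores = clf.predict_proba(X[400:])[:, 1]
print("OOB accuracy:", round(clf.oob_score_, 3))
print("first 10 tile scores:", np.round(scores[:10], 2))
```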

  19. Off-diagonal long-range order, cycle probabilities, and condensate fraction in the ideal Bose gas.

    Science.gov (United States)

    Chevallier, Maguelonne; Krauth, Werner

    2007-11-01

    We discuss the relationship between the cycle probabilities in the path-integral representation of the ideal Bose gas, off-diagonal long-range order, and Bose-Einstein condensation. Starting from the Landsberg recursion relation for the canonical partition function, we use elementary considerations to show that in a box of size L³ the sum of the cycle probabilities of length k > L² equals the off-diagonal long-range order parameter in the thermodynamic limit. For arbitrary systems of ideal bosons, the integer derivative of the cycle probabilities is related to the probability of condensing k bosons. We use this relation to derive the precise form of π_k in the thermodynamic limit. We also determine the function π_k for arbitrary systems. Furthermore, we use the cycle probabilities to compute the probability distribution of the maximum-length cycles both at T=0, where the ideal Bose gas reduces to the study of random permutations, and at finite temperature. We close with comments on the cycle probabilities in interacting Bose gases.
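
    For readers who want to reproduce the starting point, here is a minimal implementation of the Landsberg recursion Z_N = (1/N) Σ_{k=1..N} z₁(kβ) Z_{N-k} together with the cycle probabilities π_k = z₁(kβ) Z_{N-k} / (N Z_N). It assumes a 3D isotropic harmonic trap (ħω = 1) for the single-particle spectrum; the box geometry of the paper would only change z₁.

```python
import numpy as np

def z1(beta):
    """Single-particle partition function of a 3D isotropic harmonic trap
    (hbar * omega = 1, zero-point energy included)."""
    return (np.exp(-0.5 * beta) / (1.0 - np.exp(-beta))) ** 3

def cycle_probabilities(N, beta):
    # Landsberg recursion for the canonical partition functions Z_0 .. Z_N
    Z = np.zeros(N + 1)
    Z[0] = 1.0
    for n in range(1, N + 1):
        Z[n] = sum(z1(k * beta) * Z[n - k] for k in range(1, n + 1)) / n
    # pi_k: probability that a given particle belongs to a cycle of length k
    pi = np.array([z1(k * beta) * Z[N - k] / (N * Z[N]) for k in range(1, N + 1)])
    return pi

pi = cycle_probabilities(N=64, beta=0.5)
print("normalization check:", pi.sum())     # the recursion makes this exactly 1
print("pi_1 .. pi_5:", pi[:5].round(4))
```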

  20. Interpretations of Probability in Quantum Mechanics: A Case of "Experimental Metaphysics"

    Science.gov (United States)

    Hellman, Geoffrey

    After reviewing paradigmatic cases of "experimental metaphysics" basing inferences against local realism and determinism on experimental tests of Bell's theorem (and successors), we concentrate on clarifying the meaning and status of "objective probability" in quantum mechanics. The terms "objective" and "subjective" are found ambiguous and inadequate, masking crucial differences turning on the question of what the numerical values of probability functions measure vs. the question of the nature of the "events" on which such functions are defined. This leads naturally to a 2×2 matrix of types of interpretations, which are then illustrated with salient examples. (Of independent interest are the splitting of the "Copenhagen interpretation" into "objective" and "subjective" varieties in one of the dimensions and the splitting of Bohmian hidden variables from (other) modal interpretations along that same dimension.) It is then explained why Everett interpretations are difficult to categorize in these terms. Finally, we argue that Bohmian mechanics does not seriously threaten the experimental-metaphysical case for ultimate randomness and purely physical probabilities.

  1. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across American Samoa in 2015 (NCEI Accession 0159168)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across...

  2. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
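
    A toy version of the method, assuming a one-dimensional model problem whose solution is known in closed form, shows the mechanics: uncoupled deterministic evaluations at tensor-product Gauss points, combined by the quadrature weights into a statistical moment. The exponential convergence claimed above can be observed by increasing n.

```python
import numpy as np

# Model problem: -(a u')' = 1 on (0,1), u(0) = u(1) = 0, with a random constant
# diffusivity a(Y) = exp(Y1 + 0.5*Y2), Y1, Y2 ~ N(0,1) independent, so that
# u(1/2; Y) = 1 / (8 a(Y)) and E[u(1/2)] = exp(0.625) / 8 exactly.
def u_mid(y1, y2):
    return 1.0 / (8.0 * np.exp(y1 + 0.5 * y2))

n = 6                                       # Gauss points per random dimension
x, w = np.polynomial.hermite.hermgauss(n)   # nodes/weights for weight exp(-x^2)
nodes = np.sqrt(2.0) * x                    # rescaled to the standard normal
weights = w / np.sqrt(np.pi)

# Tensor-product collocation: one uncoupled deterministic solve per node pair.
approx = sum(wi * wj * u_mid(yi, yj)
             for yi, wi in zip(nodes, weights)
             for yj, wj in zip(nodes, weights))

exact = np.exp(0.625) / 8.0
print(f"collocation: {approx:.8f}  exact: {exact:.8f}  error: {abs(approx - exact):.2e}")
```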

  3. ¹³⁴Cs emission probabilities determination by gamma spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, M.C.M. de, E-mail: candida@cnen.gov.br [Comissão Nacional de Energia Nuclear (DINOR/CNEN), Rio de Janeiro, RJ (Brazil); Poledna, R.; Delgado, J.U.; Silva, R.L.; Araujo, M.T.; Silva, C.J. da [Instituto de Radioproteção e Dosimetria (LNMRI/IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro performs primary and secondary standardization of different radionuclides, reaching satisfactory uncertainties. A solution of the ¹³⁴Cs radionuclide was purchased from a commercial supplier in order to determine the emission probabilities for some of its energies. ¹³⁴Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water and food control, and it is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined for the main energies of ¹³⁴Cs by the efficiency curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1). (author)
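
    For a well-resolved photopeak, the efficiency-curve method reduces to Pγ = N_net / (ε · A · t) with a propagated (k=1) uncertainty. The sketch below illustrates the arithmetic on invented numbers; none of the values are LNMRI data.

```python
import numpy as np

def gamma_emission_probability(net_counts, live_time_s, activity_bq, efficiency,
                               efficiency_rel_unc=0.005, activity_rel_unc=0.005):
    """Emission probability by the efficiency-curve method, with first-order
    (k=1) propagation of counting, efficiency and activity uncertainties."""
    p = net_counts / (efficiency * activity_bq * live_time_s)
    rel_u = np.sqrt(1.0 / net_counts          # Poisson counting term
                    + efficiency_rel_unc ** 2
                    + activity_rel_unc ** 2)
    return p, p * rel_u

# Hypothetical 604.7 keV peak of 134Cs (all inputs invented for illustration)
p, u = gamma_emission_probability(net_counts=6.15e5, live_time_s=3600.0,
                                  activity_bq=3.5e4, efficiency=5.0e-3)
print(f"P_gamma = {p:.4f} +/- {u:.4f} (k=1)")
```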

  4. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  5. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  6. Random walks in Euclidean space

    OpenAIRE

    Varjú, Péter Pál

    2012-01-01

    Consider a sequence of independent random isometries of Euclidean space with a previously fixed probability law. Apply these isometries successively to the origin and consider the sequence of random points that we obtain this way. We prove a local limit theorem under a suitable moment condition and a necessary non-degeneracy condition. Under stronger hypotheses, we prove a limit theorem on a wide range of scales: between e^(-cl^(1/4)) and l^(1/2), where l is the number of steps.

  7. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  8. Failure Probability Estimation Using Asymptotic Sampling and Its Dependence upon the Selected Sampling Scheme

    Directory of Open Access Journals (Sweden)

    Martinásková Magdalena

    2017-12-01

    The article examines the use of Asymptotic Sampling (AS) for the estimation of failure probability. The AS algorithm requires samples of multidimensional Gaussian random vectors, which may be obtained by many alternative means that influence the performance of the AS method. Several reliability problems (test functions) have been selected in order to test AS with various sampling schemes: (i) Monte Carlo designs; (ii) LHS designs optimized using the Periodic Audze-Eglājs (PAE) criterion; (iii) designs prepared using Sobol' sequences. All results are compared with the exact failure probability value.
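
    The three sampling schemes can be reproduced with scipy's quasi-Monte Carlo module, as sketched below for standard normal vectors. Note that scipy ships plain Latin Hypercube sampling, so it stands in here for the PAE-optimized designs of the article.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

d, n = 10, 256        # dimension of the Gaussian vector, sample size

# (i) crude Monte Carlo
mc = np.random.default_rng(0).standard_normal((n, d))

# (ii) Latin Hypercube design, mapped to N(0,1) by the inverse CDF
lhs = stats.norm.ppf(qmc.LatinHypercube(d=d, seed=0).random(n))

# (iii) scrambled Sobol' sequence (2^8 = 256 points), mapped the same way
sobol = stats.norm.ppf(qmc.Sobol(d=d, scramble=True, seed=0).random_base2(8))

for name, x in [("MC", mc), ("LHS", lhs), ("Sobol'", sobol)]:
    print(f"{name:6s} mean |coordinate mean|: {np.abs(x.mean(axis=0)).mean():.4f}")
```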

  9. On randomly interrupted diffusion

    International Nuclear Information System (INIS)

    Luczka, J.

    1993-01-01

    Processes driven by randomly interrupted Gaussian white noise are considered. An evolution equation for single-event probability distributions is presented. Stationary states are considered as solutions of a second-order ordinary differential equation with two imposed conditions. A linear model is analyzed and its stationary distributions are given explicitly. (author). 10 refs

  10. On Field Size and Success Probability in Network Coding

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper

    2008-01-01

    Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network coding. These estimates take into account which monomials occur in the support of the determinant of the product of Edmonds matrices. Therefore we finally investigate which monomials can occur in the determinant of the Edmonds matrix.

  11. Dynamic random walks theory and applications

    CERN Document Server

    Guillotin-Plantard, Nadine

    2006-01-01

    The aim of this book is to report on the progress realized in probability theory in the field of dynamic random walks and to present applications in computer science, mathematical physics and finance. Each chapter contains didactical material as well as more advanced technical sections. A few appendices will help refresh memories (if necessary!). Contents include: a new probabilistic model and new results in probability theory; original applications in computer science; applications in mathematical physics; and applications in finance.

  12. Aggregate and Individual Replication Probability within an Explicit Model of the Research Process

    Science.gov (United States)

    Miller, Jeff; Schwarz, Wolf

    2011-01-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…

  13. Radiation Transport in Random Media With Large Fluctuations

    Science.gov (United States)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memoryless transformation of a Gaussian process whose covariance is uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
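
    A minimal sketch of the cross-section generator described above: eigendecompose an assumed exponential covariance on a grid (a discrete Karhunen-Loève decomposition), truncate, and exponentiate the Gaussian field to obtain lognormal realizations. Grid, correlation length and variance are illustrative choices, not the paper's cases.

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid and exponential covariance of the underlying Gaussian process
n, L, corr_len, sigma = 200, 10.0, 1.0, 0.5
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Discrete Karhunen-Loeve decomposition; keep the modes carrying 99% variance
lam, phi = np.linalg.eigh(C)
lam, phi = lam[::-1], phi[:, ::-1]              # sort eigenpairs descending
m = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.99)) + 1

def realization(mean_log=0.0):
    """One lognormal cross-section realization: truncated KL expansion of the
    Gaussian field, then a memoryless exponential transformation."""
    xi = rng.standard_normal(m)
    g = mean_log + phi[:, :m] @ (np.sqrt(lam[:m]) * xi)
    return np.exp(g)

sigma_t = realization()
print(f"{m} KL modes retained; cross-section range: "
      f"{sigma_t.min():.3f} - {sigma_t.max():.3f}")
```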

  14. Prevalence of masturbation and associated factors in a British national probability survey

    OpenAIRE

    Gerressu, Makeda; Mercer, Catherine H.; Graham, Cynthia A.; Wellings, Kaye; Johnson, Anne M.

    2008-01-01

    A stratified probability sample survey of the British general population, aged 16 to 44 years, was conducted from 1999 to 2001 (N = 11,161) using face-to-face interviewing and computer-assisted self-interviewing. We used these data to estimate the population prevalence of masturbation, and to identify sociodemographic, sexual behavioral, and attitudinal factors associated with repo...

  15. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion particles are simulated by the Langevin Equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion after Hermite polynomials...

  16. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  17. Generation of Random Numbers and Parallel Random Number Streams for Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    L. Yu. Barash

    2012-01-01

    Modern methods and libraries for high-quality pseudorandom number generation and for the generation of parallel random number streams for Monte Carlo simulations are considered. The probability equidistribution property, and the parameter choices for which it holds in dimensions up to the logarithm of the mesh size, are considered for Multiple Recursive Generators.
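
    With numpy, independent parallel streams of this kind are commonly obtained by spawning child seed sequences, each driving its own PCG64 generator, as in the small sketch below (the library's own mechanism, not the generators benchmarked in the article).

```python
import numpy as np

# One root seed is spawned into child SeedSequences; each child drives an
# independent PCG64 stream for one parallel Monte Carlo worker.
root = np.random.SeedSequence(20120101)
streams = [np.random.Generator(np.random.PCG64(s)) for s in root.spawn(8)]

# Toy per-worker task: each stream produces its own estimate without overlap.
estimates = [g.random(100_000).mean() for g in streams]
print(np.round(estimates, 4))
```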

  18. Direct modeling of regression effects for transition probabilities in the progressive illness-death model

    DEFF Research Database (Denmark)

    Azarang, Leyla; Scheike, Thomas; de Uña-Álvarez, Jacobo

    2017-01-01

    In this work, we present direct regression analysis for the transition probabilities in the possibly non-Markov progressive illness–death model. The method is based on binomial regression, where the response is the indicator of the occupancy for the given state along time. Randomly weighted score...

  19. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
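
    A minimal GA of the kind described, with roulette-wheel selection, one-point crossover and Gaussian mutation over a vector of input signal probabilities, might look as follows. The fitness function is a toy stand-in for the COP-based cost, not the article's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
N_INPUTS, POP, GENS = 8, 40, 60

def fitness(p):
    """Toy detection probability: a fault needing inputs 0 and 1 high and
    input 2 low (illustrative stand-in for a COP testability cost)."""
    return p[0] * p[1] * (1.0 - p[2])

pop = rng.random((POP, N_INPUTS))          # signal probabilities in [0, 1]
for _ in range(GENS):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[rng.choice(POP, size=POP, p=f / f.sum())]   # roulette wheel
    for i in range(0, POP, 2):             # one-point crossover per pair
        c = rng.integers(1, N_INPUTS)
        a, b = parents[i], parents[i + 1]
        a[c:], b[c:] = b[c:].copy(), a[c:].copy()
    parents += rng.normal(scale=0.05, size=parents.shape)     # mutation
    pop = np.clip(parents, 0.0, 1.0)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best input distribution:", best.round(2))
```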

  20. Random cyclic constitutive models of 0Cr18Ni10Ti pipe steel

    International Nuclear Information System (INIS)

    Zhao Yongxiang; Yang Bing

    2004-01-01

    An experimental study is performed on the random cyclic constitutive relations of a new pipe stainless steel, 0Cr18Ni10Ti, by an incremental strain-controlled fatigue test. The test verifies that the random cyclic constitutive relations, like the widely recognized random cyclic strain-life relations, are an intrinsic fatigue phenomenon of engineering materials. Extending the previous work by Zhao et al, probability-based constitutive models are constructed on the bases of the Ramberg-Osgood equation and its modified form, respectively. The scattering regularity and the amount of the test data are taken into account. The models consist of survival probability-strain-life curves, confidence strain-life curves, and survival probability-confidence-strain-life curves. The availability and feasibility of the models are indicated by analysis of the present test data.

  1. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest-link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties, due to variances in material strengths and analytical uncertainties, of between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5th-95th percentile values are 112 psia and 146.7 psia, respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles.

  2. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  3. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
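
    The single-integral form proposed here can be written as P(slip) = ∫ f_req(u) F_avail(u) du and evaluated with the trapezoidal rule. The sketch below does this for two assumed normal friction distributions, where a closed form exists for the error check; the paper's point is precisely that normality cannot be assumed automatically, and the same integral accepts any pair of densities.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Assumed friction distributions (illustrative parameters only)
req = stats.norm(loc=0.22, scale=0.04)     # required friction during gait
avail = stats.norm(loc=0.35, scale=0.06)   # available friction of the surface

# P(slip) = P(available < required) = integral of f_req(u) * F_avail(u) du
u = np.linspace(0.0, 0.8, 2001)
p_trap = trapezoid(req.pdf(u) * avail.cdf(u), u)

# Closed form for the normal-normal case: P(A - R < 0)
p_exact = stats.norm.cdf(0.0, loc=0.35 - 0.22, scale=np.hypot(0.04, 0.06))
print(f"trapezoidal: {p_trap:.6f}   analytical: {p_exact:.6f}")
```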

  4. Enhancing Security of Double Random Phase Encoding Based on Random S-Box

    Science.gov (United States)

    Girija, R.; Singh, Hukum

    2018-06-01

    In this paper, we propose a novel asymmetric cryptosystem for double random phase encoding (DRPE) using a random S-Box. Utilising an S-Box separately is not reliable, and DRPE does not support non-linearity, so our system unites the effectiveness of the S-Box with an asymmetric DRPE system (through the Fourier transform). The uniqueness of the proposed cryptosystem lies in employing a highly sensitive dynamic S-Box for our DRPE system. The randomness and scalability achieved by the applied technique are additional features of the proposed solution. The strength of the random S-Box is investigated in terms of performance parameters such as non-linearity, the strict avalanche criterion, the bit independence criterion, linear and differential approximation probabilities, etc. S-Boxes convey non-linearity to cryptosystems, which is a significant parameter and essential for DRPE. The strength of the proposed cryptosystem has been analysed using various parameters such as MSE, PSNR, correlation coefficient analysis, noise analysis, SVD analysis, etc. Experimental results are presented in detail to show that the proposed cryptosystem is highly secure.

  5. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique, because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by

  6. Reduction of the Random Variables of the Turbulent Wind Field

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2012-01-01

    Applicability of the Probability Density Evolution Method (PDEM) for realizing evolution of the probability density for the wind turbines has rather strict bounds on the basic number of the random variables involved in the model. The efficiency of most of the Advanced Monte Carlo (AMC) methods, i.e. Importance Sampling (IS) or Subset Simulation (SS), will be deteriorated on problems with many random variables. The problem with PDEM is that a multidimensional integral has to be carried out over the space defined by the random variables of the system. The numerical procedure requires discretization of the integral domain; this becomes increasingly difficult as the dimensions of the integral domain increase. On the other hand efficiency of the AMC methods is closely dependent on the design points of the problem. Presence of many random variables may increase the number of the design points, hence affects...

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  8. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature and an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties at the in-cylinder temperature estimation. The model only has one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted, which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions by using a fuel balance and a lambda sensor and differences below 1 % were found.

  9. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  10. Probability of assertive behaviour, interpersonal anxiety and self-efficacy of South African registered dietitians.

    Science.gov (United States)

    Paterson, Marie; Green, J M; Basson, C J; Ross, F

    2002-02-01

    There is little information on the probability of assertive behaviour, interpersonal anxiety and self-efficacy in the literature regarding dietitians. The objective of this study was to establish baseline information of these attributes and the factors affecting them. Questionnaires collecting biographical information and self-assessment psychometric scales measuring levels of probability of assertiveness, interpersonal anxiety and self-efficacy were mailed to 350 subjects, who comprised a random sample of dietitians registered with the Health Professions Council of South Africa. Forty-one per cent (n=145) of the sample responded. Self-assessment inventory results were compared to test levels of probability of assertive behaviour, interpersonal anxiety and self-efficacy. The inventory results were compared with the biographical findings to establish statistical relationships between the variables. The hypotheses were formulated before data collection. Dietitians had acceptable levels of probability of assertive behaviour and interpersonal anxiety. The probability of assertive behaviour was significantly lower than the level noted in the literature and was negatively related to interpersonal anxiety and positively related to self-efficacy.

  11. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Optimal random search for a single hidden target.

    Science.gov (United States)

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
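
    A discrete check of the square-root rule: if the target sits at x with probability p(x) and each independent search point hits x with probability q(x), the expected number of trials is Σ_x p(x)/q(x), which Cauchy-Schwarz shows is minimized by q ∝ √p. The 50-point target distribution below is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
p = rng.random(50)
p /= p.sum()                                 # target distribution

def expected_trials(q):
    # trials until an independently sampled search point equals the target site
    return np.sum(p / q)

q_naive = p                                  # search straight from the target law
q_sqrt = np.sqrt(p) / np.sqrt(p).sum()       # square-root rule

print("E[trials], q = p       :", round(expected_trials(q_naive), 2))  # exactly 50
print("E[trials], q ~ sqrt(p) :", round(expected_trials(q_sqrt), 2))   # smaller
```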

  13. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.

  14. A random network based, node attraction facilitated network evolution method

    Directory of Open Access Journals (Sweden)

    WenJun Zhang

    2016-03-01

    In the present study, I present a method of network evolution that is based on a random network and facilitated by node attraction. In this method, I assume that the initial network is a random network, or a given initial network. When a node is ready to connect, it tends to link to the node already owning the most connections, which coincides with the general rule of node connection (Barabasi and Albert, 1999). In addition, a node may randomly disconnect a connection, i.e., the addition of connections in the network is accompanied by the pruning of some connections. The dynamics of network evolution is determined by the attraction factor Lambda of nodes, the probability of node connection, the probability of node disconnection, and the expected initial connectance. The attraction factor of nodes, the probability of node connection, and the probability of node disconnection are time- and node-varying. Various dynamics can be achieved by adjusting these parameters, and the effects of simplified parameters on network evolution are analyzed. Changes of the attraction factor Lambda reflect various effects of the node degree on the connection mechanism; changes of Lambda alone will generate various networks, from the random to the complex. Therefore, the present algorithm can be treated as a general model for network evolution. Modeling results show that to generate a power-law type of network, the likelihood of a node attracting connections must depend on a power function of the node's degree with a higher-order power. Matlab code for a simplified version of the method is provided.
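
    The article provides Matlab code; a compact Python rendering of the same idea (degree-based attraction with exponent Lambda, plus random pruning) is sketched below, with constant probabilities standing in for the time- and node-varying ones.

```python
import numpy as np

rng = np.random.default_rng(5)
N_NODES, STEPS = 60, 2000
LAMBDA = 1.0          # attraction factor: weight of degree in the linking rule
P_CONNECT = 0.9       # probability that a connection event occurs at a step
P_DISCONNECT = 0.1    # probability that a random existing edge is pruned

adj = rng.random((N_NODES, N_NODES)) < 0.02     # random initial network
adj = np.triu(adj, 1)
adj = adj | adj.T

for _ in range(STEPS):
    deg = adj.sum(axis=1)
    if rng.random() < P_CONNECT:
        i = rng.integers(N_NODES)
        w = (deg + 1.0) ** LAMBDA      # +1 so isolated nodes can still attract
        w[i] = 0.0                     # no self-loops
        j = rng.choice(N_NODES, p=w / w.sum())
        adj[i, j] = adj[j, i] = True
    if rng.random() < P_DISCONNECT:
        edges = np.argwhere(np.triu(adj, 1))
        if len(edges):
            i, j = edges[rng.integers(len(edges))]
            adj[i, j] = adj[j, i] = False

print("final mean degree:", adj.sum(axis=1).mean())
```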

  15. Reactor pressure vessel failure probability following through-wall cracks due to pressurized thermal shock events

    International Nuclear Information System (INIS)

    Simonen, F.A.; Garnich, M.R.; Simonen, E.P.; Bian, S.H.; Nomura, K.K.; Anderson, W.E.; Pedersen, L.T.

    1986-04-01

    A fracture mechanics model was developed at the Pacific Northwest Laboratory (PNL) to predict the behavior of a reactor pressure vessel following a through-wall crack that occurs during a pressurized thermal shock (PTS) event. This study, which contributed to a US Nuclear Regulatory Commission (NRC) program to study PTS risk, was coordinated with the Integrated Pressurized Thermal Shock (IPTS) Program at Oak Ridge National Laboratory (ORNL). The PNL fracture mechanics model uses the critical transients and probabilities of through-wall cracks from the IPTS Program. The PNL model predicts the arrest, reinitiation, and direction of crack growth for a postulated through-wall crack and thereby predicts the mode of vessel failure. A Monte-Carlo type of computer code was written to predict the probabilities of the alternative failure modes. This code treats the fracture mechanics properties of the various welds and plates of a vessel as random variables. Plant-specific calculations were performed for the Oconee-1, Calvert Cliffs-1, and H.B. Robinson-2 reactor pressure vessels for the conditions of postulated transients. The model predicted that 50% or more of the through-wall axial cracks will turn to follow a circumferential weld. The predicted failure mode is a complete circumferential fracture of the vessel, which results in a potential vertically directed missile consisting of the upper head assembly. Missile arrest calculations for the three nuclear plants predict that such vertical missiles, as well as all potential horizontally directed fragmentation type missiles, will be confined to the vessel enclosure cavity. The PNL failure mode model is recommended for use in future evaluations of other plants, to determine the failure modes that are most probable for postulated PTS events.

  16. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Implicit Segmentation of a Stream of Syllables Based on Transitional Probabilities: An MEG Study

    Science.gov (United States)

    Teinonen, Tuomas; Huotilainen, Minna

    2012-01-01

    Statistical segmentation of continuous speech, i.e., the ability to utilise transitional probabilities between syllables in order to detect word boundaries, is reflected in the brain's auditory event-related potentials (ERPs). The N1 and N400 ERP components are typically enhanced for word onsets compared to random syllables during active…

  18. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

    Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  19. A Semi-Analytical Method for the PDFs of A Ship Rolling in Random Oblique Waves

    Science.gov (United States)

    Liu, Li-qin; Liu, Ya-liu; Xu, Wan-hai; Li, Yan; Tang, You-gang

    2018-03-01

    The PDFs (probability density functions) and probability of a ship rolling under the random parametric and forced excitations were studied by a semi-analytical method. The rolling motion equation of the ship in random oblique waves was established. The righting arm obtained by the numerical simulation was approximately fitted by an analytical function. The irregular waves were decomposed into two Gauss stationary random processes, and the CARMA (2, 1) model was used to fit the spectral density function of parametric and forced excitations. The stochastic energy envelope averaging method was used to solve the PDFs and the probability. The validity of the semi-analytical method was verified by the Monte Carlo method. The C11 ship was taken as an example, and the influences of the system parameters on the PDFs and probability were analyzed. The results show that the probability of ship rolling is affected by the characteristic wave height, wave length, and the heading angle. In order to provide proper advice for the ship's manoeuvring, the parametric excitations should be considered appropriately when the ship navigates in the oblique seas.

  20. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    Science.gov (United States)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.

  1. On the joint statistics of stable random processes

    International Nuclear Information System (INIS)

    Hopcraft, K I; Jakeman, E

    2011-01-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)
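
    Realizations of a stable first-order density can be drawn with scipy, as in the sketch below. Consistent with the divergence of second moments for α < 2 noted above, only fractional moments E|X|^q with q < α stabilize as the sample grows; the α and β values are chosen for illustration.

```python
import numpy as np
from scipy.stats import levy_stable

# Symmetric alpha-stable samples
alpha, beta = 1.5, 0.0
x = levy_stable.rvs(alpha, beta, size=100_000, random_state=0)

for q in (0.25, 0.5, 1.0):
    print(f"E|X|^{q} ~ {np.mean(np.abs(x) ** q):.3f}")   # finite for q < alpha
print("sample variance (does not converge):", round(float(x.var()), 1))
```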

  2. VISA-2, Reactor Vessel Failure Probability Under Thermal Shock

    International Nuclear Information System (INIS)

    Simonen, F.; Johnson, K.

    1992-01-01

    1 - Description of program or function: VISA2 (Vessel Integrity Simulation Analysis) was developed to estimate the failure probability of nuclear reactor pressure vessels under pressurized thermal shock conditions. The deterministic portion of the code performs heat transfer, stress, and fracture mechanics calculations for a vessel subjected to a user-specified temperature and pressure transient. The probabilistic analysis performs a Monte Carlo simulation to estimate the probability of vessel failure. Parameters such as initial crack size and position, copper and nickel content, fluence, and the fracture toughness values for crack initiation and arrest are treated as random variables. Linear elastic fracture mechanics methods are used to model crack initiation and growth. This includes cladding effects in the heat transfer, stress, and fracture mechanics calculations. The simulation procedure treats an entire vessel and recognizes that more than one flaw can exist in a given vessel. The flaw model allows random positioning of the flaw within the vessel wall thickness, and the user can specify either flaw length or length-to-depth aspect ratio for crack initiation and arrest predictions. The flaw size distribution can be adjusted on the basis of different inservice inspection techniques and inspection conditions. The toughness simulation model includes a menu of alternative equations for predicting the shift in the reference temperature of the nil-ductility transition. 2 - Method of solution: The solution method uses closed form equations for temperatures, stresses, and stress intensity factors. A polynomial fitting procedure approximates the specified pressure and temperature transient. Failure probabilities are calculated by a Monte Carlo simulation. 3 - Restrictions on the complexity of the problem: Maxima of 30 welds. VISA2 models only the belt-line (cylindrical) region of a reactor vessel. The stresses are a function of the radial (through-wall) coordinate only.

  3. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled, ''Probability-Based Load Combinations for Design of Category I Structures.'' Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs

  4. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
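
    A small Monte Carlo of the Wald SPRT for Poisson monitoring data, in the spirit of the program described; the boundaries are log((1-β)/α) and log(β/(1-α)), and the rates and error levels below are illustrative, not those of the report.

```python
import numpy as np

def sprt_poisson(counts, lam0, lam1, alpha=0.05, beta=0.05):
    """Wald SPRT on successive Poisson counts: H0 rate lam0 vs H1 rate lam1.
    Returns the decision and the number of counting intervals used."""
    a, b = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(counts, start=1):
        llr += x * np.log(lam1 / lam0) - (lam1 - lam0)  # per-interval log LR
        if llr >= a:
            return "alarm (H1)", n
        if llr <= b:
            return "clean (H0)", n
    return "undecided", len(counts)

# Detection probability and average test length for a contaminated source
rng = np.random.default_rng(6)
lam0, lam1, trials = 2.0, 6.0, 5000
runs = [sprt_poisson(rng.poisson(lam1, size=50), lam0, lam1) for _ in range(trials)]
detect = np.mean([r[0] == "alarm (H1)" for r in runs])
length = np.mean([r[1] for r in runs])
print(f"detection probability: {detect:.3f}, average intervals used: {length:.2f}")
```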

  5. Distribution of sizes of erased loops for loop-erased random walks

    OpenAIRE

    Dhar, Deepak; Dhar, Abhishek

    1997-01-01

    We study the distribution of sizes of erased loops for loop-erased random walks on regular and fractal lattices. We show that for arbitrary graphs the probability $P(l)$ of generating a loop of perimeter $l$ is expressible in terms of the probability $P_{st}(l)$ of forming a loop of perimeter $l$ when a bond is added to a random spanning tree on the same graph by the simple relation $P(l)=P_{st}(l)/l$. On $d$-dimensional hypercubical lattices, $P(l)$ varies as $l^{-\\sigma}$ for large $l$, whe...

  6. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  7. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    Science.gov (United States)

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    The inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or non-random lack of some information in a subgroup of the population. We provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and 7-year children IQ. This methodology allows one to correct the analysis by weighting the observations with the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: first, we consider the entire population at study and calculate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. Then the analysis is performed only on the non-missing observations, using a weighted model. IPW is a technique that allows one to embed the selection process in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on the verbal intelligence quotient of children is stronger than the effect shown by the analysis performed without regard to the selection processes.
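
    The two-step procedure can be sketched on synthetic data: a logistic model for the probability of being observed, then an analysis weighted by its inverse. In the cohort below, selection depends on the exposure and on a covariate that also drives the outcome, so the complete-case slope is biased; all variables and coefficients are invented. Note that numpy's polyfit applies its weights to the residuals, so the square root of the IPW weight is passed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20_000

# Synthetic cohort: exposure e, covariate z driving the outcome, outcome y,
# and a selection (follow-up) process depending on both e and z.
e = rng.normal(size=n)
z = rng.normal(size=n)
y = 100.0 - 2.0 * e + 5.0 * z + rng.normal(scale=5.0, size=n)
p_sel = 1.0 / (1.0 + np.exp(-(0.5 - 0.8 * e - 0.8 * z)))
s = rng.random(n) < p_sel                    # indicator of being observed

# Step 1: predict the probability of inclusion from predictors known for the
# entire cohort; step 2: weight the observed subjects by its inverse.
X = np.column_stack([e, z])
w = 1.0 / LogisticRegression().fit(X, s).predict_proba(X[s])[:, 1]

naive = np.polyfit(e[s], y[s], 1)[0]
ipw = np.polyfit(e[s], y[s], 1, w=np.sqrt(w))[0]
print(f"true slope: -2.0, complete-case: {naive:.2f}, IPW-corrected: {ipw:.2f}")
```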

  8. Critical behavior in inhomogeneous random graphs

    NARCIS (Netherlands)

    Hofstad, van der R.W.

    2009-01-01

    We study the critical behavior of inhomogeneous random graphs where edges are present independently but with unequal edge occupation probabilities. We show that the critical behavior depends sensitively on the properties of the asymptotic degrees. Indeed, when the proportion of vertices with degree

  9. Data-Driven Lead-Acid Battery Prognostics Using Random Survival Forests

    Science.gov (United States)

    2014-10-02

    Random survival forest is a survival-analysis extension of Random Forests (Breiman, 2001; Ishwaran, Kogalur, Blackstone, & Lauer, 2008; Ishwaran & Kogalur, 2010).

  10. Inference of a random potential from random walk realizations: Formalism and application to the one-dimensional Sinai model with a drift

    International Nuclear Information System (INIS)

    Cocco, S; Monasson, R

    2009-01-01

    We consider the Sinai model, in which a random walker moves in a random quenched potential V, and ask the following questions: 1. how can the quenched potential V be inferred from the observations of one or more realizations of the random motion? 2. how many observations (walks) are required to make a reliable inference, that is, to be able to distinguish between two similar but distinct potentials, V1 and V2? We show how question 1 can be easily solved within the Bayesian framework. In addition, we show that the answer to question 2 is, in general, intimately connected to the calculation of the survival probability of a fictitious walker in a potential W defined from V1 and V2, with partial absorption at sites where V1 and V2 do not coincide. For the one-dimensional Sinai model, this survival probability can be analytically calculated, in excellent agreement with numerical simulations.

  11. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lectures notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  12. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacing of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics.

  13. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks, where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Then, random walks is performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques with random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.

  14. Pólya number and first return of bursty random walk: Rigorous solutions

    Science.gov (United States)

    Wan, J.; Xu, X. P.

    2012-03-01

    The recurrence properties of random walks can be characterized by the Pólya number, i.e., the probability that the walker has returned to the origin at least once. In this paper, we investigate the Pólya number and first return for a bursty random walk on a line, in which the walk has different step sizes and moving probabilities. Using the concept of the Catalan number, we obtain, for the first time, exact results for the first return probability, the average first return time, and the Pólya number. We show that the Pólya number displays two different functional behaviors when the walk deviates from the recurrent point. By utilizing the Lagrange inversion formula, we interpret our findings by transferring the Pólya number to the closed-form solutions of an inverse function. We also calculate the Pólya number using another approach, which corroborates our results and conclusions. Finally, we consider the recurrence properties and Pólya number of two variations of the bursty random walk model.
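
    For the unbiased special case (a simple symmetric walk), the Catalan-number machinery gives the first return probability in closed form; a short sketch of that standard result, not of the paper's bursty generalization:

```python
from math import comb
from fractions import Fraction

def catalan(k):
    # k-th Catalan number: C_k = binom(2k, k) / (k + 1)
    return comb(2 * k, k) // (k + 1)

def first_return_prob(n):
    # Probability that a symmetric simple random walk on Z returns to the
    # origin for the first time at step 2n: f(2n) = C_{n-1} / 2^(2n - 1).
    return Fraction(catalan(n - 1), 2 ** (2 * n - 1))

print(first_return_prob(1))    # 1/2
print(first_return_prob(2))    # 1/8

# The cumulative first-return probability converges to 1 (recurrence),
# i.e. the Polya number of the unbiased walk on a line is 1.
total = sum(first_return_prob(n) for n in range(1, 200))
print(float(total))            # ~0.96, slowly approaching 1
```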

  15. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.

  16. An introduction to random interlacements

    CERN Document Server

    Drewitz, Alexander; Sapozhnikov, Artëm

    2014-01-01

    This book gives a self-contained introduction to the theory of random interlacements. The intended reader of the book is a graduate student with a background in probability theory who wants to learn about the fundamental results and methods of this rapidly emerging field of research. The model was introduced by Sznitman in 2007 in order to describe the local picture left by the trace of a random walk on a large discrete torus when it runs up to times proportional to the volume of the torus. Random interlacements is a new percolation model on the d-dimensional lattice. The main results covered by the book include the full proof of the local convergence of random walk trace on the torus to random interlacements and the full proof of the percolation phase transition of the vacant set of random interlacements in all dimensions. The reader will become familiar with the techniques relevant to working with the underlying Poisson Process and the method of multi-scale renormalization, which helps in overcoming the ch...

  17. On Randomness and Probability

    Indian Academy of Sciences (India)

    An axiomatic development of such a model is given below. It is also shown ... teacher needs to decide which students deserve to be promoted to the next class - it is not ... whether an unborn child would be a boy or a girl, the total number of births in a ..... that the outcome of the previous trials has no influence on the next trial.

  18. Uniform Estimate of the Finite-Time Ruin Probability for All Times in a Generalized Compound Renewal Risk Model

    Directory of Open Access Journals (Sweden)

    Qingwu Gao

    2012-01-01

    Full Text Available We discuss the uniformly asymptotic estimate of the finite-time ruin probability for all times in a generalized compound renewal risk model, where the interarrival times of successive accidents and all the claim sizes caused by an accident are two sequences of random variables following a wide dependence structure. This wide dependence structure allows random variables to be either negatively dependent or positively dependent.
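
    A minimal Monte Carlo sketch of a finite-time ruin probability in a compound renewal model, here with independent exponential interarrivals and Pareto claims rather than the paper's wide dependence structure (all parameter choices hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

def ruin_probability(u, T, lam=1.0, c=1.5, alpha=1.8, n_paths=20_000):
    """Monte Carlo estimate of the finite-time ruin probability psi(u, T):
    initial capital u, premium rate c, exponential(lam) interarrival times
    and Pareto(alpha) claim sizes, all independent in this sketch."""
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)
            if t > T:
                break
            claims += rng.random() ** (-1.0 / alpha)  # Pareto, support [1, inf)
            if u + c * t - claims < 0:                # surplus dips below zero
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability(u=10.0, T=50.0))
```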

  19. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and several influencing factors are identified here using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
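
    A sketch of the kind of binary logit model described, fitted with statsmodels on synthetic data; the column names and coefficients are hypothetical, chosen only to mirror the signs reported above:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "given_sum": rng.uniform(500, 20000, n),
    "remoteness_km": rng.uniform(0, 300, n),
    "birth_month": rng.integers(1, 13, n),
})
# Synthetic response: repayment odds rise with the loan sum and with
# distance (i.e. fall with proximity), and fall across the birth year.
logit_true = (-1.0 + 1e-4 * df["given_sum"]
              + 5e-3 * df["remoteness_km"] - 0.05 * df["birth_month"])
df["returned"] = rng.random(n) < 1 / (1 + np.exp(-logit_true))

X = sm.add_constant(df[["given_sum", "remoteness_km", "birth_month"]])
model = sm.Logit(df["returned"].astype(float), X).fit(disp=0)
print(model.summary())          # z-tests show which factors are significant
print(model.predict(X)[:5])     # estimated probabilities of returning a loan
```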

  20. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges that definition by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  1. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in

  2. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile
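
    The outcome rates reported above follow the AAPOR definitions; a simplified sketch of the arithmetic (the refusal and non-contact counts below are hypothetical, and the exact AAPOR formulas also apportion cases of unknown eligibility, omitted here):

```python
# Simplified survey outcome rates in the spirit of the AAPOR definitions:
#   response    = completes / eligible
#   cooperation = completes / contacted
#   refusal     = refusals / eligible
#   contact     = contacted / eligible
complete, partial = 9469, 3547
refusal, non_contact = 2700, 25000      # hypothetical, not from the paper

eligible = complete + partial + refusal + non_contact
contacted = complete + partial + refusal

rates = {
    "response": complete / eligible,
    "cooperation": complete / contacted,
    "refusal": refusal / eligible,
    "contact": contacted / eligible,
}
for name, r in rates.items():
    print(f"{name:>12}: {r:.0%}")
```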

  3. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Directory of Open Access Journals (Sweden)

    Kelly L'Engle

    Full Text Available Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample.The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census.The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample.The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit

  4. Electromagnetic Wave Propagation in Random Media

    DEFF Research Database (Denmark)

    Pécseli, Hans

    1984-01-01

    The propagation of a narrow frequency band beam of electromagnetic waves in a medium with randomly varying index of refraction is considered. A novel formulation of the governing equation is proposed. An equation for the average Green function (or transition probability) can then be derived...

  5. Random distance distribution for spherical objects: general theory and applications to physics

    International Nuclear Information System (INIS)

    Tu Shuju; Fischbach, Ephraim

    2002-01-01

    A formalism is presented for analytically obtaining the probability density function, P_n(s), for the random distance s between two random points in an n-dimensional spherical object of radius R. Our formalism allows P_n(s) to be calculated for a spherical n-ball having an arbitrary volume density, and reproduces the well-known results for the case of uniform density. The results find applications in geometric probability, computational science, molecular biological systems, statistical physics, astrophysics, condensed matter physics, nuclear physics and elementary particle physics. As one application of these results, we propose a new statistical method derived from our formalism to study random number generators used in Monte Carlo simulations. (author)
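
    For the uniform-density 3-ball the closed form is classical; a quick Monte Carlo check of it (assuming R = 1):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_points_in_ball(n_pts, dim=3, R=1.0):
    """Uniform points in a dim-ball: Gaussian direction, radius ~ U^(1/dim)."""
    x = rng.normal(size=(n_pts, dim))
    x /= np.linalg.norm(x, axis=1, keepdims=True)
    r = R * rng.random(n_pts) ** (1.0 / dim)
    return x * r[:, None]

# Distances between independent pairs of uniform points in the unit 3-ball.
a = random_points_in_ball(200_000)
b = random_points_in_ball(200_000)
s = np.linalg.norm(a - b, axis=1)

# Classical closed form for uniform density and R = 1:
#   P_3(s) = 3 s^2 - (9/4) s^3 + (3/16) s^5   on [0, 2].
hist, edges = np.histogram(s, bins=40, range=(0, 2), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
exact = 3 * mid**2 - 2.25 * mid**3 + 0.1875 * mid**5
print("max |MC - exact|:", np.abs(hist - exact).max())
```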

  6. Random walk on a population of random walkers

    International Nuclear Information System (INIS)

    Agliari, E; Burioni, R; Cassi, D; Neri, F M

    2008-01-01

    We consider a population of N labelled random walkers moving on a substrate, and an excitation jumping among the walkers upon contact. The label X(t) of the walker carrying the excitation at time t can be viewed as a stochastic process, where the transition probabilities are a stochastic process themselves. Upon mapping onto two simpler processes, the quantities characterizing X(t) can be calculated in the limit of long times and low walkers density. The results are compared with numerical simulations. Several different topologies for the substrate underlying diffusion are considered

  7. The Power of Probability: Poster/Teaching Guide for Grades 6-8. Expect the Unexpected with Math®

    Science.gov (United States)

    Actuarial Foundation, 2013

    2013-01-01

    "The Power of Probability" is a new math program aligned with the National Council of Teachers of Mathematics (NCTM) and Common Core State Standards, which gives students opportunities to practice their skills and knowledge of the mathematics of probability. Developed by The Actuarial Foundation, the program's lessons and worksheets motivate…

  8. Modeling of chromosome intermingling by partially overlapping uniform random polygons.

    Science.gov (United States)

    Blackstone, T; Scharein, R; Borgo, B; Varela, R; Diao, Y; Arsuaga, J

    2011-03-01

    During the early phase of the cell cycle the eukaryotic genome is organized into chromosome territories. The geometry of the interface between any two chromosomes remains a matter of debate and may have important functional consequences. The Interchromosomal Network model (introduced by Branco and Pombo) proposes that territories intermingle along their periphery. In order to partially quantify this concept we here investigate the probability that two chromosomes form an unsplittable link. We use the uniform random polygon as a crude model for chromosome territories and we model the interchromosomal network as the common spatial region of two overlapping uniform random polygons. This simple model allows us to derive some rigorous mathematical results as well as to perform computer simulations easily. We find that the probability that a uniform random polygon of length n partially overlapping a fixed polygon forms a link with it is bounded below by 1 − O(1/√n). We use numerical simulations to estimate the dependence of the linking probability of two uniform random polygons (of lengths n and m, respectively) on the amount of overlapping. The degree of overlapping is parametrized by a parameter ε such that ε = 0 indicates no overlapping and ε = 1 indicates total overlapping. We propose that this dependence relation may be modeled as f(ε, m, n) = [Formula: see text]. Numerical evidence shows that this model works well when ε is relatively large (ε ≥ 0.5). We then use these results to model the data published by Branco and Pombo and observe that for the amount of overlapping observed experimentally the URPs have a non-zero probability of forming an unsplittable link.

  9. Using complete measurement statistics for optimal device-independent randomness evaluation

    International Nuclear Information System (INIS)

    Nieto-Silleras, O; Pironio, S; Silman, J

    2014-01-01

    The majority of recent works investigating the link between non-locality and randomness, e.g. in the context of device-independent cryptography, do so with respect to some specific Bell inequality, usually the CHSH inequality. However, the joint probabilities characterizing the measurement outcomes of a Bell test are richer than just the degree of violation of a single Bell inequality. In this work we show how to take this extra information into account in a systematic manner in order to optimally evaluate the randomness that can be certified from non-local correlations. We further show that taking into account the complete set of outcome probabilities is equivalent to optimizing over all possible Bell inequalities, thereby allowing us to determine the optimal Bell inequality for certifying the maximal amount of randomness from a given set of non-local correlations. (paper)
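
    As a small illustration of working with complete measurement statistics, the CHSH value can be computed directly from the joint outcome probabilities of a Bell test; the numbers below realize the maximal quantum violation (an illustrative sketch, not the paper's optimization over all Bell inequalities):

```python
import numpy as np

def correlator(p):
    """E(a, b) = sum_{x,y in {+1,-1}} x*y*p(x, y) from a 2x2 joint outcome
    table p[x, y] with rows/columns ordered (+1, -1)."""
    signs = np.array([[+1, -1], [-1, +1]])
    return float(np.sum(signs * p))

# Joint outcome probabilities p(x, y | a, b) for the four setting pairs of
# a CHSH test; these tables realize the Tsirelson bound 2*sqrt(2).
c = np.cos(np.pi / 8) ** 2 / 2
s = np.sin(np.pi / 8) ** 2 / 2
pp = np.array([[c, s], [s, c]])      # mostly correlated outcomes
pm = np.array([[s, c], [c, s]])      # mostly anti-correlated outcomes
P = {(0, 0): pp, (0, 1): pp, (1, 0): pp, (1, 1): pm}

S = (correlator(P[(0, 0)]) + correlator(P[(0, 1)])
     + correlator(P[(1, 0)]) - correlator(P[(1, 1)]))
print(S)   # ~2.828 > 2: non-local correlations that can certify randomness
```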

  10. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  11. Capsule Performance Optimization in the National Ignition Campaign

    Energy Technology Data Exchange (ETDEWEB)

    Landen, O L; MacGowan, B J; Haan, S W; Edwards, J

    2009-10-13

    A capsule performance optimization campaign will be conducted at the National Ignition Facility to substantially increase the probability of ignition. The campaign will experimentally correct for residual uncertainties in the implosion and hohlraum physics used in our radiation-hydrodynamic computational models before proceeding to cryogenic-layered implosions and ignition attempts. The required tuning techniques using a variety of ignition capsule surrogates have been demonstrated at the Omega facility under scaled hohlraum and capsule conditions relevant to the ignition design and shown to meet the required sensitivity and accuracy. In addition, a roll-up of all expected random and systematic uncertainties in setting the key ignition laser and target parameters due to residual measurement, calibration, cross-coupling, surrogacy, and scale-up errors has been derived that meets the required budget.

  12. Capsule performance optimization in the national ignition campaign

    International Nuclear Information System (INIS)

    Landen, O L; MacGowan, B J; Haan, S W; Edwards, J

    2010-01-01

    A capsule performance optimization campaign will be conducted at the National Ignition Facility [1] to substantially increase the probability of ignition. The campaign will experimentally correct for residual uncertainties in the implosion and hohlraum physics used in our radiation-hydrodynamic computational models before proceeding to cryogenic-layered implosions and ignition attempts. The required tuning techniques using a variety of ignition capsule surrogates have been demonstrated at the Omega facility under scaled hohlraum and capsule conditions relevant to the ignition design and shown to meet the required sensitivity and accuracy. In addition, a roll-up of all expected random and systematic uncertainties in setting the key ignition laser and target parameters due to residual measurement, calibration, cross-coupling, surrogacy, and scale-up errors has been derived that meets the required budget.

  13. Capsule performance optimization in the national ignition campaign

    Science.gov (United States)

    Landen, O. L.; MacGowan, B. J.; Haan, S. W.; Edwards, J.

    2010-08-01

    A capsule performance optimization campaign will be conducted at the National Ignition Facility [1] to substantially increase the probability of ignition. The campaign will experimentally correct for residual uncertainties in the implosion and hohlraum physics used in our radiation-hydrodynamic computational models before proceeding to cryogenic-layered implosions and ignition attempts. The required tuning techniques using a variety of ignition capsule surrogates have been demonstrated at the Omega facility under scaled hohlraum and capsule conditions relevant to the ignition design and shown to meet the required sensitivity and accuracy. In addition, a roll-up of all expected random and systematic uncertainties in setting the key ignition laser and target parameters due to residual measurement, calibration, cross-coupling, surrogacy, and scale-up errors has been derived that meets the required budget.

  14. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    Science.gov (United States)

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
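
    A rough stand-in for the residual-correlation idea on synthetic data; note that the paper's probability-scale residuals come from fitted cumulative probability models, whereas this sketch simply regresses scaled ranks (a simplification for illustration):

```python
import numpy as np
from scipy.stats import rankdata, spearmanr

rng = np.random.default_rng(7)

# Synthetic data: Z drives both X and Y, plus a genuine X-Y association.
n = 2000
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = 0.5 * x + z + rng.normal(size=n)

def scaled_rank(v):
    # Scaled rank: a simple estimate of each observation's CDF value.
    return (rankdata(v) - 0.5) / len(v)

rx, ry, rz = scaled_rank(x), scaled_rank(y), scaled_rank(z)
# Regress the ranks of X and of Y on the rank of Z, then correlate residuals.
res_x = rx - np.polyval(np.polyfit(rz, rx, 1), rz)
res_y = ry - np.polyval(np.polyfit(rz, ry, 1), rz)

print("marginal Spearman:", spearmanr(x, y).correlation)
print("Z-adjusted Spearman (sketch):", np.corrcoef(res_x, res_y)[0, 1])
```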

  15. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  16. Algebraic polynomials with random coefficients

    Directory of Open Access Journals (Sweden)

    K. Farahmand

    2002-01-01

    Full Text Available This paper provides an asymptotic value for the mathematical expected number of points of inflection of a random polynomial of the form $a_0(\omega) + a_1(\omega)\binom{n}{1}^{1/2}x + a_2(\omega)\binom{n}{2}^{1/2}x^2 + \cdots + a_n(\omega)\binom{n}{n}^{1/2}x^n$ when $n$ is large. The coefficients $\{a_j(\omega)\}_{j=0}^{n}$, $\omega \in \Omega$, are assumed to be a sequence of independent normally distributed random variables with means zero and variance one, each defined on a fixed probability space $(\Omega, \mathcal{A}, \Pr)$. A special case of dependent coefficients is also studied.
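
    A Monte Carlo check of the expected inflection count is straightforward: sample the coefficients, differentiate twice, and count real roots. A sketch:

```python
import numpy as np
from scipy.special import comb

rng = np.random.default_rng(3)

def n_inflections(n):
    """Count real roots of p'' for one random polynomial
    p(x) = sum_j a_j * binom(n, j)^(1/2) * x^j with a_j ~ N(0, 1)."""
    a = rng.normal(size=n + 1)
    coeffs = a * np.sqrt(comb(n, np.arange(n + 1)))   # ascending powers
    dd = np.polynomial.polynomial.polyder(coeffs, 2)  # second derivative
    roots = np.polynomial.polynomial.polyroots(dd)
    # Simple real roots of p'' correspond to inflection points.
    return int(np.sum(np.abs(roots.imag) < 1e-8))

n = 40
trials = [n_inflections(n) for _ in range(500)]
print("mean number of inflection points:", np.mean(trials))
```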

  17. INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS

    KAUST Repository

    Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.

    2012-01-01

    The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.

  18. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
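
    The biomedical Bayes' theorem pillar is easy to illustrate; a classic worked example with hypothetical test characteristics:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem for a diagnostic test:
    P(disease | positive) = P(pos | disease) * P(disease) / P(pos)."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_pos

# Hypothetical numbers: a 99%-sensitive, 95%-specific test for a disease
# with 1% prevalence still gives only a ~17% chance of disease given a
# positive result -- the kind of counterintuitive outcome the course targets.
print(positive_predictive_value(0.99, 0.95, 0.01))   # ~0.167
```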

  19. Tempered stable laws as random walk limits

    OpenAIRE

    Chakrabarty, Arijit; Meerschaert, Mark M.

    2010-01-01

    Stable laws can be tempered by modifying the Lévy measure to cool the probability of large jumps. Tempered stable laws retain their signature power law behavior at infinity, and infinite divisibility. This paper develops random walk models that converge to a tempered stable law under a triangular array scheme. Since tempered stable laws and processes are useful in statistical physics, these random walk models can provide a basic physical model for the underlying physical phenomena.

  20. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
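
    For a sense of how a constant CCF rate becomes a fault-tree basic event probability, the simplest textbook relation for a periodically tested standby component is q ≈ λτ/2; a sketch (the paper derives refined, test-schedule- and repair-policy-dependent expressions):

```python
def standby_unavailability(rate, test_interval):
    """Time-average unavailability of a periodically tested component with
    a constant (CCF or independent) failure rate: q ~= rate * tau / 2.
    First-order approximation, valid when rate * tau << 1."""
    return rate * test_interval / 2.0

# Hypothetical numbers: a double-failure CCF rate of 1e-6 per hour and a
# quarterly (2190 h) test interval give the basic event probability to be
# entered into the explicit fault tree model.
print(standby_unavailability(1e-6, 2190.0))   # ~1.1e-3
```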

  1. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  2. Impact of target point deviations on control and complication probabilities in stereotactic radiosurgery of AVMs and metastases

    International Nuclear Information System (INIS)

    Treuer, Harald; Kocher, Martin; Hoevels, Moritz; Hunsche, Stefan; Luyken, Klaus; Maarouf, Mohammad; Voges, Juergen; Mueller, Rolf-Peter; Sturm, Volker

    2006-01-01

    Objective: Determination of the impact of inaccuracies in the determination and setup of the target point in stereotactic radiosurgery (SRS) on the expectable complication and control probabilities. Methods: Two randomized samples of patients with arteriovenous malformation (AVM) (n = 20) and with brain metastases (n = 20) treated with SRS were formed, and the probability for complete obliteration (COP) or complete remission (CRP), the size of the 10 Gy volume in the brain tissue (VOI10), and the probability for radiation necrosis (NTCP) were calculated. The dose-effect relations for COP and CRP were fitted to clinical data. Target point deviations were simulated through random vectors, and the resulting probabilities and volumes were calculated and compared with the values of the treatment plan. Results: The decrease of the relative value of the control probabilities at 1 mm target point deviation was up to 4% for AVMs and up to 10% for metastases. At 2 mm the median decrease was 5% for AVMs and 9% for metastases. The value of the target point deviation at which COP and CRP decreased by about 0.05 in 90% of the cases was 1.3 mm. The increase of NTCP was at most 0.0025 per mm target point deviation for AVMs and 0.0035/mm for metastases. The maximal increase of VOI10 was 0.7 cm³/mm target point deviation in both patient groups. Conclusions: The upper limit for tolerable target point deviations is 1.3 mm. If this value cannot be achieved during the system test, a supplementary safety margin should be applied for the definition of the target volume. A better accuracy level is desirable in order to ensure optimal chances for the success of the treatment. The target point precision is less important for the minimization of the probability of radiation necroses.

  3. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    International Nuclear Information System (INIS)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J.

    2012-01-01

    Greenhouse gas (CO₂, CH₄ and N₂O, hereinafter GHG) and criteria air pollutant (CO, NOₓ, VOC, PM₁₀, PM₂.₅ and SOₓ, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.
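
    Fitting such best-fit PDFs to a sample of emission factors is routine with scipy; a sketch on synthetic NOₓ data (all numbers hypothetical, and the candidate distribution families are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical sample of NOx emission factors (g/kWh) for one EGU type.
sample = rng.lognormal(mean=np.log(1.8), sigma=0.4, size=500)

# Fit several candidate distributions and compare by Kolmogorov-Smirnov
# distance to pick the best-fit curve.
candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}
for name, dist in candidates.items():
    params = dist.fit(sample, floc=0)          # fix the location at zero
    ks = stats.kstest(sample, name, args=params).statistic
    print(f"{name:>12}: KS = {ks:.3f}, params = {np.round(params, 3)}")
```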

  4. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

    Uncertainty of parameters in engineering design has been modeled in different frameworks such as interval analysis, fuzzy set and possibility theories, random set theory and imprecise probability theory. The authors of this paper have for many years been developing new imprecise reliability models and generalizing conventional ones to imprecise probabilities. The theoretical setup employed for this purpose is imprecise statistical reasoning (Walley 1991), whose general framework is provided by upper and lower previsions (expectations). The appeal of this theory is its ability to capture both aleatory (stochastic) and epistemic uncertainty and the flexibility with which information can be represented. The previous research of the authors related to generalizing structural reliability models to imprecise statistical measures is summarized in Utkin & Kozine (2002) and Utkin (2004...

  5. First-passage time asymptotics over moving boundaries for random walk bridges

    NARCIS (Netherlands)

    Sloothaak, F.; Zwart, B.; Wachtel, V.

    2017-01-01

    We study the asymptotic tail probability of the first-passage time over a moving boundary for a random walk conditioned to return to zero, where the increments of the random walk have finite variance. Typically, the asymptotic tail behavior may be described through a regularly varying function with

  6. Generating variable and random schedules of reinforcement using Microsoft Excel macros.

    Science.gov (United States)

    Bancroft, Stacie L; Bourret, Jason C

    2008-01-01

    Variable reinforcement schedules are used to arrange the availability of reinforcement following varying response ratios or intervals of time. Random reinforcement schedules are subtypes of variable reinforcement schedules that can be used to arrange the availability of reinforcement at a constant probability across number of responses or time. Generating schedule values for variable and random reinforcement schedules can be difficult. The present article describes the steps necessary to write macros in Microsoft Excel that will generate variable-ratio, variable-interval, variable-time, random-ratio, random-interval, and random-time reinforcement schedule values.
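
    The same constant-probability logic the article implements as Excel macros transcribes directly to a few lines of code; a hedged Python sketch of the random-ratio and random-interval generators (the article's Fleshler-Hoffman-style variable schedules are not reproduced here):

```python
import random

def random_ratio_values(mean_ratio, n):
    """Random-ratio (RR) schedule: each response produces reinforcement with
    constant probability 1/mean_ratio, so the ratios are geometric."""
    p, values = 1.0 / mean_ratio, []
    for _ in range(n):
        k = 1
        while random.random() >= p:
            k += 1
        values.append(k)
    return values

def random_interval_values(mean_interval_s, n, tick_s=1.0):
    """Random-interval (RI) schedule: reinforcement is set up with constant
    probability tick_s / mean_interval_s in each tick of time."""
    p, values = tick_s / mean_interval_s, []
    for _ in range(n):
        t = tick_s
        while random.random() >= p:
            t += tick_s
        values.append(t)
    return values

random.seed(0)
print(random_ratio_values(5, 10))        # e.g. ten RR 5 schedule values
print(random_interval_values(30, 10))    # e.g. ten RI 30-s schedule values
```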

  7. Associations Among Religiousness and Community Volunteerism in National Random Samples of American Adults.

    Science.gov (United States)

    Haggard, Megan C; Kang, Linda L; Rowatt, Wade C; Shen, Megan Johnson

    2015-01-01

    The connection between religiousness and volunteering for the community can be explained through two distinct features of religion. First, religious organizations are social groups that encourage members to help others through planned opportunities. Second, helping others is regarded as an important value for members in religious organizations to uphold. We examined the relationship between religiousness and self-reported community volunteering in two independent national random surveys of American adults (i.e., the 2005 and 2007 waves of the Baylor Religion Survey). In both waves, frequency of religious service attendance was associated with an increase in likelihood that individuals would volunteer, whether through their religious organization or not, whereas frequency of reading sacred texts outside of religious services was associated with an increase in likelihood of volunteering only for or through their religious organization. The role of religion in community volunteering is discussed in light of these findings.

  8. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  9. A Note on the Tail Behavior of Randomly Weighted Sums with Convolution-Equivalently Distributed Random Variables

    Directory of Open Access Journals (Sweden)

    Yang Yang

    2013-01-01

    Full Text Available We investigate the tail asymptotic behavior of randomly weighted sums whose increments have convolution-equivalent distributions. Our result can be directly applied to a discrete-time insurance risk model with insurance and financial risks, to derive the asymptotics for the finite-time ruin probability of that model.

  10. Statistics of light deflection in a random two-phase medium

    International Nuclear Information System (INIS)

    Sviridov, A P

    2007-01-01

    The statistics of the angles of light deflection during its propagation in a random two-phase medium with randomly oriented phase interfaces is considered within the framework of geometrical optics. The probabilities of finding a randomly walking photon in different phases of the inhomogeneous medium are calculated. Analytic expressions are obtained for the scattering phase function and the scattering phase matrix which relates the Stokes vector of the incident light beam with the Stokes vectors of deflected beams. (special issue devoted to multiple radiation scattering in random media)

  11. KiVa Antibullying Program: Overview of Evaluation Studies Based on a Randomized Controlled Trial and National Rollout in Finland

    Directory of Open Access Journals (Sweden)

    Christina Salmivalli

    2012-12-01

    Full Text Available The effects of a Finnish national school-based antibullying program (KiVa) were evaluated in a randomized controlled trial (2007–2009) and during nationwide implementation (since 2009). The KiVa program has been found to reduce bullying and victimization and to increase empathy towards victimized peers and self-efficacy to support and defend them. KiVa increases school liking and motivation and contributes to significant reductions in anxiety, depression, and negative peer perceptions. Somewhat larger reductions in bullying and victimization were found in the randomized controlled trial than in the broad rollout, and the largest effects were obtained in primary school (grades 1–6). The uptake of the KiVa program is remarkable, with 90 percent of Finnish comprehensive schools currently registered as program users.

  12. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied to statistical analysis.

  13. Transition probabilities of Ce I obtained from Boltzmann analysis of visible and near-infrared emission spectra

    Science.gov (United States)

    Nitz, D. E.; Curry, J. J.; Buuck, M.; DeMann, A.; Mitchell, N.; Shull, W.

    2018-02-01

    We report radiative transition probabilities for 5029 emission lines of neutral cerium within the wavelength range 417-1110 nm. Transition probabilities for only 4% of these lines have been previously measured. These results are obtained from a Boltzmann analysis of two high resolution Fourier transform emission spectra used in previous studies of cerium, obtained from the digital archives of the National Solar Observatory at Kitt Peak. The transition probabilities used for the Boltzmann analysis are those published by Lawler et al (2010 J. Phys. B: At. Mol. Opt. Phys. 43 085701). Comparisons of branching ratios and transition probabilities for lines common to the two spectra provide important self-consistency checks and test for the presence of self-absorption effects. Estimated 1σ uncertainties for our transition probability results range from 10% to 18%.
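
    The Boltzmann analysis rests on the optically thin emission relation I ∝ (g_u A / λ) exp(−E_u / kT): a Boltzmann plot of the lines with known A-values fixes the excitation temperature, which then converts measured intensities of all other lines into A-values. A sketch with entirely hypothetical line data (chosen to be consistent with T ≈ 4000 K):

```python
import numpy as np

# Hypothetical lines of one species: relative intensity I, wavelength
# lambda (nm), upper-level energy E_u (cm^-1) and degeneracy g_u, plus
# reference A-values (s^-1) assumed known for these lines.
I     = np.array([1054.0, 271.0, 7.2, 0.44])
lam   = np.array([450.0, 520.0, 610.0, 880.0])
E_u   = np.array([15000.0, 18000.0, 22000.0, 26000.0])
g_u   = np.array([5.0, 7.0, 3.0, 5.0])
A_ref = np.array([2.1e7, 1.3e7, 4.0e6, 9.0e5])

# Boltzmann plot: ln(I * lambda / (g_u * A)) is linear in E_u with slope
# -1/(kT) for optically thin emission.
k = 0.6950                     # Boltzmann constant in cm^-1 / K
y = np.log(I * lam / (g_u * A_ref))
slope, intercept = np.polyfit(E_u, y, 1)
T = -1.0 / (k * slope)
print(f"excitation temperature ~ {T:.0f} K")

# With T fixed, any other measured line yields a transition probability:
#   A = I * lambda * exp(E_u / (k T)) / (g_u * exp(intercept))
```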

  14. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  15. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  16. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  17. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  18. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  19. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about an order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large, even for changes in the increasing direction. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  20. Probability of atrial fibrillation after ablation: Using a parametric nonlinear temporal decomposition mixed effects model.

    Science.gov (United States)

    Rajeswaran, Jeevanantham; Blackstone, Eugene H; Ehrlinger, John; Li, Liang; Ishwaran, Hemant; Parides, Michael K

    2018-01-01

    Atrial fibrillation is an arrhythmic disorder where the electrical signals of the heart become irregular. The probability of atrial fibrillation (binary response) is often time varying in a structured fashion, as is the influence of associated risk factors. A generalized nonlinear mixed effects model is presented to estimate the time-related probability of atrial fibrillation using a temporal decomposition approach to reveal the pattern of the probability of atrial fibrillation and their determinants. This methodology generalizes to patient-specific analysis of longitudinal binary data with possibly time-varying effects of covariates and with different patient-specific random effects influencing different temporal phases. The motivation and application of this model is illustrated using longitudinally measured atrial fibrillation data obtained through weekly trans-telephonic monitoring from an NIH sponsored clinical trial being conducted by the Cardiothoracic Surgery Clinical Trials Network.

  1. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  2. On Origin of Power-Law Distributions in Self-Organized Criticality from Random Walk Treatment

    International Nuclear Information System (INIS)

    Cao Xiaofeng; Deng Zongwei; Yang Chunbin

    2008-01-01

    The origin of power-law distributions in self-organized criticality is investigated by treating the variation of the number of active sites in the system as a stochastic process. An avalanche is then regarded as a first-return random walk process on a one-dimensional lattice. We assume that the variation of the number of active sites has three possibilities in each update: to increase by 1 with probability f_1, to decrease by 1 with probability f_2, or to remain unchanged with probability 1 - f_1 - f_2. This mimics the dynamics in the system. Power-law distributions of the lifetime are found when the random walk is unbiased, with equal probability to move in opposite directions. This shows that power-law distributions in self-organized criticality may be caused by the balance of competitive interactions.
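
    To make the first-return picture concrete, the following Python sketch (my own illustration, not from the paper) simulates the active-site count as a lattice walk with the update probabilities f_1 and f_2 described above and records avalanche lifetimes; in the unbiased case f_1 = f_2, the lifetime histogram should approach the familiar t^(-3/2) first-return power law. All parameter values are illustrative.

        import random
        from collections import Counter

        def avalanche_lifetime(f1, f2, max_steps=10**4):
            # Number of active sites starts at 1; the avalanche ends on first return to 0.
            n, t = 1, 0
            while n > 0 and t < max_steps:
                r = random.random()
                if r < f1:
                    n += 1          # activate a site with probability f1
                elif r < f1 + f2:
                    n -= 1          # deactivate a site with probability f2
                t += 1              # with probability 1 - f1 - f2 nothing changes
            return t

        # Unbiased case f1 == f2: lifetimes should be power-law distributed.
        trials = 10**4
        lifetimes = Counter(avalanche_lifetime(0.3, 0.3) for _ in range(trials))
        for t in sorted(lifetimes)[:10]:
            print(t, lifetimes[t] / trials)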

  3. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a constructed numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large, a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication.
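
    The contrast drawn above between mean plus-or-minus standard deviation and a central confidence interval is easy to reproduce. The short Python sketch below (with illustrative parameter values, not taken from the paper) evaluates both summaries for a wide lognormal distribution; note how mean - sd falls below zero even though the variable is strictly positive, the anomaly mentioned in the abstract.

        import math

        # Lognormal with underlying normal parameters mu, sigma (hypothetical values).
        mu, sigma = 0.0, 1.5          # large sigma -> large uncertainty

        mean   = math.exp(mu + sigma**2 / 2)                 # arithmetic mean
        sd     = mean * math.sqrt(math.exp(sigma**2) - 1)    # standard deviation
        median = math.exp(mu)                                # 50th percentile
        mode   = math.exp(mu - sigma**2)

        # Symmetric 95% interval in probability content: exp(mu +/- 1.96*sigma).
        lo, hi = math.exp(mu - 1.96 * sigma), math.exp(mu + 1.96 * sigma)

        print(f"mean = {mean:.2f}, sd = {sd:.2f}")           # mean - sd < 0 here,
        print(f"median = {median:.2f}, mode = {mode:.2f}")   # although the variable
        print(f"95% interval: [{lo:.2f}, {hi:.2f}]")         # is strictly positive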

  4. Emission probability determination of {sup 133}Ba by the sum-peak method

    Energy Technology Data Exchange (ETDEWEB)

    Silva, R.L. da; Almeida, M.C.M. de; Delgado, J.U.; Poledna, R.; Araujo, M.T.F.; Trindade, O.L.; Veras, E.V. de; Santos, A.; Rangel, J.; Ferreira Filho, A.L., E-mail: ronaldo@ird.gov.br, E-mail: marcandida@yahoo.com.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2016-07-01

    The National Laboratory for Metrology of Ionizing Radiation (LNMRI/IRD/CNEN) maintains several measurement methods in order to ensure low uncertainties in its results. Using gamma spectrometry analysis by the absolute sum-peak method, the standardization of {sup 133}Ba activity and the determination of its emission probabilities at different energies were performed with reduced uncertainties. The advantages of radionuclide calibration by an absolute method are accuracy, low uncertainties, and the fact that no radionuclide reference standards are required. {sup 133}Ba is used in research laboratories for detector calibration in different work areas. The uncertainties for the activity and for the emission probability results are lower than 1%. (author)

  5. Routing in Networks with Random Topologies

    Science.gov (United States)

    Bambos, Nicholas

    1997-01-01

    We examine the problems of routing and server assignment in networks with random connectivities. In such a network the basic topology is fixed, but during each time slot each server (node) is either connected to or disconnected from each of its input queues with some probability.

  6. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

    An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)
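
    The effect of the prior choice can be illustrated with a small numerical sketch. The Python fragment below (my own illustration, with a hypothetical particle count) forms the posterior over the expected number of collected particles w on a grid, combining a Poisson likelihood with either a flat prior or the prior proportional to w^(-2/3) suggested above.

        import numpy as np

        k = 5                                    # particles counted in one measurement (hypothetical)
        w, dw = np.linspace(0.01, 40, 4000, retstep=True)   # grid of expected particle numbers

        likelihood = np.exp(-w) * w**k           # Poisson likelihood, constant factors dropped
        for prior, label in [(np.ones_like(w), "uniform prior"),
                             (w**(-2.0/3.0),   "prior ~ w^(-2/3)")]:
            post = likelihood * prior
            post /= post.sum() * dw              # normalize numerically on the grid
            print(label, "-> posterior mean of w:", round((w * post).sum() * dw, 2))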

  7. A novel frequent probability pattern mining algorithm based on a circuit simulation method in uncertain biological networks

    Science.gov (United States)

    2014-01-01

    Background Motif mining has always been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented by a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. Methods In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism test combines the analysis of circuit topology structure with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs, and thereby avoids the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results The experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in the experiments published in other related papers. Conclusions The algorithm of probability graph isomorphism

  8. National Coral Reef Monitoring Program: Benthic Cover Derived from Analysis of Benthic Images Collected during Stratified Random Surveys (StRS) across American Samoa in 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here result from benthic photo-quadrat surveys conducted along transects at stratified random sites across American Samoa in 2015 as a part of...

  9. Probability density function evolution of power systems subject to stochastic variation of renewable energy

    Science.gov (United States)

    Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.

    2018-05-01

    As renewable energies are increasingly integrated into power systems, there is increasing interest in the stochastic analysis of power systems. Better techniques are needed to account for the uncertainty caused by the penetration of renewables and to analyse its impact on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) of the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. This equation is solved numerically, with special measures taken so that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered for a single machine infinite bus power system. The numerical analysis agrees with Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
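
    As a sanity check of the SDE-to-stationary-PDF route described above, the sketch below integrates a simple linear SDE (an Ornstein-Uhlenbeck process, used here only as a stand-in for the machine dynamics, not the paper's model) with the Euler-Maruyama scheme and compares the sample variance with the exact stationary variance implied by the corresponding FPK equation. All values are illustrative.

        import numpy as np

        # Euler-Maruyama simulation of dX = -theta*X dt + sigma dW; the FPK
        # equation gives a Gaussian stationary PDF with variance sigma^2/(2*theta).
        rng = np.random.default_rng(1)
        theta, sigma, dt, n = 1.0, 0.5, 1e-3, 500_000

        noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
        x, samples = 0.0, np.empty(n)
        for i in range(n):
            x += -theta * x * dt + noise[i]
            samples[i] = x

        print("sample variance:", samples.var())
        print("FPK prediction :", sigma**2 / (2 * theta))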

  10. Random walks and polygons in tight confinement

    International Nuclear Information System (INIS)

    Diao, Y; Ernst, C; Ziegler, U

    2014-01-01

    We discuss the effect of confinement on the topology and geometry of tightly confined random walks and polygons. Here the walks and polygons are confined in a sphere of radius R ≥ 1/2 and the polygons are equilateral with n edges of unit length. We illustrate numerically that for a fixed length of random polygons the knotting probability increases to one as the radius decreases to 1/2. We also demonstrate that for random polygons (walks) the curvature increases to πn (π(n – 1)) as the radius approaches 1/2 and that the torsion decreases to ≈ πn/3 (≈ π(n – 1)/3). In addition, we show the effect of length and confinement on the average crossing number of a random polygon.

  11. Evolution of a Modified Binomial Random Graph by Agglomeration

    Science.gov (United States)

    Kang, Mihyun; Pachon, Angelica; Rodríguez, Pablo M.

    2018-02-01

    In the classical Erdős-Rényi random graph G(n, p) there are n vertices and each of the possible edges is independently present with probability p. The random graph G(n, p) is homogeneous in the sense that all vertices have the same characteristics. On the other hand, numerous real-world networks are inhomogeneous in this respect. Such an inhomogeneity of vertices may influence the connection probability between pairs of vertices. The purpose of this paper is to propose a new inhomogeneous random graph model which is obtained in a constructive way from the Erdős-Rényi random graph G(n, p). Given a configuration of n vertices arranged in N subsets of vertices (we call each subset a super-vertex), we define a random graph with N super-vertices by letting two super-vertices be connected if and only if there is at least one edge between them in G(n, p). Our main result concerns the threshold for connectedness. We also analyze the phase transition for the emergence of the giant component and the degree distribution. Even though our model begins with G(n, p), it assumes the existence of some community structure encoded in the configuration. Furthermore, under certain conditions it exhibits a power law degree distribution. Both properties are important for real-world applications.
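
    A direct consequence of the construction is that two super-vertices containing a and b vertices are adjacent with probability 1 - (1 - p)^(ab), since all ab potential G(n, p) edges must be absent for them to stay disconnected. The Python sketch below (illustrative sizes and p, my own example) samples the agglomerated graph directly from this edge probability.

        import random

        def agglomerated_graph(sizes, p, seed=0):
            # Connect super-vertices i, j with probability 1 - (1-p)**(s_i * s_j),
            # i.e. iff at least one G(n, p) edge joins the two vertex subsets.
            rng = random.Random(seed)
            edges = []
            for i in range(len(sizes)):
                for j in range(i + 1, len(sizes)):
                    if rng.random() < 1 - (1 - p) ** (sizes[i] * sizes[j]):
                        edges.append((i, j))
            return edges

        # Hypothetical configuration: 6 communities of varying size, sparse base graph.
        print(agglomerated_graph([1, 2, 4, 8, 16, 32], p=0.01))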

  12. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so the use of transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to real states and partly to virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, an observer's view of the physical properties of real particles varies continuously with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  13. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subject to various types of uncertainties, and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. First, the fuzziness is discretized by using the α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are reduced to random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. Then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in brakes, and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal under uncertainty.

  14. Improving biobank consent comprehension: a national randomized survey to assess the effect of a simplified form and review/retest intervention

    OpenAIRE

    Beskow, Laura M.; Lin, Li; Dombeck, Carrie B.; Gao, Emily; Weinfurt, Kevin P.

    2016-01-01

    Purpose: To determine the individual and combined effects of a simplified form and a review/retest intervention on biobanking consent comprehension. Methods: We conducted a national online survey in which participants were randomized within four educational strata to review a simplified or traditional consent form. Participants then completed a comprehension quiz; for each item answered incorrectly, they reviewed the corresponding consent form section and answered another quiz item on that to...

  15. Verbal versus Numerical Probabilities: Does Format Presentation of Probabilistic Information regarding Breast Cancer Screening Affect Women's Comprehension?

    Science.gov (United States)

    Vahabi, Mandana

    2010-01-01

    Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…

  16. Palm theory for random time changes

    Directory of Open Access Journals (Sweden)

    Masakiyo Miyazawa

    2001-01-01

    Palm distributions are basic tools when studying stationarity in the context of point processes, queueing systems, fluid queues or random measures. The framework varies with the random phenomenon of interest, but usually a one-dimensional group of measure-preserving shifts is the starting point. In the present paper, by alternatively using a framework involving random time changes (RTCs) and a two-dimensional family of shifts, we are able to characterize all of the above systems in a single framework. Moreover, this leads to what we call the detailed Palm distribution (DPD), which is stationary with respect to a certain group of shifts. The DPD has a very natural interpretation as the distribution seen at a randomly chosen position on the extended graph of the RTC, and satisfies a general duality criterion: the DPD of the DPD gives the underlying probability P in return.

  17. A Repetition Test for Pseudo-Random Number Generators

    OpenAIRE

    Gil, Manuel; Gonnet, Gaston H.; Petersen, Wesley P.

    2017-01-01

    A new statistical test for uniform pseudo-random number generators (PRNGs) is presented. The idea is that a sequence of pseudo-random numbers should have numbers reappear with a certain probability. The expected time until a repetition occurs provides the metric for the test. For linear congruential generators (LCGs), failure can be shown theoretically. Empirical test results for a number of commonly used PRNGs are reported, showing that some PRNGs considered to have good statistical propert...
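
    The idea can be sketched in a few lines of Python (an illustration of the principle, not the authors' implementation): for a generator with M equally likely outputs, the birthday problem gives an expected time to the first repetition of roughly sqrt(pi*M/2), and a generator can be compared against that benchmark.

        import math
        import random

        def time_to_first_repeat(draw):
            # Draw values until one reappears; return how many draws that took.
            seen, t = set(), 0
            while True:
                t += 1
                x = draw()
                if x in seen:
                    return t
                seen.add(x)

        M = 2**24                                # size of the discrete output space
        trials = 200
        mean_t = sum(time_to_first_repeat(lambda: random.randrange(M))
                     for _ in range(trials)) / trials
        print("observed mean repetition time:", mean_t)
        print("birthday-problem prediction  :", math.sqrt(math.pi * M / 2))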

  18. Designing a national soil erosion monitoring network for England and Wales

    Science.gov (United States)

    Lark, Murray; Rawlins, Barry; Anderson, Karen; Evans, Martin; Farrow, Luke; Glendell, Miriam; James, Mike; Rickson, Jane; Quine, Timothy; Quinton, John; Brazier, Richard

    2014-05-01

    Although soil erosion is recognised as a significant threat to sustainable land use and may be a priority for action in any forthcoming EU Soil Framework Directive, those responsible for setting national policy with respect to erosion are constrained by a lack of robust, representative data at large spatial scales. This reflects the process-orientated nature of much soil erosion research. Recognising this limitation, the UK Department for Environment, Food and Rural Affairs (Defra) established a project to pilot a cost-effective framework for monitoring of soil erosion in England and Wales (E&W). The pilot will compare different soil erosion monitoring methods at a site scale and provide statistical information for the final design of the full national monitoring network that will: (i) provide unbiased estimates of the spatial mean of soil erosion rate across E&W (tonnes ha^-1 yr^-1) for each of three land-use classes (arable and horticultural; grassland; upland and semi-natural habitats); and (ii) quantify the uncertainty of these estimates with confidence intervals. Probability (design-based) sampling provides the most efficient unbiased estimates of spatial means. In this study, a 16 hectare area (a square of 400 x 400 m) positioned at the centre of a 1-km grid cell, selected at random from mapped land use across E&W, provided the sampling support for measurement of erosion rates, with at least 94% of the support area corresponding to the target land use classes. Very small or zero erosion rates, likely to be encountered at many sites, reduce the sampling efficiency and make it difficult to compare different methods of soil erosion monitoring. Therefore, to increase the proportion of samples with larger erosion rates without biasing our estimates, we increased the inclusion probability density in areas where the erosion rate is likely to be large by using stratified random sampling. First, each sampling domain (land use class in E&W) was divided into strata; e.g. two sub
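
    The design-based estimate behind such a stratified scheme is the standard one: weight each stratum mean by its areal share and combine the within-stratum sampling variances. Below is a minimal Python sketch with entirely hypothetical stratum weights and sample values.

        import math
        from statistics import mean, stdev

        # Hypothetical strata: (area weight W_h, sampled erosion rates in t ha^-1 yr^-1)
        strata = {
            "arable":    (0.35, [0.0, 0.2, 1.4, 0.0, 3.1, 0.5]),
            "grassland": (0.45, [0.0, 0.0, 0.1, 0.3, 0.0, 0.0]),
            "upland":    (0.20, [0.4, 0.0, 2.2, 0.9, 0.0, 1.1]),
        }

        # Stratified estimator: sum_h W_h * ybar_h, with variance sum_h W_h^2 s_h^2 / n_h
        est = sum(W * mean(y) for W, y in strata.values())
        var = sum(W**2 * stdev(y)**2 / len(y) for W, y in strata.values())
        half = 1.96 * math.sqrt(var)
        print(f"mean erosion rate: {est:.2f} +/- {half:.2f} t ha^-1 yr^-1 (95% CI)")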

  19. A Campbell random process

    International Nuclear Information System (INIS)

    Reuss, J.D.; Misguich, J.H.

    1993-02-01

    The Campbell process is a stationary random process which can have various correlation functions, according to the choice of an elementary response function. The statistical properties of this process are presented. A numerical algorithm and a subroutine for generating such a process are built and tested for the physically interesting case of a Campbell process with Gaussian correlations. The (non-Gaussian) probability distribution appears to be similar to the Gamma distribution.
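
    A Campbell (shot-noise) process is simple to generate directly: draw Poisson arrival times and superpose a copy of the elementary response function at each arrival. The sketch below is my own illustration, not the paper's subroutine, and all parameters are made up; it uses a Gaussian response, so the resulting process has Gaussian correlations, and checks the sample mean against Campbell's theorem (mean = rate times the integral of the response).

        import numpy as np

        rng = np.random.default_rng(0)
        rate, T, dt = 5.0, 200.0, 0.01            # pulse rate, record length, time step
        tau = 0.5                                 # width of the Gaussian response

        t = np.arange(0.0, T, dt)
        arrivals = rng.uniform(0.0, T, rng.poisson(rate * T))   # Poisson arrival times

        # Superpose one response per arrival: X(t) = sum_k h(t - t_k)
        x = np.zeros_like(t)
        for tk in arrivals:
            x += np.exp(-((t - tk) ** 2) / (2 * tau**2))

        # Campbell's theorem: E[X] = rate * integral of h = rate * tau * sqrt(2*pi)
        # (sample mean is slightly lower because of boundary effects).
        print("sample mean:", x.mean(), "  theory:", rate * tau * np.sqrt(2 * np.pi))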

  20. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  1. Randomized Prediction Games for Adversarial Machine Learning.

    Science.gov (United States)

    Rota Bulo, Samuel; Biggio, Battista; Pillai, Ignazio; Pelillo, Marcello; Roli, Fabio

    In spam and malware detection, attackers exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time, e.g., malware code is typically obfuscated using random strings or byte sequences to hide known exploits. Interestingly, randomization has also been proposed to improve security of learning algorithms against evasion attacks, as it results in hiding information about the classifier to the attacker. Recent work has proposed game-theoretical formulations to learn secure classifiers, by simulating different evasion attacks and modifying the classification function accordingly. However, both the classification function and the simulated data manipulations have been modeled in a deterministic manner, without accounting for any form of randomization. In this paper, we overcome this limitation by proposing a randomized prediction game, namely, a noncooperative game-theoretic formulation in which the classifier and the attacker make randomized strategy selections according to some probability distribution defined over the respective strategy set. We show that our approach allows one to improve the tradeoff between attack detection and false alarms with respect to the state-of-the-art secure classifiers, even against attacks that are different from those hypothesized during design, on application examples including handwritten digit recognition, spam, and malware detection.

  2. Superparamagnetic perpendicular magnetic tunnel junctions for true random number generators

    Science.gov (United States)

    Parks, Bradley; Bapna, Mukund; Igbokwe, Julianne; Almasi, Hamid; Wang, Weigang; Majetich, Sara A.

    2018-05-01

    Superparamagnetic perpendicular magnetic tunnel junctions are fabricated and analyzed for use in random number generators. Time-resolved resistance measurements are used as streams of bits in statistical tests for randomness. Voltage control of the thermal stability enables tuning the average speed of random bit generation up to 70 kHz in a 60 nm diameter device. In its most efficient operating mode, the device generates random bits at an energy cost of 600 fJ/bit. A narrow range of magnetic field tunes the probability of a given state from 0 to 1, offering a means of probabilistic computing.

  3. Collocation methods for uncertainty quantification in PDE models with random data

    KAUST Repository

    Nobile, Fabio

    2014-01-06

    In this talk we consider Partial Differential Equations (PDEs) whose input data are modeled as random fields to account for their intrinsic variability or our lack of knowledge. After parametrizing the input random fields by finitely many independent random variables, we exploit the high regularity of the solution of the PDE as a function of the input random variables and consider sparse polynomial approximations in probability (Polynomial Chaos expansion) by collocation methods. We first address interpolatory approximations, where the PDE is solved on a sparse grid of Gauss points in the probability space and the solutions thus obtained are interpolated by multivariate polynomials. We present recent results on optimized sparse grids in which the selection of points is based on a knapsack approach and relies on sharp estimates of the decay of the coefficients of the polynomial chaos expansion of the solution. Secondly, we consider regression approaches, where the PDE is evaluated on randomly chosen points in the probability space and a polynomial approximation is constructed by the least squares method. We present recent theoretical results on the stability and optimality of the approximation under suitable conditions relating the number of sampling points and the dimension of the polynomial space. In particular, we show that for uniform random variables, the number of sampling points has to scale quadratically with the dimension of the polynomial space to maintain the stability and optimality of the approximation. Numerical results show that this condition is sharp in the monovariate case but seems to be over-constraining in higher dimensions. The regression technique therefore seems to be attractive in higher dimensions.

  4. Mesoscopic description of random walks on combs

    Science.gov (United States)

    Méndez, Vicenç; Iomin, Alexander; Campos, Daniel; Horsthemke, Werner

    2015-12-01

    Combs are a simple caricature of various types of natural branched structures, which belong to the category of loopless graphs and consist of a backbone and branches. We study continuous time random walks on combs and present a generic method to obtain their transport properties. The random walk along the branches may be biased, and we account for the effect of the branches by renormalizing the waiting time probability distribution function for the motion along the backbone. We analyze the overall diffusion properties along the backbone and find normal diffusion, anomalous diffusion, and stochastic localization (diffusion failure), respectively, depending on the characteristics of the continuous time random walk along the branches, and compare our analytical results with stochastic simulations.

  5. Outage Probability Analysis in Power-Beacon Assisted Energy Harvesting Cognitive Relay Wireless Networks

    Directory of Open Access Journals (Sweden)

    Ngoc Phuc Le

    2017-01-01

    We study the performance of the secondary relay system in a power-beacon (PB) assisted energy harvesting cognitive relay wireless network. In our system model, a secondary source node and a relay node first harvest energy from distributed PBs. Then, the source node transmits its data to the destination node with the help of the relay node. The fading coefficients of the links from the PBs to the source node and relay node are assumed to be independent but not necessarily identically distributed (i.n.i.d.) Nakagami-m random variables. We derive exact expressions for the power outage probability and the channel outage probability. Based on these, we analyze the total outage probability of the secondary relay system. Asymptotic analysis is also performed, which provides insights into the system behavior. Moreover, we evaluate the impact of the primary network on the performance of the secondary network with respect to the tolerable interference threshold at the primary receiver as well as the interference introduced by the primary transmitter at the secondary source and relay nodes. Simulation results are provided to validate the analysis.

  6. Fast Outage Probability Simulation for FSO Links with a Generalized Pointing Error Model

    KAUST Repository

    Ben Issaid, Chaouki

    2017-02-07

    Over the past few years, free-space optical (FSO) communication has gained significant attention. In fact, FSO can provide cost-effective and unlicensed links, with high-bandwidth capacity and low error rate, making it an exciting alternative to traditional wireless radio-frequency communication systems. However, the system performance is affected not only by the presence of atmospheric turbulence, which occurs due to random fluctuations in the air refractive index, but also by the existence of pointing errors. Metrics such as the outage probability, which quantifies the probability that the instantaneous signal-to-noise ratio is smaller than a given threshold, can be used to analyze the performance of this system. In this work, we consider weak and strong turbulence regimes, and we study the outage probability of an FSO communication system under a generalized pointing error model with both a nonzero boresight component and different horizontal and vertical jitter effects. More specifically, we use an importance sampling approach, based on the exponential twisting technique, to offer fast and accurate results.
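
    The core of the exponential-twisting idea can be shown on a toy problem (a generic sketch, not the paper's FSO channel model): to estimate a deep-tail probability P(Z < a) for a standard normal, sample from a twisted density, which for a Gaussian amounts to a mean shift to a, and undo the bias with the likelihood ratio.

        import math
        import random

        # Exponential twisting of a Gaussian is a mean shift: sample from N(a, 1)
        # and reweight by the likelihood ratio phi(z)/phi_a(z) = exp(-a*z + a*a/2).
        a, n = -5.0, 100_000
        acc = 0.0
        for _ in range(n):
            z = random.gauss(a, 1.0)             # twisted (shifted) proposal
            if z < a:
                acc += math.exp(-a * z + a * a / 2)
        print("IS estimate :", acc / n)
        print("exact value :", 0.5 * math.erfc(-a / math.sqrt(2)))   # about 2.9e-7

    With plain Monte Carlo, essentially none of the 100,000 samples would land in the tail; the shifted proposal places about half of them there, which is why the estimator converges quickly.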

  7. A drawback and an improvement of the classical Weibull probability plot

    International Nuclear Information System (INIS)

    Jiang, R.

    2014-01-01

    The classical Weibull Probability Paper (WPP) plot has been widely used to identify a model for fitting a given dataset. It is based on a match in shape between the WPP plots of the model and of the data. This paper carries out an analysis of the Weibull transformations that create the WPP plot and shows that the shape of the WPP plot of data randomly generated from a distribution model can be significantly different from the shape of the WPP plot of the model, due to the high non-linearity of the Weibull transformations. As such, choosing a model based on the shape of the WPP plot of the data can be unreliable. A cdf-based weighted least squares method is proposed to improve the parameter estimation accuracy, and an improved WPP plot is suggested to avoid the drawback of the classical WPP plot. The appropriateness and usefulness of the proposed estimation method and probability plot are illustrated by simulation and real-world examples.
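
    For reference, the classical WPP construction itself is two logarithmic transformations: with an empirical CDF estimate F at the sorted data, plot ln(-ln(1 - F)) against ln x; a two-parameter Weibull sample then falls on a straight line whose slope is the shape parameter. A short Python sketch on synthetic data (illustrative values, my own example):

        import numpy as np

        rng = np.random.default_rng(3)
        shape, scale = 1.8, 10.0
        x = np.sort(scale * rng.weibull(shape, 200))    # synthetic Weibull sample

        i = np.arange(1, x.size + 1)
        F = (i - 0.3) / (x.size + 0.4)                  # median-rank plotting positions
        wx, wy = np.log(x), np.log(-np.log(1.0 - F))    # the two Weibull transformations

        beta, c = np.polyfit(wx, wy, 1)                 # straight line on the WPP plot
        print("estimated shape:", beta, " estimated scale:", np.exp(-c / beta))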

  8. A Markov random field approach for microstructure synthesis

    International Nuclear Information System (INIS)

    Kumar, A; Nguyen, L; DeGraef, M; Sundararaghavan, V

    2016-01-01

    We test the notion that many microstructures have an underlying stationary probability distribution. The stationary probability distribution is ubiquitous: we know that different windows taken from a polycrystalline microstructure are generally ‘statistically similar’. To enable computation of such a probability distribution, microstructures are represented in the form of undirected probabilistic graphs called Markov Random Fields (MRFs). In the model, pixels take up integer or vector states and interact with multiple neighbors over a window. Using this lattice structure, algorithms are developed to sample the conditional probability density for the state of each pixel given the known states of its neighboring pixels. The sampling is performed using reference experimental images. 2D microstructures are artificially synthesized using the sampled probabilities. Statistical features such as grain size distribution and autocorrelation functions closely match with those of the experimental images. The mechanical properties of the synthesized microstructures were computed using the finite element method and were also found to match the experimental values. (paper)

  9. Analogies between colored Lévy noise and random channel approach to disordered kinetics

    Science.gov (United States)

    Vlad, Marcel O.; Velarde, Manuel G.; Ross, John

    2004-02-01

    We point out some interesting analogies between colored Lévy noise and the random channel approach to disordered kinetics. These analogies are due to the fact that the probability density of the Lévy noise source plays a similar role as the probability density of rate coefficients in disordered kinetics. Although the equations for the two approaches are not identical, the analogies can be used for deriving new, useful results for both problems. The random channel approach makes it possible to generalize the fractional Uhlenbeck-Ornstein processes (FUO) for space- and time-dependent colored noise. We describe the properties of colored noise in terms of characteristic functionals, which are evaluated by using a generalization of Huber's approach to complex relaxation [Phys. Rev. B 31, 6070 (1985)]. We start out by investigating the properties of symmetrical white noise and then define the Lévy colored noise in terms of a Langevin equation with a Lévy white noise source. We derive exact analytical expressions for the various characteristic functionals, which characterize the noise, and a functional fractional Fokker-Planck equation for the probability density functional of the noise at a given moment in time. Second, by making an analogy between the theory of colored noise and the random channel approach to disordered kinetics, we derive fractional equations for the evolution of the probability densities of the random rate coefficients in disordered kinetics. These equations serve as a basis for developing methods for the evaluation of the statistical properties of the random rate coefficients from experimental data. Special attention is paid to the analysis of systems for which the observed kinetic curves can be described by linear or nonlinear stretched exponential kinetics.

  10. Properties and simulation of α-permanental random fields

    DEFF Research Database (Denmark)

    Møller, Jesper; Rubak, Ege Holger

    An α-permanental random field is, briefly speaking, a model for a collection of random variables with positive associations, where α is a positive number and the probability generating function is given in terms of a covariance or more general function, so that density and moment expressions are given by certain α-permanents. Though such models possess many appealing probabilistic properties, many statisticians seem unaware of α-permanental random fields and their potential applications. The purpose of this paper is first to summarize useful probabilistic results using the simplest possible setting, and second to study stochastic constructions and simulation techniques, which should provide a useful basis for discussing the statistical aspects in future work. The paper also discusses some examples of α-permanental random fields.

  11. Multiple Scattering in Random Mechanical Systems and Diffusion Approximation

    Science.gov (United States)

    Feres, Renato; Ng, Jasmine; Zhang, Hong-Kun

    2013-10-01

    This paper is concerned with stochastic processes that model multiple (or iterated) scattering in classical mechanical systems of billiard type, defined below. From a given (deterministic) system of billiard type, a random process with transition probability operator P is introduced by assuming that some of the dynamical variables are random with prescribed probability distributions. Of particular interest are systems with weak scattering, which are associated to parametric families of operators P_h, depending on a geometric or mechanical parameter h, that approach the identity as h goes to 0. It is shown that (P_h - I)/h converges for small h to a second order elliptic differential operator L on compactly supported functions, and that the Markov chain process associated to P_h converges to a diffusion with infinitesimal generator L. Both P_h and L are self-adjoint, (densely) defined on the space of functions over the (lower) half-space that are square-integrable with respect to a stationary measure η. This measure's density is either the (post-collision) Maxwell-Boltzmann distribution or the Knudsen cosine law, and the random processes with infinitesimal generator L respectively correspond to what we call MB diffusion and (generalized) Legendre diffusion. Concrete examples of simple mechanical systems are given and illustrated by numerically simulating the random processes.

  12. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  13. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range [0, 1] is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics.

  14. Generating random networks and graphs

    CERN Document Server

    Coolen, Ton; Roberts, Ekaterina

    2017-01-01

    This book supports researchers who need to generate random networks, or who are interested in the theoretical study of random graphs. The coverage includes exponential random graphs (where the targeted probability of each network appearing in the ensemble is specified), growth algorithms (i.e. preferential attachment and the stub-joining configuration model), special constructions (e.g. geometric graphs and Watts-Strogatz models) and graphs on structured spaces (e.g. multiplex networks). The presentation aims to be a complete starting point, including details of both theory and implementation, as well as discussions of the main strengths and weaknesses of each approach. It includes extensive references for readers wishing to go further. The material is carefully structured to be accessible to researchers from all disciplines while also containing rigorous mathematical analysis (largely based on the techniques of statistical mechanics) to support those wishing to further develop or implement the theory of rand...

  15. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  16. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(0) = 0, w(1/e) = 1/e and w(1) = 1, and which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
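
    For concreteness, Prelec's one-parameter function is easy to tabulate; the sketch below (with an α chosen arbitrarily for illustration) shows the inverse-S pattern of overweighted small probabilities and underweighted large ones, with the fixed point at p = 1/e.

        import math

        def prelec_w(p, alpha=0.65):
            # Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha)
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 0.99):
            print(f"p = {p:.3f}  w(p) = {prelec_w(p):.3f}")
        # w(1/e) = 1/e for every alpha; small p are over-weighted and large p
        # under-weighted, giving the classic inverse-S shape.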

  17. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability". The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and marketing to product discontinuation.

  18. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. Contextuality in canonical systems of random variables

    Science.gov (United States)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue 'Second quantum revolution: foundational questions'.

  20. Discrete least squares polynomial approximation with random evaluations - application to PDEs with random parameters

    KAUST Repository

    Nobile, Fabio

    2015-01-07

    We consider a general problem F(u, y) = 0, where u is the unknown solution, possibly Hilbert space valued, and y a set of uncertain parameters. We specifically address the situation in which the parameter-to-solution map u(y) is smooth, but y could be very high (or even infinite) dimensional. In particular, we are interested in cases in which F is a differential operator, u a Hilbert space valued function, and y a distributed, space- and/or time-varying, random field. We aim at reconstructing the parameter-to-solution map u(y) from random noise-free or noisy observations in random points by discrete least squares on polynomial spaces. The noise-free case is relevant whenever the technique is used to construct metamodels, based on polynomial expansions, for the output of computer experiments. In the case of PDEs with random parameters, the metamodel is then used to approximate statistics of the output quantity. We discuss the stability of discrete least squares on random points and show convergence estimates both in expectation and in probability. We also present possible strategies to select, either a priori or by adaptive algorithms, sequences of approximating polynomial spaces that allow one to reduce, and in some cases break, the curse of dimensionality.
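
    The quadratic-scaling condition mentioned above can be probed numerically. The sketch below is my own illustration with an arbitrary smooth map u(y): it fits Legendre expansions of increasing degree by least squares at roughly 2*(dimension)^2 uniform random points and reports the resulting uniform error.

        import numpy as np

        rng = np.random.default_rng(7)
        u = lambda y: np.exp(y) / (2.0 + np.sin(5.0 * y))   # smooth parameter-to-solution map

        for d in (4, 8, 16):
            m = 2 * (d + 1) ** 2            # samples scale quadratically with the space dimension
            y = rng.uniform(-1.0, 1.0, m)   # random evaluation points
            coef = np.polynomial.legendre.legfit(y, u(y), d)   # discrete least squares fit
            yy = np.linspace(-1.0, 1.0, 1001)
            err = np.max(np.abs(np.polynomial.legendre.legval(yy, coef) - u(yy)))
            print(f"degree {d:2d}, {m:4d} samples, max error {err:.2e}")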

  1. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that considers the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.

  2. Continuous state branching processes in random environment: The Brownian case

    OpenAIRE

    Palau, Sandra; Pardo, Juan Carlos

    2015-01-01

    We consider continuous state branching processes that are perturbed by a Brownian motion. These processes are constructed as the unique strong solution of a stochastic differential equation. The long-term extinction and explosion behaviours are studied. In the stable case, the extinction and explosion probabilities are given explicitly. We find three regimes for the asymptotic behaviour of the explosion probability and, as in the case of branching processes in random environment, we find five...

  3. [Conditional probability analysis between tinnitus and comorbidities in patients attending the National Rehabilitation Institute-LGII in the period 2012-2013].

    Science.gov (United States)

    Gómez Toledo, Verónica; Gutiérrez Farfán, Ileana; Verduzco-Mendoza, Antonio; Arch-Tirado, Emilio

    Tinnitus is defined as the conscious perception of a sensation of sound that occurs in the absence of an external stimulus. This audiological symptom affects 7% to 19% of the adult population. The aim of this study is to describe the comorbidities present in patients with tinnitus using joint and conditional probability analysis. Patients of both genders, diagnosed with unilateral or bilateral tinnitus, aged between 20 and 45 years, who had a full computerised medical record, were selected. Study groups were formed on the basis of the following clinical aspects: 1) audiological findings; 2) vestibular findings; 3) comorbidities such as temporomandibular dysfunction, tubal dysfunction and otosclerosis; and 4) triggering factors of tinnitus such as noise exposure, respiratory tract infection, and use of ototoxic drugs. Of the patients with tinnitus, 27 (65%) reported hearing loss, 11 (26.19%) temporomandibular dysfunction, and 11 (26.19%) vestibular disorders. The joint probability analysis found that the probability that a patient with tinnitus has hearing loss was 27/42 = 0.65, and 20/42 = 0.47 for the bilateral type. The result for P(A ∩ B) = 30%. Bayes' theorem, P(Ai|B) = P(Ai ∩ B)/P(B), was used, and various probabilities were calculated. For patients with temporomandibular dysfunction and vestibular disorders, a posterior probability of P(Ai|B) = 31.44% was calculated. Consideration should be given to the joint and conditional probability approach as a tool for the study of different pathologies. Copyright © 2016 Academia Mexicana de Cirugía A.C. Published by Masson Doyma México S.A. All rights reserved.
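
    The ratio form of Bayes' theorem used in the abstract, P(A|B) = P(A ∩ B)/P(B), is a one-line computation. The sketch below reproduces the joint estimates from the quoted counts (42 patients) and evaluates one conditional probability; the specific inputs behind the 31.44% figure are not given in the abstract, so those numbers are illustrative only.

        # Joint probabilities estimated from the quoted counts (n = 42 patients)
        n = 42
        p_tinnitus_and_hearing_loss = 27 / n     # ~0.65, as reported
        p_tinnitus_and_bilateral_hl = 20 / n     # ~0.47, as reported

        def conditional(p_joint, p_b):
            # Bayes' theorem in ratio form: P(A|B) = P(A and B) / P(B)
            return p_joint / p_b

        # Hypothetical illustration (inputs not specified in the abstract):
        print(conditional(0.30, 0.95))           # -> about 0.316, i.e. ~31.6%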

  4. Amnestically Induced Persistence in Random Walks

    Science.gov (United States)

    Cressoni, J. C.; da Silva, Marco Antonio Alves; Viswanathan, G. M.

    2007-02-01

    We study how the Hurst exponent α depends on the fraction f of the total time t remembered by non-Markovian random walkers that recall only the distant past. We find that otherwise nonpersistent random walkers switch to persistent behavior when inflicted with significant memory loss. Such memory losses induce the probability density function of the walker’s position to undergo a transition from Gaussian to non-Gaussian. We interpret these findings of persistence in terms of a breakdown of self-regulation mechanisms and discuss their possible relevance to some of the burdensome behavioral and psychological symptoms of Alzheimer’s disease and other dementias.

  5. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving

  6. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  7. The Random Walk of Cars and Their Collision Probabilities with Planets

    Directory of Open Access Journals (Sweden)

    Hanno Rein

    2018-05-01

    Full Text Available On 6 February 2018, SpaceX launched a Tesla Roadster on a Mars-crossing orbit. We perform N-body simulations to determine the fate of the object over the next 15 Myr. The orbital evolution is initially dominated by close encounters with the Earth. While a precise orbit cannot be predicted beyond the next several centuries due to these repeated chaotic scatterings, one can reliably predict the long-term outcomes by statistically analyzing a large suite of possible trajectories with slightly perturbed initial conditions. Repeated gravitational scatterings with Earth lead to a random walk. Collisions with the Earth, Venus and the Sun represent primary sinks for the Roadster’s orbital evolution. Collisions with Mercury and Mars, or ejections from the Solar System by Jupiter, are highly unlikely. We calculate a dynamical half-life of the Tesla of approximately 15 Myr, with some 22%, 12% and 12% of Roadster orbit realizations impacting the Earth, Venus, and the Sun within one half-life, respectively. Because the eccentricities and inclinations in our ensemble increase over time due to mean-motion and secular resonances, the impact rates with the terrestrial planets decrease beyond a few million years, whereas the impact rate on the Sun remains roughly constant.
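    The half-life estimate rests on ensemble statistics of the kind sketched below: assuming, for illustration only, that removal of trajectory clones (by collision or ejection) is roughly exponential, the surviving fraction and the half-life can be read off simulated removal times. All numbers here are synthetic stand-ins, not the paper's N-body results.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_clones = 1000
    true_half_life = 15.0                              # Myr, illustrative
    removal_times = rng.exponential(true_half_life / np.log(2), size=n_clones)

    for t in (5.0, 15.0, 30.0):
        print(f"surviving fraction at {t:4.0f} Myr: {(removal_times > t).mean():.2f}")

    # For an exponential decay, half-life = ln(2) * mean removal time:
    print(f"estimated half-life ≈ {np.log(2) * removal_times.mean():.1f} Myr")
    ```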

  8. CPSC’s National Electronic Injury Surveillance System (NEISS)

    Data.gov (United States)

    US Consumer Product Safety Commission — CPSC’s National Electronic Injury Surveillance System (NEISS) is a national probability sample of hospitals in the U.S. and its territories. Patient information is...

  9. Foreword [International conference on algebra, analysis and quantum probability

    International Nuclear Information System (INIS)

    2016-01-01

    The present volume of the Journal of Physics: Conference Series represents contributions from participants of the International Conference “Algebra, Analysis and Quantum Probability” (Tashkent, 10-12 September 2015) organized by the Institute of Mathematics and the Faculty of Mechanics and Mathematics of the National University of Uzbekistan (NUUz) in collaboration with University Putra Malaysia (UPM) and International Islamic University Malaysia (IIUM). The Conference is dedicated to the 100th anniversary of one of the outstanding scientists of Uzbekistan, the founder of the Tashkent scientific school of functional analysis, who initiated the investigations on operator algebras and quantum probability theory in Uzbekistan - Professor Tashmukhamed Alievich Sarymsakov (10 Sept. 1915 - 19 Dec. 1995). Among the mathematical community Professor T. A. Sarymsakov is widely known for his research in the fields of probability theory, functional analysis, general topology and their applications. A gifted teacher and skilful organizer, he had a beneficial effect on the development of many new mathematicians in Uzbekistan. Professor T. A. Sarymsakov, an outstanding organizer of science in Uzbekistan, was one of the founders of the Uzbekistan Academy of Sciences, where from 1943 he was a member and Vice President, and from 1946 to 1952 President of the Academy of Sciences. Professor Sarymsakov successfully combined his fruitful scientific research with teaching and social work. During 1943-1944, 1952-1958 and 1971-1983 he was the rector of Tashkent State University (now the National University of Uzbekistan). He made a significant contribution to the development of higher education in Uzbekistan, serving from 1959 to 1960 as the Chairman of the State Committee, and from 1960 to 1971 as the Minister of Higher and Secondary Special Education of Uzbekistan. The main objective of the scientific conference was to facilitate communication and collaboration between

  10. A new reliability measure based on specified minimum distances before the locations of random variables in a finite interval

    International Nuclear Information System (INIS)

    Todinov, M.T.

    2004-01-01

    A new reliability measure is proposed and equations are derived which determine the probability of existence of a specified set of minimum gaps between random variables following a homogeneous Poisson process in a finite interval. Using the derived equations, a method is proposed for specifying the upper bound of the random variables' number density which guarantees that the probability of clustering of two or more random variables in a finite interval remains below a maximum acceptable level. It is demonstrated that even for moderate number densities the probability of clustering is substantial and should not be neglected in reliability calculations. In the important special case where the random variables are failure times, models have been proposed for determining the upper bound of the hazard rate which guarantees a set of minimum failure-free operating intervals before the random failures, with a specified probability. A model has also been proposed for determining the upper bound of the hazard rate which guarantees a minimum availability target. Using the models proposed, a new strategy, models and reliability tools have been developed for setting quantitative reliability requirements which consist of determining the intersection of the hazard rate envelopes (hazard rate upper bounds) which deliver a minimum failure-free operating period before random failures, a risk of premature failure below a maximum acceptable level and a minimum required availability. It is demonstrated that setting reliability requirements solely based on an availability target does not necessarily mean a low risk of premature failure. Even at a high availability level, the probability of premature failure can be substantial. For industries characterised by a high cost of failure, the reliability requirements should involve a hazard rate envelope limiting the risk of failure below a maximum acceptable level
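    A Monte Carlo sketch of the core quantity in this record — the probability that all neighboring points of a homogeneous Poisson process in a finite interval are separated by at least a specified minimum gap — is given below; the rate, interval length, and gap value are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def prob_min_gaps(rate, length, s, n_trials=20_000, seed=1):
        """Estimate P(all gaps >= s) for a homogeneous Poisson process on [0, length]."""
        rng = np.random.default_rng(seed)
        ok = 0
        for _ in range(n_trials):
            n = rng.poisson(rate * length)
            if n <= 1:
                ok += 1  # zero or one point: no gap can be violated
                continue
            # Conditional on n points, Poisson points are i.i.d. uniform:
            points = np.sort(rng.uniform(0.0, length, size=n))
            if np.all(np.diff(points) >= s):
                ok += 1
        return ok / n_trials

    # Even a moderate number density makes clustering (some gap < s) likely:
    print("clustering probability ≈", 1.0 - prob_min_gaps(rate=0.5, length=10.0, s=0.5))
    ```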

  11. Assessing wildfire occurrence probability in Pinus pinaster Ait. stands in Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Marques, S.; Garcia-Gonzalo, J.; Botequim, B.; Ricardo, A.; Borges, J. G.; Tome, M.; Oliveira, M. M.

    2012-11-01

    Maritime pine (Pinus pinaster Ait.) is an important conifer from the western Mediterranean Basin extending over 22% of the forest area in Portugal. In the last three decades nearly 4% of the Maritime pine area has been burned by wildfires. Yet no wildfire occurrence probability models are available, and forest and fire management planning activities are thus carried out largely independently of each other. This paper presents research to address this gap. Specifically, it presents a model to assess wildfire occurrence probability in regular and pure Maritime pine stands in Portugal. Emphasis was on developing a model based on easily available inventory data so that it might be useful to forest managers. For that purpose, data from the last two Portuguese National Forest Inventories (NFI) and data from wildfire perimeters in the years from 1998 to 2004 and from 2006 to 2007 were used. A binary logistic regression model was built using biometric data from the NFI. The biometric data included indicators that might be changed by operations prescribed in forest planning. Results showed that the probability of wildfire occurrence increases for stands located on steeper slopes and with a high shrub load, while it decreases with precipitation and with stand basal area. These results are instrumental for assessing the impact of forest management options on wildfire probability, thus helping forest managers to reduce the risk of wildfires. (Author) 57 refs.
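    Below is a sketch of the kind of binary logistic regression the record fits, using the reported predictors (slope, shrub load, precipitation, stand basal area); the synthetic data, units, and coefficients are placeholders chosen only so that the signs mimic the reported findings.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 2000
    slope = rng.uniform(0, 40, n)          # slope, %           (hypothetical units)
    shrubs = rng.uniform(0, 30, n)         # shrub load, t/ha   (hypothetical units)
    precip = rng.uniform(500, 1500, n)     # precipitation, mm/yr
    basal = rng.uniform(0, 40, n)          # stand basal area, m^2/ha

    # Synthetic "truth": fire probability rises with slope and shrub load,
    # falls with precipitation and basal area (signs as reported).
    logit = -1.0 + 0.06 * slope + 0.05 * shrubs - 0.002 * precip - 0.04 * basal
    burned = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([slope, shrubs, precip, basal])
    model = LogisticRegression(max_iter=1000).fit(X, burned)
    print(dict(zip(["slope", "shrubs", "precip", "basal"], model.coef_[0].round(4))))
    ```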

  12. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  13. On the regularity of the extinction probability of a branching process in varying and random environments

    International Nuclear Information System (INIS)

    Alili, Smail; Rugh, Hans Henrik

    2008-01-01

    We consider a supercritical branching process in a time-dependent environment ξ. We assume that the offspring distributions depend regularly (C^k or real-analytically) on real parameters λ. We show that the extinction probability q_λ(ξ), given the environment ξ, 'inherits' this regularity whenever the offspring distributions satisfy a condition of contraction type. Our proof makes use of the Poincaré metric on the complex unit disc and a real-analytic implicit function theorem

  14. The development and validation of the Male Genital Self-Image Scale: results from a nationally representative probability sample of men in the United States.

    Science.gov (United States)

    Herbenick, Debby; Schick, Vanessa; Reece, Michael; Sanders, Stephanie A; Fortenberry, J Dennis

    2013-06-01

    Numerous factors may affect men's sexual experiences, including their health status, past trauma or abuse, medication use, relationships, mood, anxiety, and body image. Little research has assessed the influence of men's genital self-image on their sexual function or behaviors, and none has done so in a nationally representative sample. The purpose of this study was to assess, in a nationally representative probability sample of men ages 18 to 60, the reliability and validity of the Male Genital Self-Image Scale (MGSIS), and to examine the relationship between scores on the MGSIS and men's scores on the International Index of Erectile Function (IIEF). The MGSIS was developed in two stages. Phase One involved a review of the literature and an analysis of cross-sectional survey data. Phase Two involved an administration of the scale items to a nationally representative sample of men in the United States ages 18 to 60. Measures include demographic items, the IIEF, and the MGSIS. Overall, most men felt positively about their genitals. However, 24.6% of men expressed some discomfort letting a healthcare provider examine their genitals and about 20% reported dissatisfaction with their genital size. The MGSIS was found to be reliable and valid, with the MGSIS-5 (consisting of five items) being the best fit to the data. In addition, men's scores on the MGSIS-5 were found to be positively related to men's scores on the IIEF. © 2013 International Society for Sexual Medicine.

  15. Uniqueness conditions for finitely dependent random fields

    International Nuclear Information System (INIS)

    Dobrushin, R.L.; Pecherski, E.A.

    1981-01-01

    The authors consider a random field for which uniqueness holds, together with additional conditions guaranteeing that the correlations between the variables of the field decrease rapidly enough with the distance between the values of the parameter. The main result of the paper states that in such a case uniqueness also holds for any other field with transition probabilities sufficiently close to those of the original field. They then apply this result to some ''degenerate'' classes of random fields for which this correlation-decay condition can be checked, and thus obtain some new conditions for uniqueness. (Auth.)

  16. Discrete random walk models for space-time fractional diffusion

    International Nuclear Information System (INIS)

    Gorenflo, Rudolf; Mainardi, Francesco; Moretti, Daniele; Pagnini, Gianni; Paradisi, Paolo

    2002-01-01

    A physical-mathematical approach to anomalous diffusion may be based on generalized diffusion equations (containing derivatives of fractional order in space or/and time) and related random walk models. By the space-time fractional diffusion equation we mean an evolution equation obtained from the standard linear diffusion equation by replacing the second-order space derivative with a Riesz-Feller derivative of order α ∈ (0, 2] and skewness θ (with |θ| ≤ min{α, 2−α}), and the first-order time derivative with a Caputo derivative of order β ∈ (0, 1]. Such an evolution equation implies for the flux a fractional Fick's law which accounts for spatial and temporal non-locality. The fundamental solution (for the Cauchy problem) of the fractional diffusion equation can be interpreted as a probability density evolving in time of a peculiar self-similar stochastic process that we view as a generalized diffusion process. By adopting appropriate finite-difference schemes of solution, we generate models of random walk discrete in space and time suitable for simulating random variables whose spatial probability density evolves in time according to this fractional diffusion equation
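    A generic way to realize such a process numerically is to draw walk steps from a Lévy-stable law of index α, as sketched below with scipy; this is a standard construction consistent with the record's description, not the authors' specific finite-difference scheme, and all parameter values are illustrative (the skewness θ enters through the stable law's asymmetry parameter).

    ```python
    import numpy as np
    from scipy.stats import levy_stable

    alpha, beta_skew = 1.5, 0.0     # stability index in (0, 2], symmetric case here
    n_walkers, n_steps, dt = 2000, 100, 1e-3

    # Step scale dt**(1/alpha) gives the self-similar scaling x ~ t**(1/alpha):
    steps = levy_stable.rvs(alpha, beta_skew, size=(n_walkers, n_steps),
                            scale=dt ** (1.0 / alpha), random_state=42)
    positions = steps.sum(axis=1)

    # The empirical density of `positions` approximates the fundamental solution
    # of the space-fractional diffusion equation at time t = n_steps * dt.
    print(np.percentile(positions, [5, 50, 95]))
    ```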

  17. Logical independence and quantum randomness

    International Nuclear Information System (INIS)

    Paterek, T; Kofler, J; Aspelmeyer, M; Zeilinger, A; Brukner, C; Prevedel, R; Klimek, P

    2010-01-01

    We propose a link between logical independence and quantum physics. We demonstrate that quantum systems in the eigenstates of Pauli group operators are capable of encoding mathematical axioms and show that Pauli group quantum measurements are capable of revealing whether or not a given proposition is logically dependent on the axiomatic system. Whenever a mathematical proposition is logically independent of the axioms encoded in the measured state, the measurement associated with the proposition gives random outcomes. This allows for an experimental test of logical independence. Conversely, it also allows for an explanation of the probabilities of random outcomes observed in Pauli group measurements from logical independence without invoking quantum theory. The axiomatic systems we study can be completed and are therefore not subject to Goedel's incompleteness theorem.

  18. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect

  19. Application of Probability Calculations to the Study of the Permissible Step and Touch Potentials to Ensure Personnel Safety

    International Nuclear Information System (INIS)

    Eisawy, E.A.

    2011-01-01

    The aim of this paper is to develop a practical method to evaluate the actual step and touch potential distributions in order to determine the risk of failure of the grounding system. The failure probability, indicating the safety level of the grounding system, is related to both the applied (stress) and withstand (strength) step or touch potentials. The probability distributions of the applied step and touch potentials, as well as the corresponding withstand step and touch potentials which represent the capability of the human body to resist stress potentials, are presented. These two distributions are used to evaluate the failure probability of the grounding system, which denotes the probability that the applied potential exceeds the withstand potential. The method treats the resistance of the human body, the foot contact resistance and the fault clearing time as independent random variables, rather than as fixed values as in previous analyses of the safety requirements for a given grounding system
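    The stress-strength formulation lends itself to a direct Monte Carlo estimate, sketched below: draw the random variables named in the record (body resistance, foot contact resistance, fault clearing time), form applied and withstand touch potentials, and count exceedances. The distributions, the Dalziel-type withstand formula, and all parameters are illustrative assumptions, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 200_000

    # Applied (stress) touch potential, V — placeholder lognormal:
    applied = rng.lognormal(np.log(400), 0.5, n)

    # Withstand (strength) potential built from a Dalziel-type body-current
    # limit, with body resistance, foot resistance and clearing time random:
    body = rng.normal(1000, 150, n).clip(min=500)    # ohm
    foot = rng.lognormal(np.log(3000), 0.4, n)       # ohm
    t = rng.uniform(0.1, 0.5, n)                     # s, fault clearing time
    withstand = (0.116 / np.sqrt(t)) * (body + 1.5 * foot)

    # Failure probability = P(applied > withstand):
    print("failure probability ≈", np.mean(applied > withstand))
    ```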

  20. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    Science.gov (United States)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate, for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. Because no analytical solution is available and the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such simulations, which are based on Monte Carlo methods, require several iterations of calculations for different realizations of the distribution of adjacent cells. For several one-dimensional systems, differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits are in excellent agreement with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations, as the thermal wave propagation rates can now be calculated based only on the macroscopic quantity of discrete probability.
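    The setting can be sketched as follows: cells on a chain with gamma-distributed spacing and an ignition delay that grows with the gap, so the ensemble front speed is total distance over total delay. The delay rule and all parameters below are invented for illustration; only the gamma-spacing structure comes from the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def front_speed(shape, mean_gap=1.0, n_cells=10_000, delay_coeff=0.5):
        scale = mean_gap / shape                  # gamma mean = shape * scale
        gaps = rng.gamma(shape, scale, size=n_cells)
        delays = delay_coeff * gaps**2            # toy rule: delay grows with gap
        return gaps.sum() / delays.sum()          # distance travelled / time taken

    # Smaller shape = more dispersed spacing = slower front, at fixed mean gap:
    for shape in (0.5, 1.0, 2.0, 5.0):
        print(f"shape {shape}: mean front speed ≈ {front_speed(shape):.3f}")
    ```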

  1. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
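    Below is a sketch of the model comparison the record reports: a single classification tree versus a random forest, compared on probability estimates with a threshold-independent measure (ROC AUC is used here as an example of such a measure). The synthetic water-table-depth and aridity predictors are stand-ins for the real data.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n = 3000
    water_table = rng.uniform(0, 50, n)      # m below surface (hypothetical)
    aridity = rng.uniform(0.05, 1.0, n)      # aridity index (hypothetical)

    # Synthetic "truth": shallower water tables and wetter climates favor GDEs.
    logit = 2.0 - 0.15 * water_table + 1.5 * aridity
    gde = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([water_table, aridity])
    Xtr, Xte, ytr, yte = train_test_split(X, gde, random_state=0)

    for name, model in [("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),
                        ("forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
        proba = model.fit(Xtr, ytr).predict_proba(Xte)[:, 1]
        print(name, "AUC:", round(roc_auc_score(yte, proba), 3))
    ```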

  2. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  3. Localization for random Schroedinger operators with correlated potentials

    Energy Technology Data Exchange (ETDEWEB)

    Von Dreifus, H [Princeton Univ., NJ (USA). Dept. of Physics; Klein, A [California Univ., Irvine (USA). Dept. of Mathematics

    1991-08-01

    We prove localization at high disorder or low energy for lattice Schroedinger operators with random potentials whose values at different lattice sites are correlated over large distances. The class of admissible random potentials for our multiscale analysis includes potentials with a stationary Gaussian distribution whose covariance function C(x, y) decays as |x−y|^(−θ), where θ > 0 can be arbitrarily small, and potentials whose probability distribution is a completely analytical Gibbs measure. The result for Gaussian potentials depends on a multivariable form of Nelson's best possible hypercontractive estimate. (orig.)

  4. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life

  5. Influence of random setup error on dose distribution

    International Nuclear Information System (INIS)

    Zhai Zhenyu

    2008-01-01

    Objective: To investigate the influence of random setup error on dose distribution in radiotherapy and to determine the margin from ITV to PTV. Methods: A random sampling approach was used to simulate field positions in the target coordinate system. The cumulative effect of random setup error was the sum of the dose distributions of all individual treatment fractions. Studying 100 cumulative effects gave the shift sizes of the 90% dose point position. Margins from ITV to PTV caused by random setup error were chosen at the 95% probability level. Spearman's correlation was used to analyze the influence of each factor. Results: The average shift size of the 90% dose point position was 0.62, 1.84, 3.13, 4.78, 6.34 and 8.03 mm for random setup errors of 1, 2, 3, 4, 5 and 6 mm, respectively. Univariate analysis showed that the size of the margin was associated only with the size of the random setup error. Conclusions: The margin from ITV to PTV is 1.2 times the random setup error for head-and-neck cancer and 1.5 times for thoracic and abdominal cancers. Field size, energy and target depth, unlike random setup error, have no relation with the size of the margin. (authors)
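    The margin logic can be sketched directly: sum randomly shifted copies of a 1D dose profile over the fractions of a course, record how far the 90% dose point moves, repeat for 100 courses, and take the 95th percentile. The sigmoid beam model and penumbra width below are illustrative assumptions, not the paper's treatment planning system.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def dose_point_shift(sigma_mm, n_fractions=30, n_courses=100):
        """95th-percentile shift of the 90% dose point of a toy 1D profile
        after averaging n_fractions randomly shifted copies (5 mm penumbra)."""
        x = np.linspace(-30, 30, 1201)                 # mm grid
        base = 1.0 / (1.0 + np.exp(x / 5.0))           # static profile, edge at 0
        x90_static = x[np.argmin(np.abs(base - 0.9))]
        shifts90 = []
        for _ in range(n_courses):
            errors = rng.normal(0.0, sigma_mm, n_fractions)
            dose = np.mean([1.0 / (1.0 + np.exp((x - e) / 5.0)) for e in errors], axis=0)
            x90 = x[np.argmin(np.abs(dose - 0.9))]
            shifts90.append(abs(x90 - x90_static))
        return np.percentile(shifts90, 95)

    for sigma in (1, 2, 3):
        print(f"sigma {sigma} mm -> 95% shift of 90% dose point ≈ "
              f"{dose_point_shift(sigma):.2f} mm")
    ```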

  6. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  7. Interpretations of Probability in Quantum Mechanics: A Case of ``Experimental Metaphysics''

    Science.gov (United States)

    Hellman, Geoffrey

    After reviewing paradigmatic cases of “experimental metaphysics” basing inferences against local realism and determinism on experimental tests of Bell's theorem (and its successors), we concentrate on clarifying the meaning and status of “objective probability” in quantum mechanics. The terms “objective” and “subjective” are found ambiguous and inadequate, masking crucial differences turning on the question of what the numerical values of probability functions measure vs. the question of the nature of the “events” on which such functions are defined. This leads naturally to a 2×2 matrix of types of interpretations, which are then illustrated with salient examples. (Of independent interest are the splitting of the “Copenhagen interpretation” into “objective” and “subjective” varieties in one of the dimensions and the splitting of Bohmian hidden variables from (other) modal interpretations along that same dimension.) It is then explained why Everett interpretations are difficult to categorize in these terms. Finally, we argue that Bohmian mechanics does not seriously threaten the experimental-metaphysical case for ultimate randomness and purely physical probabilities.

  8. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
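    A common reduced-form link behind such comparisons is the "credit triangle": under a flat hazard rate, a credit spread s with recovery R implies a risk-neutral default intensity of roughly s/(1−R). The sketch below applies it with illustrative numbers; the record itself may use a different representation of credit risk pricing.

    ```python
    import math

    def risk_neutral_pd(spread, recovery, horizon=1.0):
        """Default probability over `horizon` years implied by a flat spread."""
        lam = spread / (1.0 - recovery)          # risk-neutral default intensity
        return 1.0 - math.exp(-lam * horizon)

    spread = 0.02        # 200 bp yield spread (illustrative)
    recovery = 0.4       # assumed recovery rate
    q = risk_neutral_pd(spread, recovery)
    p_actual = 0.005     # e.g., a historical one-year rate for the rating class
    print(f"risk-neutral PD ≈ {q:.3%}, actual PD = {p_actual:.3%}, "
          f"ratio ≈ {q / p_actual:.1f}")
    ```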

  9. Tailored Random Graph Ensembles

    International Nuclear Information System (INIS)

    Roberts, E S; Annibale, A; Coolen, A C C

    2013-01-01

    Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
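    The Markov chain ingredient can be sketched as a degree-preserving edge swap on a directed graph: pick two edges (a, b) and (c, d) and rewire them to (a, d) and (c, b), which preserves every node's in- and out-degree. The naive always-accept version below samples the degree-constrained graph set only approximately; the record's point is precisely that non-trivial acceptance probabilities are needed for a strictly uniform measure. The graph and move count are illustrative.

    ```python
    import random

    def edge_swap_step(edges, edge_set, rng=random):
        """One swap attempt on a directed graph; rejects degenerate moves."""
        (a, b), (c, d) = rng.sample(edges, 2)
        if len({a, b, c, d}) < 4:
            return  # shared endpoint: would create a self-loop, reject
        if (a, d) in edge_set or (c, b) in edge_set:
            return  # would create a multi-edge, reject
        edge_set.difference_update([(a, b), (c, d)])
        edge_set.update([(a, d), (c, b)])
        edges.remove((a, b)); edges.remove((c, d))
        edges.extend([(a, d), (c, b)])

    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    edge_set = set(edges)
    for _ in range(1000):
        edge_swap_step(edges, edge_set)
    print(edges)  # same in/out degree sequence as the initial graph
    ```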

  10. XI Symposium on Probability and Stochastic Processes

    CERN Document Server

    Pardo, Juan; Rivero, Víctor; Bravo, Gerónimo

    2015-01-01

    This volume features lecture notes and a collection of contributed articles from the XI Symposium on Probability and Stochastic Processes, held at CIMAT Mexico in September 2013. Since the symposium was part of the activities organized in Mexico to celebrate the International Year of Statistics, the program included topics from the interface between statistics and stochastic processes. The book starts with notes from the mini-course given by Louigi Addario-Berry with an accessible description of some features of the multiplicative coalescent and its connection with random graphs and minimum spanning trees. It includes a number of exercises and a section on unanswered questions. Further contributions provide the reader with a broad perspective on the state-of-the art of active areas of research. Contributions by: Louigi Addario-Berry Octavio Arizmendi Fabrice Baudoin Jochen Blath Loïc Chaumont J. Armando Domínguez-Molina Bjarki Eldon Shui Feng Tulio Gaxiola Adrián González Casanova Evgueni Gordienko Daniel...

  11. Automated segmentation of dental CBCT image with prior-guided sequential random forests

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 (United States); Chen, Ken-Chung; Tang, Zhen [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Xia, James J., E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Surgical Planning Laboratory, Department of Oral and Maxillofacial Surgery, Houston Methodist Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery, Shanghai Jiao Tong University School of Medicine, Shanghai Ninth People’s Hospital, Shanghai 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu, E-mail: JXia@HoustonMethodist.org [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599-7513 and Department of Brain and Cognitive Engineering, Korea University, Seoul 02841 (Korea, Republic of)

    2016-01-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step in generating 3D models for the diagnosis and treatment planning of patients with CMF deformities. However, due to image artifacts caused by beam hardening, imaging noise, inhomogeneity, truncation, and maximal intercuspation, CBCT segmentation is difficult. Methods: In this paper, the authors present a new automatic segmentation method to address these problems. Specifically, the authors first employ a majority voting method to estimate the initial segmentation probability maps of both mandible and maxilla based on multiple aligned expert-segmented CBCT images. These probability maps provide an important prior guidance for CBCT segmentation. The authors then extract both appearance features from the CBCTs and context features from the initial probability maps to train the first layer of the random forest classifier, which can select discriminative features for segmentation. Based on the first layer of the trained classifier, the probability maps are updated and employed to train the next layer of the random forest classifier. By iteratively training subsequent random forest classifiers using both the original CBCT features and the updated segmentation probability maps, a sequence of classifiers can be derived for accurate segmentation of CBCT images. Results: Segmentation results on CBCTs of 30 subjects were both quantitatively and qualitatively validated against manually labeled ground truth. The average Dice ratios of mandible and maxilla by the authors’ method were 0.94 and 0.91, respectively, which are significantly better than the state-of-the-art method based on sparse representation (p-value < 0.001). Conclusions: The authors have developed and validated a novel fully automated method
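    The iterated scheme (often called auto-context) can be sketched in a few lines: each classifier layer is trained on appearance features concatenated with the probability map produced by the previous layer, starting from a prior map. The synthetic 1D "voxels" below stand in for CBCT volumes, and training and evaluating on the same data is a deliberate simplification.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(8)
    n_voxels = 5000
    appearance = rng.normal(size=(n_voxels, 3))            # intensity-like features
    labels = (appearance @ np.array([1.0, 0.5, -0.8])
              + rng.normal(0, 1, n_voxels)) > 0
    prior = np.full(n_voxels, labels.mean())               # majority-vote-style prior

    prob_map = prior.copy()
    for layer in range(3):
        X = np.column_stack([appearance, prob_map])        # appearance + context
        clf = RandomForestClassifier(n_estimators=100, random_state=layer)
        clf.fit(X, labels)
        prob_map = clf.predict_proba(X)[:, 1]              # updated probability map
        print(f"layer {layer}: mean |prob - label| = "
              f"{np.abs(prob_map - labels).mean():.3f}")
    ```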

  13. Random wave fields and scintillated beams

    CSIR Research Space (South Africa)

    Roux, FS

    2009-01-01

    Full Text Available Presentation slides by F. Stef Roux, CSIR National Laser Centre, PO Box 395, Pretoria 0001, South Africa. Outline: scintillated beams and adaptive optics; detecting a vortex with a Shack-Hartmann wavefront sensor; removing optical vortices; random vortex beams. The slides note that if the scintillation is weak, the resulting phase function of the optical beam is still continuous, and such a weakly scintillated beam can be corrected by an adaptive optical system.

  14. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  15. Adaptive random walks on the class of Web graphs

    Science.gov (United States)

    Tadić, B.

    2001-09-01

    We study random walks with adaptive move strategies on a class of directed graphs with variable wiring diagram. The graphs are grown from evolution rules compatible with the dynamics of the world-wide Web [B. Tadić, Physica A 293, 273 (2001)], and are characterized by a pair of power-law distributions of out- and in-degree for each value of the parameter β, which measures the degree of rewiring in the graph. The walker adapts its move strategy according to locally available information both on the out-degree of the visited node and the in-degree of the target node. A standard random walk, on the other hand, uses the out-degree only. We compute the distribution of connected subgraphs visited by an ensemble of walkers, the average access time and the survival probability of the walks. We discuss these properties of the walk dynamics relative to the changes in the global graph structure when the control parameter β is varied. For β ≥ 3, corresponding to the world-wide Web, the access time of the walk to a given level of hierarchy on the graph is much shorter compared to the standard random walk on the same graph. By reducing the amount of rewiring toward the rigidity limit β → β_c ≲ 0.1, corresponding to the range of naturally occurring biochemical networks, the survival probabilities of the adaptive and standard random walks become increasingly similar. The adaptive random walk can be used as an efficient message-passing algorithm on this class of graphs for a large degree of rewiring.
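    A toy version of the two move strategies on a small directed graph: the standard walker picks an out-neighbor uniformly, while the adaptive walker also weights candidate targets by their in-degree. The graph and the linear in-degree weighting below are illustrative assumptions, not the paper's growth model.

    ```python
    import random

    # Small directed graph: node -> list of out-neighbors (illustrative).
    graph = {0: [1, 2], 1: [2], 2: [0, 3, 4], 3: [4, 2], 4: [0, 2]}
    in_degree = {n: 0 for n in graph}
    for targets in graph.values():
        for t in targets:
            in_degree[t] += 1

    def standard_step(node, rng=random):
        """Standard walk: uniform over out-neighbors (out-degree info only)."""
        return rng.choice(graph[node])

    def adaptive_step(node, rng=random):
        """Adaptive walk: also weight target nodes by their in-degree."""
        targets = graph[node]
        weights = [in_degree[t] for t in targets]
        return rng.choices(targets, weights=weights, k=1)[0]

    node_std = node_ada = 0
    visited_std, visited_ada = {0}, {0}
    for _ in range(20):
        node_std = standard_step(node_std); visited_std.add(node_std)
        node_ada = adaptive_step(node_ada); visited_ada.add(node_ada)
    print("standard visited:", visited_std, "adaptive visited:", visited_ada)
    ```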

  16. National Coral Reef Monitoring Program: Benthic Images Collected from Stratified Random Sites (StRS) across Wake Island from 2014-03-16 to 2014-03-20 (NCEI Accession 0159157)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The data described here are benthic habitat imagery that result from benthic photo-quadrat surveys conducted along transects at stratified random sites across Wake...

  17. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  18. Contextuality is about identity of random variables

    International Nuclear Information System (INIS)

    Dzhafarov, Ehtibar N; Kujala, Janne V

    2014-01-01

    Contextual situations are those in which seemingly ‘the same’ random variable changes its identity depending on the conditions under which it is recorded. Such a change of identity is observed whenever the assumption that the variable is one and the same under different conditions leads to contradictions when one considers its joint distribution with other random variables (this is the essence of all Bell-type theorems). In our Contextuality-by-Default approach, instead of asking why or how the conditions force ‘one and the same’ random variable to change ‘its’ identity, any two random variables recorded under different conditions are considered different ‘automatically.’ They are never the same, nor are they jointly distributed, but one can always impose on them a joint distribution (probabilistic coupling). The special situations when there is a coupling in which these random variables are equal with probability 1 are considered noncontextual. Contextuality means that such couplings do not exist. We argue that the determination of the identity of random variables by conditions under which they are recorded is not a causal relationship and cannot violate laws of physics. (paper)

  19. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  20. Noisy probability judgment, the conjunction fallacy, and rationality: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Crupi, Vincenzo; Tentori, Katya

    2016-01-01

    According to Costello and Watts (2014), probability theory can account for key findings in human judgment research provided that random noise is embedded in the model. We concur with a number of Costello and Watts's remarks, but challenge the empirical adequacy of their model in one of their key illustrations (the conjunction fallacy) on the basis of recent experimental findings. We also discuss how our argument bears on heuristic and rational thinking. (c) 2015 APA, all rights reserved.