WorldWideScience

Sample records for random probability measures

  1. On Randomness and Probability

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 2. On Randomness and Probability: How to Mathematically Model Uncertain Events ... Author Affiliations: Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.

  2. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  3. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  4. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  5. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; The Working Group of 1980; From classical repetition to practica...

  6. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  7. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  8. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid

  9. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  10. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  11. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac, 'Probability theory is measure theory with a soul'. This book, with its choice of proofs, remarks, examples and exercises, has been prepared taking both these aesthetic and practical aspects into account.

  12. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  13. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  14. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  15. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  16. Nonequilibrium random matrix theory: Transition probabilities

    Science.gov (United States)

    Pedro, Francisco Gil; Westphal, Alexander

    2017-03-01

    In this paper we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that, in the Coulomb gas language, in the large-N limit memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  17. Negative probability of random multiplier in turbulence

    Science.gov (United States)

    Bai, Xuan; Su, Weidong

    2017-11-01

    The random multiplicative process (RMP), which has been proposed for over 50 years, is a convenient phenomenological ansatz for the turbulence cascade. In the RMP, the fluctuation at a large scale is statistically mapped to the one at a small scale by the linear action of an independent random multiplier (RM). Simple as it is, the RMP is powerful enough that all of the known scaling laws can be accommodated in this model. As far as we know, however, a direct extraction of the probability density function (PDF) of the RM has so far been absent, because the deconvolution involved is ill-posed. Nevertheless, with progress in the study of inverse problems, the situation can be changed. By using some new regularization techniques, we recover for the first time the PDFs of the RMs in some turbulent flows. All the consistent results from various methods point to an amazing observation: the PDFs can attain negative values in some intervals; this can also be justified by some properties of infinitely divisible distributions. Despite the conceptual unconventionality, the present study illustrates the implications of negative probability in turbulence in several aspects, with emphasis on its role in describing the interaction between fluctuations at different scales. This work is supported by the NSFC (No. 11221062 and No. 11521091).

  18. Probability Distributions for Random Quantum Operations

    Science.gov (United States)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.

  19. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  20. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  1. Hybrid computer technique yields random signal probability distributions

    Science.gov (United States)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  2. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  3. Probability of stress-corrosion fracture under random loading

    Science.gov (United States)

    Yang, J. N.

    1974-01-01

    The mathematical formulation is based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under both stationary and nonstationary random loadings, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.

  4. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that, in the Coulomb gas language, in the large-N limit memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  5. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I_0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
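
    The routines themselves are Fortran and are not reproduced here; the sketch below is a hypothetical Python analogue of the same idea, combining uniform and normal generators with library distribution objects to draw from other distributions (all parameter values are illustrative only).

    ```python
    # Hypothetical sketch (not the USGS Fortran routines): drawing samples from
    # several of the distributions named in the abstract by transforming uniform
    # and normal random numbers.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    u = rng.uniform(size=10_000)          # uniform generator
    z = rng.standard_normal(size=10_000)  # normal generator

    # Inverse-CDF (quantile) transform: uniform -> Pearson Type III (gamma family)
    gamma_samples = stats.gamma(a=2.5, scale=1.3).ppf(u)

    # Chi-square as a function of normals: sum of squared standard normals
    chi2_samples = z**2 + rng.standard_normal(10_000)**2   # 2 degrees of freedom

    # Goodness-of-fit checks against the target CDFs
    print(stats.kstest(gamma_samples, stats.gamma(a=2.5, scale=1.3).cdf))
    print(stats.kstest(chi2_samples, stats.chi2(df=2).cdf))
    ```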

  6. Application of random match probability calculations to mixed STR profiles.

    Science.gov (United States)

    Bille, Todd; Bright, Jo-Anne; Buckleton, John

    2013-03-01

    Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data such as peak height, assumed number of contributors, and known contributors, whereas the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.
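
    As a rough, hedged illustration of one of the statistics discussed (not the authors' software): the combined probability of inclusion for a mixture is commonly computed as the product over loci of the squared sum of the frequencies of all alleles observed at that locus. The allele frequencies below are invented for the example.

    ```python
    # Hedged illustration of the CPI calculation; locus names and frequencies
    # are made-up numbers used only to show the arithmetic.
    observed_allele_freqs = {
        "D8S1179": [0.15, 0.20, 0.10],   # hypothetical alleles seen in the mixture
        "D21S11":  [0.25, 0.05],
        "TH01":    [0.30, 0.10, 0.05, 0.20],
    }

    cpi = 1.0
    for locus, freqs in observed_allele_freqs.items():
        cpi *= sum(freqs) ** 2            # probability a random person is "included"

    print(f"CPI = {cpi:.3e}  (combined probability of exclusion = {1 - cpi:.3f})")
    ```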

  7. Weak measurements measure probability amplitudes (and very little else)

    Energy Technology Data Exchange (ETDEWEB)

    Sokolovski, D., E-mail: dgsokol15@gmail.com [Departmento de Química-Física, Universidad del País Vasco, UPV/EHU, Leioa (Spain); IKERBASQUE, Basque Foundation for Science, Maria Diaz de Haro 3, 48013, Bilbao (Spain)

    2016-04-22

    Conventional quantum mechanics describes a pre- and post-selected system in terms of virtual (Feynman) paths via which the final state can be reached. In the absence of probabilities, a weak measurement (WM) determines the probability amplitudes for the paths involved. The weak values (WV) can be identified with these amplitudes, or their linear combinations. This allows us to explain the “unusual” properties of the WV, and avoid the “paradoxes” often associated with the WM. - Highlights: • Weak measurement on a pre- and post-selected system is a particular perturbative scheme. • A conventional average for the additional degree of freedom measured. • The result is proportional to the amplitudes on the virtual paths connecting two system's states. • Over-interpretation of the weak values (WV) is unwise. • “Unusual” WVs are not unusual after all.

  8. An introduction to measure-theoretic probability

    CERN Document Server

    Roussas, George G

    2004-01-01

    This book provides in a concise, yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability and other related areas, should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail.* Excellent exposition marked by a clear, coherent and logical development of the subject* Easy to understand, detailed discussion of material* Complete proofs

  9. Random measures, theory and applications

    CERN Document Server

    Kallenberg, Olav

    2017-01-01

    Offering the first comprehensive treatment of the theory of random measures, this book has a very broad scope, ranging from basic properties of Poisson and related processes to the modern theories of convergence, stationarity, Palm measures, conditioning, and compensation. The three large final chapters focus on applications within the areas of stochastic geometry, excursion theory, and branching processes. Although this theory plays a fundamental role in most areas of modern probability, much of it, including the most basic material, has previously been available only in scores of journal articles. The book is primarily directed towards researchers and advanced graduate students in stochastic processes and related areas.

  10. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of Probabilities of Collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
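
    A minimal sketch of the first-order idea, with made-up numbers: fit a downward-opening parabola to a short series of P_c values and read off the predicted peak location. The Bayesian prior from past conjunctions used in the paper is omitted here.

    ```python
    # Sketch only: quadratic fit to illustrative log10(Pc) values versus time.
    import numpy as np

    t = np.array([-5.0, -4.0, -3.0, -2.0, -1.5, -1.0])        # days to closest approach
    log_pc = np.array([-6.2, -5.4, -4.9, -4.6, -4.5, -4.55])  # illustrative values

    a, b, c = np.polyfit(t, log_pc, deg=2)   # downward-opening parabola (a < 0)
    t_peak = -b / (2 * a)                    # vertex gives the predicted peak time
    print(f"predicted peak near t = {t_peak:.2f} days, "
          f"log10 Pc ≈ {np.polyval([a, b, c], t_peak):.2f}")
    ```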

  11. Limit law for transition probabilities and moderate deviations for Sinai's random walk in random environment

    CERN Document Server

    Comets, F

    2003-01-01

    We consider a one-dimensional random walk in random environment in the Sinai regime. Our main result is that logarithms of the transition probabilities, after a suitable rescaling, converge in distribution as time tends to infinity, to some functional of the Brownian motion. We compute the law of this functional when the initial and final points agree. Also, among other things, we estimate the probability of being at time t at distance at least z from the initial position, when z is larger than ln² t, but still of logarithmic order in time.

  12. Fuzzy Prokhorov metric on the set of probability measures

    OpenAIRE

    Repovš, D.; Savchenko, A.; Zarichnyi, M.

    2011-01-01

    We introduce a fuzzy metric on the set of probability measures on a fuzzy metric space. The construction is an analogue, in the realm of fuzzy metric spaces, of the Prokhorov metric on the set of probability measures on compact metric spaces.

  13. THE SEMIGROUP OF METRIC MEASURE SPACES AND ITS INFINITELY DIVISIBLE PROBABILITY MEASURES.

    Science.gov (United States)

    Evans, Steven N; Molchanov, Ilya

    2017-01-01

    A metric measure space is a complete, separable metric space equipped with a probability measure that has full support. Two such spaces are equivalent if they are isometric as metric spaces via an isometry that maps the probability measure on the first space to the probability measure on the second. The resulting set of equivalence classes can be metrized with the Gromov-Prohorov metric of Greven, Pfaffelhuber and Winter. We consider the natural binary operation ⊞ on this space that takes two metric measure spaces and forms their Cartesian product equipped with the sum of the two metrics and the product of the two probability measures. We show that the metric measure spaces equipped with this operation form a cancellative, commutative, Polish semigroup with a translation invariant metric. There is an explicit family of continuous semicharacters that is extremely useful for, inter alia, establishing that there are no infinitely divisible elements and that each element has a unique factorization into prime elements. We investigate the interaction between the semigroup structure and the natural action of the positive real numbers on this space that arises from scaling the metric. For example, we show that for any given positive real numbers a, b, c the trivial space is the only space X that satisfies aX ⊞ bX = cX. We establish that there is no analogue of the law of large numbers: if X_1, X_2, … is an identically distributed independent sequence of random spaces, then no subsequence of [Formula: see text] converges in distribution unless each X_k is almost surely equal to the trivial space. We characterize the infinitely divisible probability measures and the Lévy processes on this semigroup, characterize the stable probability measures and establish a counterpart of the LePage representation for the latter class.

  14. What Randomized Benchmarking Actually Measures

    Science.gov (United States)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-09-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. These theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
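
    For orientation, the conventional RB analysis that the abstract refers to fits the survival probabilities to an exponential decay A·p^m + B and converts p to an error rate. The sketch below uses synthetic single-qubit data and shows that standard fit, not the new theories proposed in the paper.

    ```python
    # Standard RB decay fit on synthetic data (illustrative numbers only).
    import numpy as np
    from scipy.optimize import curve_fit

    def rb_decay(m, A, B, p):
        return A * p**m + B

    m = np.array([1, 2, 4, 8, 16, 32, 64, 128])
    survival = 0.5 + 0.48 * 0.995**m + 0.003 * np.random.default_rng(0).normal(size=m.size)

    (A, B, p), _ = curve_fit(rb_decay, m, survival, p0=[0.5, 0.5, 0.99])
    d = 2                                   # single-qubit example
    r = (1 - p) * (d - 1) / d               # conventional conversion to an error rate
    print(f"decay parameter p = {p:.4f}, RB error rate r = {r:.2e}")
    ```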

  15. Probability measures, Lévy measures and analyticity in time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    2008-01-01

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators, we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  16. Probability Measures, Lévy Measures, and Analyticity in Time

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hubalek, Friedrich

    We investigate the relation of the semigroup probability density of an infinite activity Lévy process to the corresponding Lévy density. For subordinators we provide three methods to compute the former from the latter. The first method is based on approximating compound Poisson distributions...

  17. Some Bounds on the Deviation Probability for Sums of Nonnegative Random Variables Using Upper Polynomials, Moment and Probability Generating Functions

    OpenAIRE

    From, Steven G.

    2010-01-01

    We present several new bounds for certain sums of deviation probabilities involving sums of nonnegative random variables. These are based upon upper bounds for the moment generating functions of the sums. We compare these new bounds to those of Maurer [2], Bernstein [4], Pinelis [16], and Bentkus [3]. We also briefly discuss the infinitely divisible distributions case.

  18. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  19. A lower bound on the probability that a binomial random variable is exceeding its mean

    OpenAIRE

    Pelekis, Christos; Ramon, Jan

    2016-01-01

    We provide a lower bound on the probability that a binomial random variable is exceeding its mean. Our proof employs estimates on the mean absolute deviation and the tail conditional expectation of binomial random variables.

  20. Measurement of zero-count probability in photoelectron statistics.

    Science.gov (United States)

    Basano, L; Ottonello, P

    1982-10-15

    The zero-count probability P_0(T) (as a function of the counting interval T) is one of the most interesting functions characterizing a light field. Experimentally, P_0(T) is usually obtained by measuring successively the zero-count probability for a set of different intervals. This procedure exposes the measurement of P_0(T) to errors imputable to drift. We present a simple zero-counter which is essentially free from drift effects and displays P_0(T) directly on the CRT of an oscilloscope for sixteen values of T. Another advantage of the instrument is a conspicuous reduction of the overall measuring time.
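
    A software-only illustration of the quantity being measured (no relation to the instrument itself): for Poissonian light with mean count rate R, the zero-count probability is P_0(T) = exp(-RT). The simulation below estimates it for sixteen counting intervals, echoing the sixteen-point display; the rate and durations are illustrative.

    ```python
    # Simulated photodetection record: estimate P0(T) for 16 values of T and
    # compare with the Poisson prediction exp(-R*T).
    import numpy as np

    rng = np.random.default_rng(1)
    R = 2.0e4                      # counts per second (illustrative)
    duration = 10.0                # seconds of simulated record
    arrivals = np.sort(rng.uniform(0.0, duration, size=rng.poisson(R * duration)))

    for T in np.linspace(5e-6, 80e-6, 16):
        counts, _ = np.histogram(arrivals, bins=np.arange(0.0, duration, T))
        p0_est = np.mean(counts == 0)          # fraction of empty counting windows
        print(f"T = {T*1e6:5.1f} us   estimated {p0_est:.3f}   theory {np.exp(-R*T):.3f}")
    ```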

  1. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  2. Comparing heteroscedastic measurement systems with the probability of agreement.

    Science.gov (United States)

    Stevens, Nathaniel T; Steiner, Stefan H; MacKay, R Jock

    2017-01-01

    Deciding whether two measurement systems agree well enough to be used interchangeably is important in medical and clinical contexts. Recently, the probability of agreement was proposed as an alternative to comparison techniques such as correlation, regression, and the limits of agreement approach, when the systems' measurement errors are homoscedastic. However, in medical and clinical contexts, it is common for measurement variability to increase proportionally with the magnitude of measurement. In this article, we extend the probability of agreement analysis to accommodate heteroscedastic measurement errors, demonstrating the versatility of this simple metric. We illustrate its use with two examples: one involving the comparison of blood pressure measurement devices, and the other involving the comparison of serum cholesterol assays.

  3. Concept Acquisition and Confidence Using a Spatial Probability Measure Instrument

    Science.gov (United States)

    Moore, David Richard

    2007-01-01

    Instructional strategies for teaching concepts have long been identified. Less commonly studied is a learner's level of confidence and certitude in their knowledge based upon exposure to these instructional treatments. This experimental research study used an instrument referred to as the Spatial Probability Measure (SPM) to solicit levels of…

  4. Three Experiments Involving Probability Measurement Procedures with Mathematics Test Items.

    Science.gov (United States)

    Romberg, Thomas A.; And Others

    This is a report from the Project on Individually Guided Mathematics, Phase 2 Analysis of Mathematics Instruction. The report outlines some of the characteristics of probability measurement procedures for scoring objective tests, discusses hypothesized advantages and disadvantages of the methods, and reports the results of three experiments…

  5. Measurement of "optical" transition probabilities in the silver atom

    NARCIS (Netherlands)

    Terpstra, J.; Smit, J.A.

    1958-01-01

    For 22 spectral lines of the silver atom the probability of spontaneous transition has been derived from measurements of the emission intensity of the line and the population of the corresponding upper level. The medium of excitation was the column of a vertical arc discharge in air of atmospheric

  6. From gap probabilities in random matrix theory to eigenvalue expansions

    Science.gov (United States)

    Bothner, Thomas

    2016-02-01

    We present a method to derive asymptotics of eigenvalues for trace-class integral operators K: L²(J; dλ) → L²(J; dλ), acting on a single interval J ⊂ ℝ, which belong to the ring of integrable operators (Its et al 1990 Int. J. Mod. Phys. B 4 1003-37). Our emphasis lies on the behavior of the spectrum {λ_i(J)}_{i=0}^∞ of K as |J| → ∞ with i fixed. We show that this behavior is intimately linked to the analysis of the Fredholm determinant det(I − γK)|_{L²(J)} as |J| → ∞ and γ ↑ 1 in a Stokes-type scaling regime. Concrete asymptotic formulae are obtained for the eigenvalues of the Airy and Bessel kernels in random matrix theory. Dedicated to Percy Deift and Craig Tracy on the occasion of their 70th birthdays.

  7. Statistical hydrodynamics and related problems in spaces of probability measures

    Science.gov (United States)

    Dostoglou, Stamatios

    2017-11-01

    A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.

  8. Discrete probability models and methods probability on graphs and trees, Markov chains and random fields, entropy and coding

    CERN Document Server

    Brémaud, Pierre

    2017-01-01

    The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to the possible applications, in the communication and computing sciences, in operations research and in physics, this book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability since the first chapters give in considerable detail the background necessary to understand the rest of the book.

  9. University Students’ Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    Science.gov (United States)

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution—the central, unifying, and overarching theme in biology. Aspects strongly related to abstract “threshold” concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students’ conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, Randomness and Probability Test in the Context of Evolution (RaProEvo) and Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, then the Rasch partial-credit model was applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students’ conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. PMID:28572180

  10. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    Science.gov (United States)

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, not substantively meaningful, to the conditional effects on the predicted probabilities. The empirical illustration uses the longitudinal data from the Asset and Health Dynamics among the Oldest Old. Our analysis compared three sets of the predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglect of retransforming random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
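
    The following is a generic sketch of the retransformation issue, not the authors' estimator: with a random intercept on the multinomial-logit scale, population-averaged category probabilities are obtained by averaging the softmax over the random-effect distribution rather than by plugging in a zero random effect. All numbers are illustrative.

    ```python
    # With a random intercept u ~ N(0, sigma^2) on one category's logit, the
    # population-averaged probabilities E_u[softmax(eta + u)] differ from the
    # plug-in value softmax(eta) evaluated at u = 0.
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    rng = np.random.default_rng(0)
    eta = np.array([0.0, 0.8, -0.5])        # linear predictors for 3 health states
    sigma = 1.2                             # SD of the random intercept on state 2

    naive = softmax(eta)                    # random effect ignored (set to zero)

    draws = rng.normal(0.0, sigma, size=20_000)
    averaged = np.mean([softmax(eta + np.array([0.0, u, 0.0])) for u in draws], axis=0)

    print("plug-in (biased):   ", np.round(naive, 3))
    print("retransformed (MC): ", np.round(averaged, 3))
    ```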

  11. Crossing probability for directed polymers in random media. II. Exact tail of the distribution.

    Science.gov (United States)

    De Luca, Andrea; Le Doussal, Pierre

    2016-03-01

    We study the probability p ≡ p_η(t) that two directed polymers in a given random potential η and with fixed and nearby endpoints do not cross until time t. This probability is itself a random variable (over samples η) which, as we show, acquires a very broad probability distribution at large time. In particular, the moments of p are found to be dominated by atypical samples where p is of order unity. Building on a formula established by us in a previous work using nested Bethe ansatz and Macdonald process methods, we obtain analytically the leading large-time behavior of all moments, p^m ≃ γ_m/t. From this, we extract the exact tail ∼ ρ(p)/t of the probability distribution of the noncrossing probability at large time. The exact formula is compared to numerical simulations, with excellent agreement.

  12. Consensus in the Wasserstein Metric Space of Probability Measures

    Science.gov (United States)

    2015-07-01

    ...or unreliable, or communication constraints are also present. All such variations in topology can happen randomly, and often the network is... measures. ...a time-invariant network topology and a doubly stochastic weighting matrix; the time-invariance constraint can be relaxed (it is just... the delta-Dirac measure located at y, with α_ij ≥ 0 and ∑_{j=1}^N α_ij = 1. In this case, the minimization in (1) can be solved exactly via a finite

  13. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief introduction to measure theory, and its applications to probability, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  14. Use of ELVIS II platform for random process modelling and analysis of its probability density function

    Science.gov (United States)

    Maslennikova, Yu. S.; Nugmanov, I. S.

    2016-08-01

    The problem of probability density function estimation for a random process is one of the most common in practice. There are several methods to solve this problem. The laboratory work presented here uses methods of mathematical statistics to detect patterns in realizations of a random process. On the basis of ergodic theory, we construct an algorithm for estimating the univariate probability density function of a random process. Correlational analysis of realizations is applied to estimate the necessary sample size and observation time. Hypothesis testing for two probability distributions (normal and Cauchy) is applied to the experimental data, using the χ² criterion. To facilitate understanding and clarity of the problem solved, we use the ELVIS II platform and the LabVIEW software package, which allow us to make the necessary calculations, display the results of the experiment and, most importantly, control the experiment. At the same time students are introduced to the LabVIEW software package and its capabilities.
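
    A hardware-free sketch of the same exercise (no ELVIS II or LabVIEW; synthetic data stand in for a recorded realization): estimate the density from a histogram and apply the χ² goodness-of-fit criterion to a fitted normal model.

    ```python
    # Histogram-based density estimate plus chi-square goodness-of-fit test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    x = rng.normal(loc=0.0, scale=2.0, size=5_000)    # stand-in for the realization

    counts, edges = np.histogram(x, bins=20)
    mu, sigma = x.mean(), x.std(ddof=1)

    # expected counts under the fitted normal model in each histogram bin
    cdf = stats.norm(mu, sigma).cdf(edges)
    expected = len(x) * np.diff(cdf)

    chi2 = np.sum((counts - expected)**2 / expected)
    dof = len(counts) - 1 - 2                          # 2 parameters estimated
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p-value = {stats.chi2.sf(chi2, dof):.3f}")
    ```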

  15. Escape probability and mean residence time in random flows with unsteady drift

    Directory of Open Access Journals (Sweden)

    Brannan James R.

    2001-01-01

    We investigate fluid transport in random velocity fields with unsteady drift. First, we propose to quantify fluid transport between flow regimes of different characteristic motion, by escape probability and mean residence time. We then develop numerical algorithms to solve for escape probability and mean residence time, which are described by backward Fokker-Planck type partial differential equations. A few computational issues are also discussed. Finally, we apply these ideas and numerical algorithms to a tidal flow model.

  16. Grounding the randomness of quantum measurement.

    Science.gov (United States)

    Jaeger, Gregg

    2016-05-28

    Julian Schwinger provided to physics a mathematical reconstruction of quantum mechanics on the basis of the characteristics of sequences of measurements occurring at the atomic level of physical structure. The central component of this reconstruction is an algebra of symbols corresponding to quantum measurements, conceived of as discrete processes, which serve to relate experience to theory; collections of outcomes of identically circumscribed such measurements are attributed expectation values, which constitute the predictive content of the theory. The outcomes correspond to certain phase parameters appearing in the corresponding symbols, which are complex numbers, the algebra of which he finds by a process he refers to as 'induction'. Schwinger assumed these (individually unpredictable) phase parameters to take random, uniformly distributed definite values within a natural range. I have previously suggested that the 'principle of plenitude' may serve as a basis in principle for the occurrence of the definite measured values that are those members of the collections of measurement outcomes from which the corresponding observed statistics derive (Jaeger 2015 Found. Phys. 45, 806-819; doi:10.1007/s10701-015-9893-6). Here, I evaluate Schwinger's assumption in the context of recent critiques of the notion of randomness and explicitly relate the randomness of these phases with the principle of plenitude and, in this way, provide a fundamental grounding for the objective, physically irreducible probabilities, conceived of as graded possibilities, that are attributed to measurement outcomes by quantum mechanics. © 2016 The Author(s).

  17. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow-banded processes dominated by a single frequency, as well as for bimodal processes with two dominating frequencies in the structural response.

  18. PREDICTING LONGITUDINAL TRAJECTORIES OF HEALTH PROBABILITIES WITH RANDOM-EFFECTS MULTINOMIAL LOGIT REGRESSION

    OpenAIRE

    Liu, Xian; Engel, Charles C.

    2012-01-01

    Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond e...

  19. The probability of a random straight line in two and three dimensions

    NARCIS (Netherlands)

    Beckers, A.L.D.; Smeulders, A.W.M.

    1990-01-01

    Using properties of shift- and rotation-invariance, probability density distributions are derived for random straight lines in normal representation. It is found that in two-dimensional space the distribution of normal coordinates (r, φ) is uniform: p(r, φ) = c, where c is a normalisation

  20. University Students' Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    Science.gov (United States)

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution-- the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…

  1. The random effects prep continues to mispredict the probability of replication

    NARCIS (Netherlands)

    Iverson, G.J.; Lee, M.D.; Wagenmakers, E.-J.

    2010-01-01

    In their reply, Lecoutre and Killeen (2010) argue for a random effects version of prep, in which the observed effect from one experiment is used to predict the probability that an effect from a different but related experiment will have the same sign. They present a figure giving the impression that

  2. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    Science.gov (United States)

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  3. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  4. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  5. Probability and Random Processes With Applications to Signal Processing and Communications

    CERN Document Server

    Miller, Scott

    2012-01-01

    Miller and Childers have focused on creating a clear presentation of foundational concepts with specific applications to signal processing and communications, clearly the two areas of most interest to students and instructors in this course. It is aimed at graduate students as well as practicing engineers, and includes unique chapters on narrowband random processes and simulation techniques. The appendices provide a refresher in such areas as linear algebra, set theory, random variables, and more. Probability and Random Processes also includes applications in digital communications, informati

  6. Measuring politically sensitive behavior. Using probability theory in the form of randomized response to estimate prevalence and incidence of misbehavior in the public sphere: a test on integrity violations.

    NARCIS (Netherlands)

    Peeters, C.F.W.

    Representing the latent with the manifest is posed to be one of the greatest problems in social science. The reliance on constructed observables to indirectly represent the construct in which one takes an interest, makes it a crucial aspect of any endeavor involving measurement to question the

  7. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    Science.gov (United States)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
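
    A two-dimensional instance of this type of construction, as a hedged sketch (the target F here is a bivariate normal chosen purely for illustration): X_1 is built from U_1 through the inverse CDF of the marginal, and X_2 from U_2 through the inverse CDF of the conditional distribution given X_1.

    ```python
    # Conditional inverse-transform construction of a correlated pair from
    # independent uniforms; the bivariate normal target is an illustrative choice.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    rho = 0.6                               # correlation of the target distribution
    u1, u2 = rng.uniform(size=(2, 50_000))

    x1 = stats.norm.ppf(u1)                                  # X1 ~ N(0, 1)
    x2 = stats.norm.ppf(u2, loc=rho * x1,                    # X2 | X1 ~ N(rho*X1, 1-rho^2)
                        scale=np.sqrt(1 - rho**2))

    print("sample correlation ≈", np.corrcoef(x1, x2)[0, 1])
    ```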

  8. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then. And several ongoing projects aimed at its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
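
    The article's code is Fortran; the following NumPy sketch shows analogues of two of the generators described, namely a probability vector uniform on the simplex and a Haar-random unitary obtained from the QR decomposition of a complex Ginibre matrix with the usual phase correction.

    ```python
    # NumPy analogues (not the article's Fortran routines) of two generators.
    import numpy as np

    rng = np.random.default_rng(11)

    def random_probability_vector(d):
        # normalized exponential variates are uniform on the probability simplex
        e = rng.exponential(size=d)
        return e / e.sum()

    def random_unitary(d):
        g = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
        q, r = np.linalg.qr(g)
        return q * (np.diag(r) / np.abs(np.diag(r)))   # fix column phases -> Haar measure

    p = random_probability_vector(4)
    u = random_unitary(4)
    print(p.sum(), np.allclose(u.conj().T @ u, np.eye(4)))
    ```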

  9. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time....... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay...

  10. Context effects in the measurement of optimism in probability judgment

    NARCIS (Netherlands)

    Otten, W.; van der Pligt, J.

    1996-01-01

    Examined the role of contextual information such as comparison standard on self-other probability judgments regarding the occurrence of negative life events, which tend to be characterized by optimism. In Study 1, 80 undergraduates (mean age 19.5 yrs) completed a questionnaire on preventive

  11. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    delays caused by interdependencies, and result in a more robust operation. Currently, three methods to calculate the complexity of a station exist: 1. Complexity of a station based on the track layout 2. Complexity of a station based on the probability of a conflict using a plan of operation 3. Complexity...

  12. PItcHPERFeCT: Primary Intracranial Hemorrhage Probability Estimation using Random Forests on CT.

    Science.gov (United States)

    Muschelli, John; Sweeney, Elizabeth M; Ullman, Natalie L; Vespa, Paul; Hanley, Daniel F; Crainiceanu, Ciprian M

    2017-01-01

    Intracerebral hemorrhage (ICH), where a blood vessel ruptures into areas of the brain, accounts for approximately 10-15% of all strokes. X-ray computed tomography (CT) scanning is largely used to assess the location and volume of these hemorrhages. Manual segmentation of the CT scan using planimetry by an expert reader is the gold standard for volume estimation, but is time-consuming and has within- and across-reader variability. We propose a fully automated segmentation approach using a random forest algorithm with features extracted from X-ray computed tomography (CT) scans. The Minimally Invasive Surgery plus rt-PA in ICH Evacuation (MISTIE) trial was a multi-site Phase II clinical trial that tested the safety of hemorrhage removal using recombinant-tissue plasminogen activator (rt-PA). For this analysis, we use 112 baseline CT scans from patients enrolled in the MISTIE trial, one CT scan per patient. ICH was manually segmented on these CT scans by expert readers. We derived a set of imaging predictors from each scan. Using 10 randomly-selected scans, we used a first-pass voxel selection procedure based on quantiles of a set of predictors and then built 4 models estimating the voxel-level probability of ICH. The models used were: 1) logistic regression, 2) logistic regression with a penalty on the model parameters using LASSO, 3) a generalized additive model (GAM) and 4) a random forest classifier. The remaining 102 scans were used for model validation. For each validation scan, the model predicted the probability of ICH at each voxel. These voxel-level probabilities were then thresholded to produce binary segmentations of the hemorrhage. These masks were compared to the manual segmentations using the Dice Similarity Index (DSI) and the correlation of hemorrhage volumes between the two segmentations. We tested equality of median DSI using the Kruskal-Wallis test across the 4 models. We tested equality of the median DSI from sets of 2 models using a Wilcoxon
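
    As a rough illustration of the voxel-level pipeline just described (probability prediction with a random forest, thresholding, and Dice evaluation), the sketch below uses scikit-learn on made-up placeholder data; it is not the authors' R implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Placeholder voxel-level data: rows are voxels, columns are image-derived predictors.
X_train, y_train = rng.normal(size=(5000, 10)), rng.integers(0, 2, size=5000)
X_valid, y_valid = rng.normal(size=(2000, 10)), rng.integers(0, 2, size=2000)

rf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
rf.fit(X_train, y_train)

# Voxel-level probability of ICH, thresholded to a binary segmentation mask.
p_ich = rf.predict_proba(X_valid)[:, 1]
mask = p_ich > 0.5

def dice(a, b):
    """Dice Similarity Index: 2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

print(dice(mask, y_valid.astype(bool)))
```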

  13. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  14. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    Full Text Available This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  15. OPTIMAL ESTIMATION OF RANDOM PROCESSES ON THE CRITERION OF MAXIMUM A POSTERIORI PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. A. Lobaty

    2016-01-01

    Full Text Available The paper addresses the problem of obtaining equations for the a posteriori probability density of a stochastic Markov process with a linear measurement model. Unlike common approaches that take the minimum mean-square estimation error as the optimization criterion, here the criterion is the maximum of the a posteriori probability density of the process being estimated. The a priori probability density of the estimated Gaussian process is initially assumed to be a differentiable function, which allows it to be expanded in a Taylor series without intermediate transformations such as characteristic functions and harmonic decomposition. For small time intervals the probability density of the measurement error vector is, by definition, taken to be Gaussian with zero expectation. This makes it possible to obtain a mathematical expression for the residual function, which characterizes the deviation of the actual measurement process from its mathematical model. The optimal a posteriori estimate of the state vector is determined under the assumption that this estimate coincides with the maximum of the a posteriori probability density. On the basis of Bayes' formula for the a priori and a posteriori probability densities, this leads to a Stratonovich-Kushner equation. Using the Stratonovich-Kushner equation with different types and values of the drift vector and diffusion matrix of a Markov stochastic process, a variety of filtering, identification, smoothing, and state-forecasting tasks can be solved, for both continuous and discrete systems. Discrete-continuous implementations of the developed a posteriori estimation algorithms yield specific discrete algorithms suitable for implementation on an on-board computer, e.g. of a mobile robot system.
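
    In symbols, the criterion described above is the generic maximum a posteriori (MAP) rule obtained from Bayes' formula, stated here in its standard static form rather than as the paper's Stratonovich-Kushner derivation:

```latex
p(x \mid z) \;=\; \frac{p(z \mid x)\, p(x)}{\int p(z \mid x')\, p(x')\, \mathrm{d}x'},
\qquad
\hat{x}_{\mathrm{MAP}} \;=\; \arg\max_{x}\; p(x \mid z).
```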

  16. Looping probability of random heteropolymers helps to understand the scaling properties of biopolymers.

    Science.gov (United States)

    Zhan, Y; Giorgetti, L; Tiana, G

    2016-09-01

    Random heteropolymers are a minimal description of biopolymers and can provide a theoretical framework to investigate the formation of loops in biophysical experiments. The looping probability as a function of polymer length was observed to display anomalous scaling exponents in some biopolymers, such as chromosomes in cell nuclei or long RNA chains. Combining a two-state model with self-adjusting simulated-tempering calculations, we calculate numerically the looping properties of several realizations of the random interactions within the chain. We find a continuous set of exponents upon varying the temperature, which arises from finite-size effects and is amplified by the disorder of the interactions. We suggest that this could provide a simple explanation for the anomalous scaling exponents found in experiments. In addition, our results have important implications, notably for the study of chromosome folding, as they show that scaling exponents cannot be the sole criterion for testing hypothesis-driven models of chromosome architecture.

  17. Probability maps as a measure of reliability for intervisibility analysis

    Directory of Open Access Journals (Sweden)

    Joksić Dušan

    2005-01-01

    Full Text Available Digital terrain models (DTMs) represent segments of spatial databases related to the presentation of terrain features and landforms. Square grid elevation models (DEMs) have emerged as the most widely used structure during the past decade because of their simplicity and simple computer implementation. They have become an important segment of Topographic Information Systems (TIS), storing natural and artificial landscape in the form of digital models. This kind of data structure is especially suitable for morphometric terrain evaluation and analysis, which is very important in environmental and urban planning and Earth surface modeling applications. One of the most frequently used functionalities of Geographical Information Systems software packages is intervisibility, or viewshed, analysis of terrain. Intervisibility determination from analog topographic maps can be very laborious, because of the large number of profiles that have to be extracted and compared. Terrain representation in the form of DEM databases facilitates this task. This paper describes a simple algorithm for terrain viewshed analysis using DEM database structures, taking into consideration the influence of the uncertainties of such data on the results obtained. The concept of probability maps is introduced as a means of evaluating the results, and is presented as a thematic display.
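
    As a rough companion to the viewshed discussion above, a single line-of-sight test on a gridded DEM can be sketched as follows; this is an illustrative toy, not the paper's probability-map algorithm, and the small DEM and observer height are assumptions.

```python
import numpy as np

def line_of_sight(dem, obs, tgt, obs_height=1.7):
    """Return True if the target cell is visible from the observer cell.
    dem: 2-D array of elevations; obs, tgt: (row, col) indices."""
    n = int(max(abs(tgt[0] - obs[0]), abs(tgt[1] - obs[1]))) + 1
    rows = np.linspace(obs[0], tgt[0], n)
    cols = np.linspace(obs[1], tgt[1], n)
    profile = dem[rows.round().astype(int), cols.round().astype(int)]
    z0 = dem[obs] + obs_height
    # Elevation of the sight line, interpolated linearly from observer to target.
    sight = z0 + (profile[-1] - z0) * np.linspace(0.0, 1.0, n)
    return bool(np.all(profile[1:-1] <= sight[1:-1]))

dem = np.array([[10, 10, 10, 10],
                [10, 12, 10, 10],
                [10, 10, 10,  9]], dtype=float)
print(line_of_sight(dem, (0, 0), (2, 3)))  # False: the 12 m bump blocks the view
```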

  18. The Development and Application of Random Match Probabilities to Firearm and Toolmark Identification.

    Science.gov (United States)

    Murdock, John E; Petraco, Nicholas D K; Thornton, John I; Neel, Michael T; Weller, Todd J; Thompson, Robert M; Hamby, James E; Collins, Eric R

    2017-05-01

    The field of firearms and toolmark analysis has encountered deep scrutiny of late, stemming from a handful of voices, primarily in the law and statistical communities. While strong scrutiny is a healthy and necessary part of any scientific endeavor, much of the current criticism leveled at firearm and toolmark analysis is, at best, misinformed and, at worst, punditry. One of the most persistent criticisms stems from the view that as the field lacks quantified random match probability data (or at least a firm statistical model) with which to calculate the probability of a false match, all expert testimony concerning firearm and toolmark identification or source attribution is unreliable and should be ruled inadmissible. However, this critique does not stem from the hard work of actually obtaining data and performing the scientific research required to support or reject current findings in the literature. Although there are sound reasons (described herein) why there is currently no unifying probabilistic model for the comparison of striated and impressed toolmarks as there is in the field of forensic DNA profiling, much statistical research has been, and continues to be, done to aid the criminal justice system. This research has thus far shown that error rate estimates for the field are very low, especially when compared to other forms of judicial error. The first purpose of this paper is to point out the logical fallacies in the arguments of a small group of pundits, who advocate a particular viewpoint but cloak it as fact and research. The second purpose is to give a balanced review of the literature regarding random match probability models and statistical applications that have been carried out in forensic firearm and toolmark analysis. © 2017 American Academy of Forensic Sciences.

  19. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
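
    A minimal numerical illustration of the setting analyzed (discrete least squares from noisy evaluations at random points) might look like the sketch below; the target function, uniform sampling measure, and noise level are illustrative assumptions rather than the paper's test cases.

```python
import numpy as np

rng = np.random.default_rng(3)

def discrete_least_squares(f, degree, n_samples, noise_sd=0.05):
    """Least-squares polynomial approximation of f on [-1, 1] from noisy
    evaluations at points drawn from the uniform sampling measure."""
    x = rng.uniform(-1.0, 1.0, size=n_samples)
    y = f(x) + noise_sd * rng.standard_normal(n_samples)
    return np.polynomial.legendre.legfit(x, y, degree)

f = lambda x: np.exp(x) * np.sin(3 * x)
coeffs = discrete_least_squares(f, degree=8, n_samples=400)

xg = np.linspace(-1, 1, 1000)
err = np.sqrt(np.mean((np.polynomial.legendre.legval(xg, coeffs) - f(xg)) ** 2))
print(f"RMS error on [-1,1]: {err:.2e}")
```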

  20. Assessing Uncertainties of Theoretical Atomic Transition Probabilities with Monte Carlo Random Trials

    Directory of Open Access Journals (Sweden)

    Alexander Kramida

    2014-04-01

    Full Text Available This paper suggests a method of evaluation of uncertainties in calculated transition probabilities by randomly varying parameters of an atomic code and comparing the results. A control code has been written to randomly vary the input parameters with a normal statistical distribution around initial values with a certain standard deviation. For this particular implementation, Cowan’s suite of atomic codes (R.D. Cowan, The Theory of Atomic Structure and Spectra, Berkeley, CA: University of California Press, 1981) was used to calculate radiative rates of magnetic-dipole and electric-quadrupole transitions within the ground configuration of titanium-like iron, Fe V. The Slater parameters used in the calculations were adjusted to fit experimental energy levels with Cowan’s least-squares fitting program, RCE. The standard deviations of the fitted parameters were used as input of the control code providing the distribution widths of random trials for these parameters. Propagation of errors through the matrix diagonalization and summation of basis state expansions leads to significant variations in the resulting transition rates. These variations vastly differ in their magnitude for different transitions, depending on their sensitivity to errors in parameters. With this method, the rate uncertainty can be individually assessed for each calculated transition.
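
    The general recipe, randomly perturbing the fitted input parameters with a normal distribution and examining the spread of the computed outputs, can be sketched generically as below; the toy model stands in for an atomic-structure code and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def model(params):
    """Placeholder for a code that maps input parameters to a computed quantity
    (e.g., a transition rate); here just a smooth nonlinear function."""
    a, b, c = params
    return a * np.exp(-b) + np.sqrt(abs(c))

params0 = np.array([2.0, 0.5, 1.2])   # fitted parameter values (assumed)
sigmas  = np.array([0.1, 0.05, 0.2])  # standard deviations from the fit (assumed)

trials = np.array([model(params0 + sigmas * rng.standard_normal(3))
                   for _ in range(10_000)])
print(f"value = {model(params0):.4f}, Monte Carlo uncertainty = {trials.std():.4f}")
```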

  1. Measuring the genetic structure of the pollen pool as the probability of paternal identity.

    Science.gov (United States)

    Smouse, P E; Robledo-Arnuncio, J J

    2005-06-01

    Contemporary pollen flow in forest plant species is measured by the probability of paternal identity (PPI) for two randomly sampled offspring drawn from a single female, contrasted with the PPI for two random offspring drawn from different females. Two different estimation strategies have emerged: (a) an indirect approach, using the 'genetic structure' of the pollen received by different mothers, and (b) a direct approach, based on parentage analysis. The indirect strategy is somewhat limited by its assumptions, but is widely useful. The direct approach is most appropriate where a large majority of the true fathers can be identified exactly, which is sometimes possible with high-resolution SSR markers. Using the parentage approach, we develop estimates of PPI, showing that the obvious estimates are severely biased, and providing an unbiased alternative. We then illustrate the methods with SSR data from a 36-tree isolated population of Pinus sylvestris from the Meseta region of Spain, for which categorical paternity assignment was available for over 95% of offspring. For all the females combined, we estimate that PPI = 0.0425, indicating uneven male reproductive contributions. Different (but overlapping) arrays of males pollinate different females, and for the average female, PPI = 0.317, indicating substantial 'pollen structure' for the population. We also relate the direct measures of PPI to those available from indirect approaches, and show that they are generally comparable.

  2. Continuous-time random walk: exact solutions for the probability density function and first two moments

    Energy Technology Data Exchange (ETDEWEB)

    Kwok Sau Fa [Departamento de Fisica, Universidade Estadual de Maringa, Av. Colombo 5790, 87020-900 Maringa-PR (Brazil); Joni Fat, E-mail: kwok@dfi.uem.br [Jurusan Teknik Elektro-Fakultas Teknik, Universitas Tarumanagara, Jl. Let. Jend. S. Parman 1, Blok L, Lantai 3 Grogol, Jakarta 11440 (Indonesia)

    2011-10-15

    We consider the decoupled continuous-time random walk model with a finite characteristic waiting time and finite jump length variance. We take the waiting time probability density function (PDF) given by a combination of the exponential and the Mittag-Leffler function. Using this waiting time PDF, we investigate the diffusion behavior for all times. We obtain exact solutions for the first two moments and the PDF for the force-free and linear force cases. Due to the finite characteristic waiting time and jump length variance, the model presents, for the force-free case, normal diffusive behavior in the long-time limit. Further, the model can describe anomalous behavior at intermediate times.

  3. rft1d: Smooth One-Dimensional Random Field Upcrossing Probabilities in Python

    Directory of Open Access Journals (Sweden)

    Todd C. Pataky

    2016-07-01

    Full Text Available Through topological expectations regarding smooth, thresholded n-dimensional Gaussian continua, random field theory (RFT) describes probabilities associated with both the field-wide maximum and threshold-surviving upcrossing geometry. A key application of RFT is a correction for multiple comparisons which affords field-level hypothesis testing for both univariate and multivariate fields. For unbroken isotropic fields just one parameter in addition to the mean and variance is required: the ratio of a field's size to its smoothness. Ironically the simplest manifestation of RFT (1D unbroken fields) has rarely surfaced in the literature, even during its foundational development in the late 1970s. This Python package implements 1D RFT primarily for exploring and validating RFT expectations, but also describes how it can be applied to yield statistical inferences regarding sets of experimental 1D fields.

  4. Information Fusion via the Wasserstein Barycenter in the Space of Probability Measures: Direct Fusion of Empirical Measures and Gaussian Fusion with Unknown Correlation

    Science.gov (United States)

    2014-07-14

    ... The high-level idea is to consider the data fusion result as the probability measure that is closest to a given collection of input measures in the sense ... the general fusion result explicitly given two randomly sampled (discrete) empirical measures which typically have no common underlying support. ...
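
    For the one-dimensional case there is a well-known shortcut: the Wasserstein-2 barycenter's quantile function is the weighted average of the input quantile functions, so equal-size empirical measures can be fused by averaging sorted samples. The sketch below illustrates only this special case and is not the report's general construction.

```python
import numpy as np

rng = np.random.default_rng(5)

def wasserstein2_barycenter_1d(samples_a, samples_b, weights=(0.5, 0.5)):
    """Equal-size empirical measures on the real line: the W2 barycenter is
    obtained by averaging the quantile functions, i.e. the sorted samples."""
    qa, qb = np.sort(samples_a), np.sort(samples_b)
    return weights[0] * qa + weights[1] * qb

a = rng.normal(loc=-2.0, scale=1.0, size=1000)   # first sensor's empirical measure
b = rng.normal(loc=+3.0, scale=0.5, size=1000)   # second sensor's empirical measure
fused = wasserstein2_barycenter_1d(a, b)
print(fused.mean(), fused.std())  # ≈ 0.5 and ≈ 0.75 for these two Gaussians
```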

  5. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    Full Text Available The intended aim of this article is to present approximation results for exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a function of distance, makes it possible to determine the range of the EMERGENCY signal for a pre-set confidence level.
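
    A logistic approximation of measured reception probability versus distance can be obtained with an ordinary nonlinear least-squares fit, as in the sketch below; the sample distances, probabilities, and parameterization are made up for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(d, d50, k):
    """Reception probability as a decreasing logistic function of distance d:
    P(d) = 1 / (1 + exp(k * (d - d50))), with d50 the 50%-probability range."""
    return 1.0 / (1.0 + np.exp(k * (d - d50)))

# Illustrative measurements: distance (km) vs. observed reception probability.
d = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
p = np.array([0.99, 0.98, 0.95, 0.85, 0.55, 0.25, 0.08, 0.02])

popt, pcov = curve_fit(logistic, d, p, p0=[50.0, 0.1])
d50, k = popt
print(f"estimated 50% range: {d50:.1f} km, slope: {k:.3f} 1/km")
print(f"range at 90% reception probability: {d50 - np.log(9) / k:.1f} km")
```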

  6. Derivation of the quantum probability law from minimal non-demolition measurement

    Energy Technology Data Exchange (ETDEWEB)

    Herbut, F [Serbian Academy of Sciences and Arts, Knez Mihajlova 35, 11000 Belgrade (Serbia)

    2007-08-24

    One more derivation of the quantum probability rule is presented in order to shed more light on the versatile aspects of this fundamental law. It is shown that the change of state in minimal quantum non-demolition measurement, also known as ideal measurement, implies the probability law in a simple way. Namely, the very requirement of minimal change of state, put in proper mathematical form, gives the well-known Lueders formula, which contains the probability rule.
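
    For reference, the Lüders formula referred to above, in its standard textbook form: for an ideal (minimally disturbing) measurement of a projector P on a state ρ, the outcome probability and post-measurement state are

```latex
\Pr(P \mid \rho) \;=\; \operatorname{Tr}(\rho P),
\qquad
\rho \;\longmapsto\; \rho' \;=\; \frac{P \rho P}{\operatorname{Tr}(\rho P)} .
```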

  7. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular measures are obtained to measure the mutual divergence among two or more probability distributions.

  8. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Full Text Available Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^PT|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^PT), was applied with k=0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k=1,2,…) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α=1/2), (standard) two-qubit (α=1), and two-quater[nionic]bit (α=2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.

  9. [Construction and application of probability distribution model for mixed forests measurement factors].

    Science.gov (United States)

    Liu, En-Bin; Tang, Meng-Ping; Shi, Yong-Jun; Zhou, Guo-Mo; Li, Yong-Fu

    2009-11-01

    Aiming at the deficiencies in existing research on probability distribution models for tree measurement factors in mixed forests, a joint maximum entropy probability density function was put forward, based on the maximum entropy principle. This function had the following characteristics: 1) each of its elements was linked to a maximum entropy function and could therefore integrate the information about the probability distributions of the measurement factors of the main tree species in mixed forests; 2) it had a double-weighted probability expression, making it possible to reflect the characteristics of the complex structure of mixed forests and, by making full use of the information about the probability distributions of the measurement factors of the main tree species, to reflect the probability distribution of tree measurement factors of mixed forests accurately and completely; and 3) the joint maximum entropy probability density function was succinct in structure and excellent in performance. The model was applied and tested in two sampling plots in Tianmu Mountain Nature Reserve. The fitting precision (R2 = 0.9655) and testing accuracy (R2 = 0.9772) were both high, suggesting that this model can be used as a probability distribution model for tree measurement factors in mixed forests and provides a feasible method to fully understand the comprehensive structure of mixed forests.

  10. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
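
    A compact numerical sketch of that engineering approach (colour white Gaussian noise in the Fourier domain, then impose the target marginal through the Gaussian CDF and the target's inverse CDF) is given below; the example spectrum and exponential marginal are assumptions, and the marginal transformation slightly perturbs the imposed spectrum, as the abstract itself notes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def random_field(n, psd, marginal_ppf):
    """Simulate an n x n random field: colour white Gaussian noise in the
    Fourier domain, then map the Gaussian marginal through its CDF and the
    inverse CDF (ppf) of the desired amplitude distribution."""
    kx = np.fft.fftfreq(n)
    ky = np.fft.fftfreq(n)
    k = np.hypot(*np.meshgrid(kx, ky, indexing="ij"))
    white = np.fft.fft2(rng.standard_normal((n, n)))
    field = np.real(np.fft.ifft2(white * np.sqrt(psd(k))))
    field = (field - field.mean()) / field.std()   # standardize the Gaussian field
    return marginal_ppf(stats.norm.cdf(field))     # impose the target marginal

# Example: roughly 1/(k^2 + k0^2) spectrum with exponentially distributed amplitudes.
psd = lambda k: 1.0 / (k**2 + 0.01**2)
field = random_field(256, psd, stats.expon.ppf)
print(field.shape, field.mean(), field.min())
```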

  11. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  12. Nonorthogonal projective positive-operator-value measurement of photon polarization states with unit probability of success

    Science.gov (United States)

    Ahnert, S. E.; Payne, M. C.

    2004-01-01

    In this paper we describe a scheme for performing a nonorthogonal projective positive-operator-value measurement of any arbitrary single-photon polarization input state with unit probability of success. While this probability is reached in the limit of infinite cycles of states through the apparatus, only one actual physical setup is required for a feasible implementation. Specifically, our setup implements a set of three nonorthogonal measurement operators at angles of 120° to each other.
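
    The symmetric "trine" POVM alluded to can be written in its standard textbook form (assumed here rather than quoted from the paper): with polarization states |ψ_j⟩ = cos θ_j |H⟩ + sin θ_j |V⟩ at θ_j = 0°, 120°, 240°, the elements

```latex
E_j \;=\; \tfrac{2}{3}\,\lvert\psi_j\rangle\langle\psi_j\rvert, \quad j = 1, 2, 3,
\qquad
\sum_{j=1}^{3} E_j \;=\; \mathbb{1},
\qquad
\Pr(j \mid \rho) \;=\; \operatorname{Tr}(\rho\, E_j).
```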

  13. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  14. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  15. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.

  16. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria which are suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can be synchronized at partly unknown transition probabilities. Besides, a multiple integral approach is also proposed to strengthen the Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delay, and the impulsive synchronization criteria are then solvable in terms of a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    Science.gov (United States)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
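
    The zero-inflated Beta density mentioned above has the standard generic form, stated here as background rather than as the paper's exact parameterization: with zero-inflation probability π and Beta parameters a, b,

```latex
f(y) \;=\; \pi\,\mathbf{1}\{y = 0\}
\;+\; (1 - \pi)\,\frac{y^{\,a-1}(1-y)^{\,b-1}}{B(a,b)}\,\mathbf{1}\{0 < y < 1\}.
```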

  18. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    Science.gov (United States)

    Marshman, Emily; Singh, Chandralekha

    2017-03-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations.

  19. Ordinal probability effect measures for group comparisons in multinomial cumulative link models.

    Science.gov (United States)

    Agresti, Alan; Kateri, Maria

    2017-03-01

    We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/2)/[1+exp(β/2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.

  20. Probability effects in anticipation investigated with online behavioural measures (mouse tracking)

    DEFF Research Database (Denmark)

    Bruhn, Pernille; Bærentsen, Klaus B.

    Background: Anticipation of upcoming events is an adaptive mechanism that ensures quick and accurate perception and action. Consequently, lower Reaction Time (RT) and higher accuracy are found in response to events that can be adequately anticipated. However, events in the world happen with varying... degrees of probability depending on context and preceding events. It is therefore of fundamental importance to investigate how knowledge of probability modulates anticipatory processes. Previous studies found effects of stimulus probability on RT and accuracy, but these are only indirect and post......-hoc behavioural measures of the anticipatory processes involved. Methods: The present study investigates how knowledge of probability affects real-time anticipatory processes. Behaviour is monitored online by tracking the computer mouse trajectory leading to a required response (mouse-click on Target

  1. Frequency format diagram and probability chart for breast cancer risk communication: a prospective, randomized trial

    Directory of Open Access Journals (Sweden)

    Wahner-Roedler Dietlind

    2008-10-01

    Full Text Available Abstract Background Breast cancer risk education enables women to make informed decisions regarding their options for screening and risk reduction. We aimed to determine whether patient education regarding breast cancer risk using a bar graph, with or without a frequency format diagram, improved the accuracy of risk perception. Methods We conducted a prospective, randomized trial among women at increased risk for breast cancer. The main outcome measurement was patients' estimation of their breast cancer risk before and after education with a bar graph (BG group) or a bar graph plus a frequency format diagram (BG+FF group), which was assessed by previsit and postvisit questionnaires. Results Of 150 women in the study, 74 were assigned to the BG group and 76 to the BG+FF group. Overall, 72% of women overestimated their risk of breast cancer. The improvement in accuracy of risk perception from the previsit to the postvisit questionnaire (BG group, 19% to 61%; BG+FF group, 13% to 67%) was not significantly different between the 2 groups (P = .10). Among women who inaccurately perceived very high risk (≥ 50% risk), inaccurate risk perception decreased significantly in the BG+FF group (22% to 3%) compared with the BG group (28% to 19%) (P = .004). Conclusion Breast cancer risk communication using a bar graph plus a frequency format diagram can improve the short-term accuracy of risk perception among women perceiving inaccurately high risk.

  2. On the probability of cost-effectiveness using data from randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Willan Andrew R

    2001-09-01

    Full Text Available Abstract Background Acceptability curves have been proposed for quantifying the probability that a treatment under investigation in a clinical trial is cost-effective. Various definitions and estimation methods have been proposed. Loosely speaking, all the definitions, Bayesian or otherwise, relate to the probability that the treatment under consideration is cost-effective as a function of the value placed on a unit of effectiveness. These definitions are, in fact, expressions of the certainty with which the current evidence would lead us to believe that the treatment under consideration is cost-effective, and are dependent on the amount of evidence (i.e. sample size). Methods An alternative for quantifying the probability that the treatment under consideration is cost-effective, which is independent of sample size, is proposed. Results Non-parametric methods are given for point and interval estimation. In addition, these methods provide a non-parametric estimator and confidence interval for the incremental cost-effectiveness ratio. An example is provided. Conclusions The proposed parameter for quantifying the probability that a new therapy is cost-effective is superior to the acceptability curve because it is not sample size dependent and because it can be interpreted as the proportion of patients who would benefit if given the new therapy. Non-parametric methods are used to estimate the parameter and its variance, providing the appropriate confidence intervals and test of hypothesis.

  3. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials. ...

  4. PItcHPERFeCT: Primary Intracranial Hemorrhage Probability Estimation using Random Forests on CT

    Directory of Open Access Journals (Sweden)

    John Muschelli

    2017-01-01

    Results: All results presented are for the 102 scans in the validation set. The median DSI for each model was: 0.89 (logistic), 0.885 (LASSO), 0.88 (GAM), and 0.899 (random forest). Using the random forest results in a slightly higher median DSI compared to the other models. After Bonferroni correction, the hypothesis of equality of median DSI was rejected only when comparing the random forest DSI to the DSI from the logistic (p < 0.001), LASSO (p < 0.001), or GAM (p < 0.001) models. In practical terms the difference between the random forest and the logistic regression is quite small. The correlation (95% CI) between the volume from manual segmentation and the predicted volume was 0.93 (0.90, 0.95) for the random forest model. These results indicate that the random forest approach can achieve accurate segmentation of ICH in a population of patients from a variety of imaging centers. We provide an R package (https://github.com/muschellij2/ichseg) and a Shiny R application online (http://johnmuschelli.com/ich_segment_all.html) for implementing and testing the proposed approach.

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think....... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur....

  6. Experimental estimation of the photons visiting probability profiles in time-resolved diffuse reflectance measurement.

    Science.gov (United States)

    Sawosz, P; Kacprzak, M; Weigl, W; Borowska-Solonynko, A; Krajewski, P; Zolek, N; Ciszek, B; Maniewski, R; Liebert, A

    2012-12-07

    A time-gated intensified CCD camera was applied for time-resolved imaging of light penetrating in an optically turbid medium. Spatial distributions of light penetration probability in the plane perpendicular to the axes of the source and the detector were determined at different source positions. Furthermore, visiting probability profiles of diffuse reflectance measurement were obtained by the convolution of the light penetration distributions recorded at different source positions. Experiments were carried out on homogeneous phantoms, more realistic two-layered tissue phantoms based on the human skull filled with Intralipid-ink solution and on cadavers. It was noted that the photons visiting probability profiles depend strongly on the source-detector separation, the delay between the laser pulse and the photons collection window and the complex tissue composition of the human head.

  7. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

    Full Text Available ... behind spectral shaping is to select a randomization technique with its associated PDF to analytically obtain a specified spectral profile [21]. The benefit of this idea is being able to achieve some level of controllability over the spectral content...
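
    A toy illustration of the effect being exploited, randomizing the pulse position inside each PWM period according to a chosen PDF so that harmonic energy is spread across the spectrum, is sketched below; the uniform jitter PDF and waveform parameters are assumptions, not the maximum-entropy distributions derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def pwm_waveform(n_periods, samples_per_period, duty, jitter_pdf=None):
    """Unit-amplitude PWM train; if jitter_pdf is given, the pulse position
    inside each period is shifted by a random draw from that PDF (in samples)."""
    x = np.zeros(n_periods * samples_per_period)
    width = int(duty * samples_per_period)
    for k in range(n_periods):
        start = k * samples_per_period
        if jitter_pdf is not None:
            start += int(jitter_pdf())
        x[start:start + width] = 1.0
    return x

N, SPP = 2000, 64
fixed = pwm_waveform(N, SPP, duty=0.5)
jittered = pwm_waveform(N, SPP, duty=0.5,
                        jitter_pdf=lambda: rng.integers(0, SPP // 2))

for name, x in [("fixed", fixed), ("randomized", jittered)]:
    spec = np.abs(np.fft.rfft(x - x.mean())) ** 2
    # A lower peak-to-average ratio indicates a flatter, shaped spectrum.
    print(name, "peak/average spectral ratio:", spec.max() / spec.mean())
```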

  8. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper they introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked in this report by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(236U(d,pf))/P(238U(d,pf)), which serves as a surrogate for the known cross-section ratio of 236U(n,f)/238U(n,f). In addition, the P(238U(d,d′f))/P(236U(d,d′f)) ratio as a surrogate for the 237U(n,f)/235U(n,f) cross-section ratio was measured for the first time in an unprecedented range of excitation energies.

  9. Tail probabilities and partial moments for quadratic forms in multivariate generalized hyperbolic random vectors

    NARCIS (Netherlands)

    Broda, S.A.

    2013-01-01

    Countless test statistics can be written as quadratic forms in certain random vectors, or ratios thereof. Consequently, their distribution has received considerable attention in the literature. Except for a few special cases, no closed-form expression for the cdf exists, and one resorts to numerical

  10. Is extrapair mating random? On the probability distribution of extrapair young in avian broods

    NARCIS (Netherlands)

    Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan

    2007-01-01

    A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review

  11. Spectral discrete probability density function of measured wind turbine noise in the far field

    Directory of Open Access Journals (Sweden)

    Payam Ashtiani

    2015-04-01

    Full Text Available Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for 1/3rd Octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low frequency noise sources.

  12. New accurate measurements of neutron emission probabilities for relevant fission products

    Directory of Open Access Journals (Sweden)

    Agramunt J.

    2017-01-01

    Full Text Available We have performed new accurate measurements of the beta-delayed neutron emission probability for ten isotopes of the elements Y, Sb, Te and I. These are fission products that either have a significant contribution to the fraction of delayed neutrons in reactors or are relatively close to the path of the astrophysical r process. The measurements were performed with isotopically pure radioactive beams using a constant and high efficiency neutron counter and a low noise beta detector. Preliminary results are presented for six of the isotopes and compared with previous measurements and theoretical calculations.

  13. Spectral discrete probability density function of measured wind turbine noise in the far field.

    Science.gov (United States)

    Ashtiani, Payam; Denison, Adelaide

    2015-01-01

    Of interest is the spectral character of wind turbine noise at typical residential set-back distances. In this paper, a spectral statistical analysis has been applied to immission measurements conducted at three locations. This method provides discrete probability density functions for the Turbine ONLY component of the measured noise. This analysis is completed for one-third octave sound levels, at integer wind speeds, and is compared to existing metrics for measuring acoustic comfort as well as previous discussions on low-frequency noise sources.

  14. Random errors of oceanic monthly rainfall derived from SSM/I using probability distribution functions

    Science.gov (United States)

    Chang, Alfred T. C.; Chiu, Long S.; Wilheit, Thomas T.

    1993-01-01

    Global averages and random errors associated with the monthly oceanic rain rates derived from the Special Sensor Microwave/Imager (SSM/I) data using the technique developed by Wilheit et al. (1991) are computed. Accounting for the beam-filling bias, a global annual average rain rate of 1.26 m is computed. The error estimation scheme is based on the existence of independent (morning and afternoon) estimates of the monthly mean. Calculations show overall random errors of about 50-60 percent for each 5 deg x 5 deg box. The results are insensitive to different sampling strategy (odd and even days of the month). Comparison of the SSM/I estimates with raingage data collected at the Pacific atoll stations showed a low bias of about 8 percent, a correlation of 0.7, and an rms difference of 55 percent.

  15. Measurement of 240Pu Angular Momentum Dependent Fission Probabilities Using the (α ,α') Reaction

    Science.gov (United States)

    Koglin, Johnathon; Burke, Jason; Fisher, Scott; Jovanovic, Igor

    2017-09-01

    The surrogate reaction method often lacks the theoretical framework and the necessary experimental data to constrain models, especially when rectifying differences in angular momentum states between the desired and surrogate reactions. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(α ,α' f) reaction - a surrogate for 239Pu(n , f) - and fission fragment angular distributions. Fission probability measurements were performed at a beam energy of 35.9(2) MeV at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Fission fragment angular distributions were measured in six bins from 4.5 MeV to 8.0 MeV and fit to the expected distributions, which depend on the vibrational and rotational excitations at the saddle point. In this way, the contributions to the total fission probability from specific states of K, the angular momentum projection on the symmetry axis, are extracted. A sizable data collection is presented to be considered when constraining microscopic cross section calculations.

  16. Measurement of spin-flip probabilities for ultracold neutrons interacting with nickel phosphorus coated surfaces

    CERN Document Server

    Tang, Z; Brandt, A; Callahan, N B; Clayton, S M; Currie, S A; Ito, T M; Makela, M; Masuda, Y; Morris, C L; Pattie, R; Ramsey, J C; Salvat, D J; Saunders, A; Young, A R

    2015-01-01

    We report a measurement of the spin-flip probabilities for ultracold neutrons interacting with surfaces coated with nickel phosphorus. For 50 $\mu$m thick nickel phosphorus coated on stainless steel, the spin-flip probability per bounce was found to be $\beta_{\rm NiP\;on\;SS} = (3.3^{+1.8}_{-5.6}) \times 10^{-6}$. For 50 $\mu$m thick nickel phosphorus coated on aluminum, the spin-flip probability per bounce was found to be $\beta_{\rm NiP\;on\;Al} = (3.6^{+2.1}_{-5.9}) \times 10^{-6}$. For the copper guide used as reference, the spin-flip probability per bounce was found to be $\beta_{\rm Cu} = (6.7^{+5.0}_{-2.5}) \times 10^{-6}$. Nickel phosphorus coated stainless steel or aluminum provides a solution when UCN guides that have a high Fermi potential and are low-cost, mechanically robust, and non-depolarizing are needed.

  17. Probability calculus of fractional order and fractional Taylor's series application to Fokker-Planck equation and information of non-random functions

    Energy Technology Data Exchange (ETDEWEB)

    Jumarie, Guy [Department of Mathematics, University of Quebec at Montreal, P.O. Box 8888, Downtown Station, Montreal, Qc, H3C 3P8 (Canada)], E-mail: jumarie.guy@uqam.ca

    2009-05-15

    A probability distribution of fractional (or fractal) order is defined by the measure $\mu\{dx\} = p(x)\,(dx)^{\alpha}$, $0 < \alpha < 1$. Combining this definition with the fractional Taylor's series $f(x+h) = E_{\alpha}(D_x^{\alpha} h^{\alpha})\, f(x)$ provided by the modified Riemann-Liouville definition, one can expand a probability calculus parallel to the standard one. A Fourier transform of fractional order using the Mittag-Leffler function is introduced, together with its inversion formula, and it provides a suitable generalization of the characteristic function of fractal random variables. It appears that the state moments of fractional order are especially relevant. The main properties of this fractional probability calculus are outlined; it is shown that it provides a sound approach to Fokker-Planck equations which are fractional in both space and time, and it provides new results in the information theory of non-random functions.

  18. A note on the generalized fractal dimensions of a probability measure

    OpenAIRE

    Guérin, Charles-Antoine

    2001-01-01

    We prove the following result on the generalized fractal dimensions $D^{\pm}_q$ of a probability measure $\mu$ on $R^n$. Let $g$ be a complex-valued measurable function on $R^n$ satisfying the following conditions: (1) $g$ is rapidly decreasing at infinity, (2) $g$ is continuous and nonvanishing at (at least) one point, (3) $\int g \neq 0$. Define the partition function $\Lambda_a(\mu,q) = a^{n(q-1)}\, \|g_a * \mu\|_q^q$, where $g_a(x) = a^{-n} g(a^{-1}x)$ and $*$ is the convolution in $R^n$. Then for all $...

  19. Excursion sets of infinitely divisible random fields with convolution equivalent Lévy measure

    DEFF Research Database (Denmark)

    Rønn-Nielsen, Anders; Jensen, Eva B. Vedel

    2017-01-01

    We consider a continuous, infinitely divisible random field in ℝ^d, d = 1, 2, 3, given as an integral of a kernel function with respect to a Lévy basis with convolution equivalent Lévy measure. For a large class of such random fields, we compute the asymptotic probability that the excursion set ...... at level x contains some rotation of an object with fixed radius as x → ∞. Our main result is that the asymptotic probability is equivalent to the right tail of the underlying Lévy measure....

  20. Excursion sets of infinitely divisible random fields with convolution equivalent Lévy measure

    DEFF Research Database (Denmark)

    Rønn-Nielsen, Anders; Jensen, Eva B. Vedel

    We consider a continuous, infinitely divisible random field in ℝ^d, d = 1, 2, 3, given as an integral of a kernel function with respect to a Lévy basis with convolution equivalent Lévy measure. For a large class of such random fields we compute the asymptotic probability that the excursion set at level x contains some rotation of an object with fixed radius as x → ∞. Our main result is that the asymptotic probability is equivalent to the right tail of the underlying Lévy measure.

  1. Measurement of Angular-Momentum-Dependent Fission Probabilities of 240Pu

    Science.gov (United States)

    Koglin, Johnathon; Burke, Jason; Jovanovic, Igor

    2016-09-01

    An experimental technique using the surrogate reaction method has been developed to measure fission probabilities of actinides as a function of the angular momentum state of the fissioning nucleus near the fission barrier. In this work, the 240Pu(α, α′f) reaction was used as a surrogate for 239Pu(n, f). An array of 12 silicon telescopes positioned at 10-degree intervals from 40 to 140 degrees detects the outgoing reaction particle for identification and measurement of the excitation energy. The angular momentum state is determined by measuring the angular distribution of fission fragments. The expected distributions are predicted from the Wigner d function. An array of 50 photovoltaic (solar) cells detects fission fragments with 10-degree granularity. The solar cells are sensitive to fission fragments but have no response to light ions. Relative contributions from different angular momentum states are extracted from the measured distributions and compared across all α particle scattering angles to determine the fission probability at a specific angular momentum state. The first experiment using this technique was recently completed using 37 MeV α particles incident on 240Pu. First results will be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. Department of Homeland Security under Grant Award Nu.

  2. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is increasingly established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
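    As an illustration of the general workflow only (not the authors' pipeline, which uses adaptive lasso with double cross-validation and bootstrap intervals), the sketch below fits a lasso-penalized Cox model with biomarker-by-treatment interactions to synthetic data using the Python lifelines package and extracts individual survival probabilities at a fixed time point.

```python
# A minimal sketch (not the authors' pipeline): lasso-penalized Cox model with
# biomarker-by-treatment interactions on synthetic data, then individual
# survival probabilities at a fixed time point. The published framework adds
# adaptive lasso, double cross-validation and bootstrap confidence intervals.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, p = 300, 5
X = rng.normal(size=(n, p))                       # biomarker expressions
treat = rng.integers(0, 2, size=n)                # randomized arm (0/1)
lin = 0.5 * treat + 0.8 * X[:, 0] * treat         # marker 0 modifies the treatment effect
time = rng.exponential(np.exp(-lin))              # synthetic survival times
event = rng.random(n) < 0.7                       # crude censoring indicator

df = pd.DataFrame(X, columns=[f"bm{j}" for j in range(p)])
df["treat"] = treat
for j in range(p):                                # biomarker-by-treatment interactions
    df[f"bm{j}_x_treat"] = df[f"bm{j}"] * treat
df["time"], df["event"] = time, event

cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)    # l1_ratio=1.0 -> lasso-type penalty
cph.fit(df, duration_col="time", event_col="event")

X_new = df.drop(columns=["time", "event"])
surv_at_1 = cph.predict_survival_function(X_new, times=[1.0]).T  # S_i(t=1) per patient
print(surv_at_1.head())
```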

  3. MEASURING THE INFLUENCE OF TASK COMPLEXITY ON HUMAN ERROR PROBABILITY: AN EMPIRICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    LUCA PODOFILLINI

    2013-04-01

    A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing the human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent (objectively and quantitatively) task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors is available – from the International HRA Empirical Study. The empirical evaluation has shown promising results. The TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished if related to different difficulty categories (e.g., “easy” vs. “somewhat difficult”), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to enhance the empirical basis of HRA methods, to make sure that 1) the definitions of the PSFs cover the influences important for HRA (i.e., influencing the error probability), and 2) the quantitative relationships among PSFs and error

  4. Estimating the ground-state probability of a quantum simulation with product-state measurements

    Directory of Open Access Journals (Sweden)

    Bryce eYoshimura

    2015-10-01

    One of the goals in quantum simulation is to adiabatically generate the ground state of a complicated Hamiltonian by starting with the ground state of a simple Hamiltonian and slowly evolving the system to the complicated one. If the evolution is adiabatic and the initial and final ground states are connected due to having the same symmetry, then the simulation will be successful. But in most experiments, adiabatic simulation is not possible because it would take too long, and the system has some level of diabatic excitation. In this work, we quantify the extent of the diabatic excitation even if we do not know a priori what the complicated ground state is. Since many quantum simulator platforms, like trapped ions, can measure the probabilities to be in a product state, we describe techniques that can employ these simple measurements to estimate the probability of being in the ground state of the system after the diabatic evolution. These techniques do not require one to know any properties about the Hamiltonian itself, nor to calculate its eigenstate properties. All the information is derived by analyzing the product-state measurements as functions of time.

  5. Object Tracking Using Local Multiple Features and a Posterior Probability Measure

    Directory of Open Access Journals (Sweden)

    Wenhua Guo

    2017-03-01

    Object tracking has remained a challenging problem in recent years. Most trackers cannot work well, especially when dealing with problems such as similarly colored backgrounds, object occlusions, low illumination, or sudden illumination changes in real scenes. A centroid iteration algorithm using multiple features and a posterior probability criterion is presented to solve these problems. The model representation of the object and the similarity measure are two key factors that greatly influence the performance of the tracker. Firstly, this paper proposes using a local texture feature which is a generalization of the local binary pattern (LBP) descriptor, which we call the double center-symmetric local binary pattern (DCS-LBP). This feature shows great discrimination between similar regions and high robustness to noise. By analyzing DCS-LBP patterns, a simplified DCS-LBP, called the SDCS-LBP, is used to improve the object texture model. The SDCS-LBP is able to describe the primitive structural information of the local image such as edges and corners. Then, the SDCS-LBP and the color are combined to generate the multiple features of the target model. Secondly, a posterior probability measure is introduced to reduce the rate of matching mistakes. Three strategies of target model update are employed. Experimental results show that our proposed algorithm is effective in improving tracking performance in complicated real scenarios compared with some state-of-the-art methods.
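    The DCS-LBP descriptor mentioned above generalizes the center-symmetric local binary pattern, which compares the four opposing neighbor pairs of a pixel's 8-neighborhood instead of comparing each neighbor with the center. A minimal sketch of that standard CS-LBP (not the paper's double or simplified variants) is given below.

```python
# A minimal sketch of the standard center-symmetric LBP (CS-LBP), the family
# of descriptors that the paper's DCS-LBP generalizes; the double and
# simplified variants described in the abstract are not reproduced here.
import numpy as np

def cs_lbp(image, threshold=0.01):
    """4-bit CS-LBP code per interior pixel: compare the four center-symmetric
    neighbor pairs of the 8-neighborhood instead of each neighbor vs. center."""
    img = image.astype(float)
    n0, n4 = img[:-2, 1:-1], img[2:, 1:-1]        # north / south
    n1, n5 = img[:-2, 2:],   img[2:, :-2]         # north-east / south-west
    n2, n6 = img[1:-1, 2:],  img[1:-1, :-2]       # east / west
    n3, n7 = img[2:, 2:],    img[:-2, :-2]        # south-east / north-west
    code = ((n0 - n4 > threshold).astype(int)
            | ((n1 - n5 > threshold).astype(int) << 1)
            | ((n2 - n6 > threshold).astype(int) << 2)
            | ((n3 - n7 > threshold).astype(int) << 3))
    return code                                   # values in 0..15

img = np.random.default_rng(2).random((8, 8))
print(cs_lbp(img))
```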

  6. Development of a scattering probability method for accurate vapor fraction measurements by neutron radiography

    CERN Document Server

    Joo, H

    1999-01-01

    Recent test results indicated drawbacks associated with the simple exponential attenuation method (SEAM) as currently applied to neutron radiography measurements to determine vapor fractions in a hydrogenous two-phase flow in a metallic conduit. The scattering component of the neutron beam intensity exiting the flow system is not adequately accounted for by SEAM, and this leads to inaccurate results. To properly account for the scattering effect, a neutron scattering probability method (SPM) is developed. The method applies a neutron-hydrogen scattering kernel to scattered thermal neutrons that leave the incident beam in narrow conduits but eventually show up elsewhere in the measurements. The SPM has been tested with known vapor (void) distributions within an acrylic disk and a water/vapor channel. The vapor (void) fractions deduced by SPM are in good agreement with the known exact values. Details of the scattering correction method and the test results are discussed.
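    For reference, the simple exponential attenuation method criticized above amounts to inverting a Beer-Lambert relation for the void fraction while ignoring scattered neutrons; the sketch below uses hypothetical numbers and does not reproduce the proposed scattering-probability correction.

```python
# A minimal sketch of the simple exponential attenuation method (SEAM) that the
# abstract critiques: I = I0 * exp(-sigma_l * t * (1 - alpha)) for a channel of
# thickness t with void fraction alpha. The numbers are hypothetical and the
# proposed scattering-probability correction is not reproduced here.
import numpy as np

def void_fraction_seam(I, I0, sigma_liquid, thickness):
    """Invert the Beer-Lambert attenuation for the void (vapor) fraction,
    ignoring scattered neutrons that re-enter the measurement."""
    return 1.0 + np.log(I / I0) / (sigma_liquid * thickness)

# hypothetical macroscopic cross-section 3.5 1/cm and a 1 cm water channel
print(void_fraction_seam(I=0.25, I0=1.0, sigma_liquid=3.5, thickness=1.0))
```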

  7. Aerobic Exercise Increases Hippocampal Volume in Older Women with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Science.gov (United States)

    ten Brinke, Lisanne F.; Bolandzadeh, Niousha; Nagamatsu, Lindsay S.; Hsu, Chun Liang; Davis, Jennifer C.; Miran-Khan, Karim; Liu-Ambrose, Teresa

    2015-01-01

    Background Mild cognitive impairment (MCI) is a well-recognized risk factor for dementia and represents a vital opportunity for intervening. Exercise is a promising strategy for combating cognitive decline, by improving both brain structure and function. Specifically, aerobic training (AT) improved spatial memory and hippocampal volume in healthy community-dwelling older adults. In older women with probable MCI, we previously demonstrated that both resistance training (RT) and AT improved memory. In this secondary analysis, we investigated: 1) the effect of both RT and AT on hippocampal volume; and 2) the association between change in hippocampal volume and change in memory. Methods Eighty-six females aged 70 to 80 years with probable MCI were randomly assigned to a six-month, twice-weekly program of: 1) AT, 2) RT, or 3) Balance and Tone Training (BAT; i.e., control). At baseline and trial completion, participants performed a 3T magnetic resonance imaging scan to determine hippocampal volume. Verbal memory and learning was assessed by Rey’s Auditory Verbal Learning Test. Results Compared with the BAT group, AT significantly improved left, right, and total hippocampal volumes (p≤0.03). After accounting for baseline cognitive function and experimental group, increased left hippocampal volume was independently associated with reduced verbal memory and learning performance as indexed by loss after interference (r=0.42, p=0.03). Conclusion Aerobic training significantly increased hippocampal volume in older women with probable MCI. More research is needed to ascertain the relevance of exercise-induced changes in hippocampal volume on memory performance in older adults with MCI. PMID:24711660

  8. Riemann integral of a random function and the parabolic equation with a general stochastic measure

    OpenAIRE

    Radchenko, Vadym

    2012-01-01

    For stochastic parabolic equation driven by a general stochastic measure, the weak solution is obtained. The integral of a random function in the equation is considered as a limit in probability of Riemann integral sums. Basic properties of such integrals are studied in the paper.

  9. Measurement of spark probability of GEM detector for CBM muon chamber (MUCH)

    CERN Document Server

    Biswas, S.; Frankenfeld, U.; Garabatos, C.; Hehner, J.; Kleipa, V.; Morhardt, T.; Schmidt, C.J.; Schmidt, H.R.; Wiechula, J.

    2015-11-11

    The stability of triple GEM detector setups in an environment of high energetic showers is studied. To this end the spark probability in a shower environment is compared to the spark probability in a pion beam.

  10. Retrieval of monthly rainfall indices from microwave radiometric measurements using probability distribution functions

    Science.gov (United States)

    Wilheit, Thomas T.; Chang, Alfred T. C.; Chiu, Long S.

    1991-01-01

    An algorithm for the estimation of monthly rain totals for 5 deg cells over the ocean from histograms of SSM/I brightness temperatures has been developed. There are three novel features to this algorithm. First, it uses knowledge of the form of the rainfall intensity probability density function to augment the measurements. Second, a linear combination of the 19.35 and 22.235 GHz channels has been employed to reduce the impact of variability of water vapor. Third, an objective technique has been developed to estimate the rain layer thickness from the 19.35- and 22.235-GHz brightness temperature histograms. Comparison with climatologies and the GATE radar observations suggest that the estimates are reasonable in spite of not having a beam-filling correction. By-products of the retrievals indicate that the SSM/I instrument noise level and calibration stability are quite good.

  11. Modeling Latin-American stock markets volatility: Varying probabilities and mean reversion in a random level shift model

    Directory of Open Access Journals (Sweden)

    Gabriel Rodríguez

    2016-06-01

    Following Xu and Perron (2014), I applied the extended RLS model to the daily stock market returns of Argentina, Brazil, Chile, Mexico and Peru. This model replaces the constant probability of level shifts for the entire sample with varying probabilities that record periods with extremely negative returns. Furthermore, it incorporates a mean reversion mechanism with which the magnitude and the sign of the level shift component vary in accordance with past level shifts that deviate from the long-term mean. Therefore, four RLS models are estimated: the Basic RLS, the RLS with varying probabilities, the RLS with mean reversion, and a combined RLS model with mean reversion and varying probabilities. The results show that the estimated parameters are highly significant, especially those of the mean reversion model. An analysis of ARFIMA and GARCH models is also performed in the presence of level shifts, which shows that once these shifts are taken into account in the modeling, the long memory characteristics and GARCH effects disappear. I also find that the predictive performance of the RLS models is superior to that of the classic long-memory models such as the ARFIMA(p,d,q), GARCH and FIGARCH models. The evidence indicates that, with rare exceptions, the RLS models (in all their variants) show the best performance or belong to the 10% of the Model Confidence Set (MCS); on the rare occasions when the GARCH and ARFIMA models appear to dominate, they are exceptions. When volatility is measured by the squared returns, the great exception is Argentina, where a dominance of GARCH and FIGARCH models is observed.
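    For the comparison models mentioned above, a GARCH(1,1) fit is straightforward to reproduce. The sketch below uses the Python `arch` package on placeholder returns; it is only an illustration of the baseline, not the RLS estimation of the paper.

```python
# Illustrative baseline only (not the paper's RLS estimation): fit a GARCH(1,1)
# with Student-t errors to placeholder daily returns using the `arch` package,
# the kind of model the RLS specifications are compared against.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=2000)         # placeholder daily returns (%)

res = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="t").fit(disp="off")
print(res.params)                                 # mu, omega, alpha[1], beta[1], nu
print("AIC:", res.aic, "BIC:", res.bic)           # criteria used for model comparison
```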

  12. Maximal-entropy random walk unifies centrality measures

    Science.gov (United States)

    Ochab, J. K.

    2012-12-01

    This paper compares a number of centrality measures and several (dis-)similarity matrices with which they can be defined. These matrices, which are used among others in community detection methods, represent quantities connected to enumeration of paths on a graph and to random walks. Relationships between some of these matrices are derived in the paper. These relationships are inherited by the centrality measures. They include measures based on the principal eigenvector of the adjacency matrix, path enumeration, as well as on the stationary state, stochastic matrix, or mean first-passage times of a random walk. As the random walk defining the centrality measure can be arbitrarily chosen, we pay particular attention to the maximal-entropy random walk, which serves as a very distinct alternative to the ordinary (diffusive) random walk used in network analysis. The various importance measures, defined both with the use of ordinary random walk and the maximal-entropy random walk, are compared numerically on a set of benchmark graphs with varying mixing parameter and are grouped with the use of the agglomerative clustering technique. It is shown that centrality measures defined with the two different random walks cluster into two separate groups. In particular, the group of centrality measures defined by the maximal-entropy random walk does not cluster with any other measures on change of graphs’ parameters, and members of this group produce mutually closer results than members of the group defined by the ordinary random walk.
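    The maximal-entropy random walk referred to above has an explicit construction: its transition probabilities are built from the principal eigenvector of the adjacency matrix, and its stationary distribution is proportional to the squared eigenvector entries. The following is a minimal sketch of that construction on a toy graph, not the paper's benchmark-graph comparison.

```python
# A minimal sketch of the maximal-entropy random walk (MERW) construction on a
# toy graph: transition probabilities P_ij = A_ij * psi_j / (lambda * psi_i)
# built from the principal eigenvector psi of the adjacency matrix, with
# stationary distribution pi_i proportional to psi_i**2.
import numpy as np

def merw(adjacency):
    A = np.asarray(adjacency, dtype=float)
    eigvals, eigvecs = np.linalg.eigh(A)          # symmetric adjacency matrix
    lam = eigvals[-1]                             # largest eigenvalue
    psi = np.abs(eigvecs[:, -1])                  # principal (Perron) eigenvector
    P = A * psi[np.newaxis, :] / (lam * psi[:, np.newaxis])
    pi = psi**2 / np.sum(psi**2)                  # stationary distribution
    return P, pi

A = np.array([[0, 1, 0, 0],                       # 4-node path graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
P, pi = merw(A)
print(P.sum(axis=1))                              # each row sums to 1
print(pi)                                         # MERW centrality-like weights
```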

  13. Maximal-entropy random walk unifies centrality measures.

    Science.gov (United States)

    Ochab, J K

    2012-12-01

    This paper compares a number of centrality measures and several (dis-)similarity matrices with which they can be defined. These matrices, which are used among others in community detection methods, represent quantities connected to enumeration of paths on a graph and to random walks. Relationships between some of these matrices are derived in the paper. These relationships are inherited by the centrality measures. They include measures based on the principal eigenvector of the adjacency matrix, path enumeration, as well as on the stationary state, stochastic matrix, or mean first-passage times of a random walk. As the random walk defining the centrality measure can be arbitrarily chosen, we pay particular attention to the maximal-entropy random walk, which serves as a very distinct alternative to the ordinary (diffusive) random walk used in network analysis. The various importance measures, defined both with the use of ordinary random walk and the maximal-entropy random walk, are compared numerically on a set of benchmark graphs with varying mixing parameter and are grouped with the use of the agglomerative clustering technique. It is shown that centrality measures defined with the two different random walks cluster into two separate groups. In particular, the group of centrality measures defined by the maximal-entropy random walk does not cluster with any other measures on change of graphs' parameters, and members of this group produce mutually closer results than members of the group defined by the ordinary random walk.

  14. Measuring real-time streamflow using emerging technologies: Radar, hydroacoustics, and the probability concept

    Science.gov (United States)

    Fulton, J.; Ostrowski, J.

    2008-01-01

    Forecasting streamflow during extreme hydrologic events such as floods can be problematic. This is particularly true when flow is unsteady, and river forecasts rely on models that require uniform-flow rating curves to route water from one forecast point to another. As a result, alternative methods for measuring streamflow are needed to properly route flood waves and account for inertial and pressure forces in natural channels dominated by nonuniform-flow conditions such as mild water surface slopes, backwater, tributary inflows, and reservoir operations. The objective of the demonstration was to use emerging technologies to measure instantaneous streamflow in open channels at two existing US Geological Survey streamflow-gaging stations in Pennsylvania. Surface-water and instream-point velocities were measured using hand-held radar and hydroacoustics. Streamflow was computed using the probability concept, which requires velocity data from a single vertical containing the maximum instream velocity. The percent difference in streamflow at the Susquehanna River at Bloomsburg, PA ranged from 0% to 8% with an average difference of 4% and standard deviation of 8.81 m3/s. The percent difference in streamflow at Chartiers Creek at Carnegie, PA ranged from 0% to 11% with an average difference of 5% and standard deviation of 0.28 m3/s. New generation equipment is being tested and developed to advance the use of radar-derived surface-water velocity and instantaneous streamflow to facilitate the collection and transmission of real-time streamflow that can be used to parameterize hydraulic routing models.

  15. Random walks on three-strand braids and on related hyperbolic groups

    CERN Document Server

    Nechaev, S

    2003-01-01

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B_3, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B_3 can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  16. Random walks on three-strand braids and on related hyperbolic groups

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, Sergei [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France); Voituriez, Raphael [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France)

    2003-01-10

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B_3, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B_3 can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  17. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  18. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
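    The classical compound Poisson (Cramér-Lundberg) model treated in the book lends itself to a direct Monte Carlo estimate of the finite-horizon ruin probability. The sketch below is a minimal illustration with assumed parameters (Poisson claim arrivals, exponential claim sizes); it is not taken from the book.

```python
# A minimal Monte Carlo sketch of the finite-horizon ruin probability in the
# classical compound Poisson (Cramer-Lundberg) model; all parameter values are
# illustrative assumptions.
import numpy as np

def ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                     horizon=100.0, n_paths=10_000, seed=0):
    """u: initial reserve, c: premium rate, lam: Poisson claim arrival rate,
    mean_claim: mean of exponential claim sizes."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)       # time of the next claim
            if t > horizon:
                break
            claims += rng.exponential(mean_claim)
            if u + c * t - claims < 0:            # reserve just after the claim
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability())                         # estimated ruin probability
```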

  19. Mutual point-winning probabilities (MPW): a new performance measure for table tennis.

    Science.gov (United States)

    Ley, Christophe; Dominicy, Yves; Bruneel, Wim

    2017-11-13

    We propose a new performance measure for table tennis players: the mutual point-winning probabilities (MPW) as server and receiver. The MPWs quantify a player's chances to win a point against a given opponent, and hence nicely complement the classical match statistics history between two players. These new quantities are based on a Bradley-Terry-type statistical model taking into account the importance of individual points, since a rally at 8-2 in the first set is less crucial than a rally at the score of 9-9 in the final set. The MPWs hence reveal a player's strength on his/her service against a given opponent as well as his/her capacity of scoring crucial points. We estimate the MPWs by means of maximum likelihood estimation and show via a Monte Carlo simulation study that our estimation procedure works well. In order to illustrate the MPWs' versatile use, we have organized two round-robin tournaments of ten and eleven table tennis players, respectively, from the Belgian table tennis federation. We compare the classical final ranking to the ranking based on MPWs, and we highlight how the MPWs shed new light on strengths and weaknesses of the players.
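    Once a player's MPWs as server and receiver against a given opponent are estimated, they can be propagated to game-level predictions by simulation. The sketch below assumes independent points and standard 11-point scoring with service alternating every two points; the point-importance weighting of the actual model is not reproduced.

```python
# A minimal sketch: turn a player's mutual point-winning probabilities (as
# server and as receiver against a given opponent) into an 11-point game-win
# probability by simulation. Points are treated as independent and player A is
# assumed to serve first.
import numpy as np

def game_win_prob(p_serve, p_receive, n_games=20_000, seed=0):
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(n_games):
        a = b = 0
        a_serves = True                           # player A serves first
        while True:
            p = p_serve if a_serves else p_receive
            if rng.random() < p:
                a += 1
            else:
                b += 1
            if (a >= 11 or b >= 11) and abs(a - b) >= 2:
                wins += a > b
                break
            # service alternates every two points, and every point from 10-10 on
            if (a >= 10 and b >= 10) or (a + b) % 2 == 0:
                a_serves = not a_serves
    return wins / n_games

print(game_win_prob(p_serve=0.55, p_receive=0.48))
```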

  20. Simultaneous pixel detection probabilities and spatial resolution estimation of pixelized detectors by means of correlation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Grabski, V. [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, A.P. 20-364, 01000 Mexico, DF (Mexico)], E-mail: varlen.grabski@cern.ch

    2008-02-21

    On the basis of the determination of statistical correlations between neighboring detector pixels, a novel method of estimating the simultaneous detection probability of pixels and the spatial resolution of pixelized detectors is proposed. The correlations are determined using noise variance measurement for isolated pixels and for the difference between neighboring pixels. The method is validated using images from two image-acquisition devices, a General Electric Senographe 2000D and a SD mammographic unit. The pixelized detector is irradiated with X-rays over its entire surface. It is shown that the simultaneous pixel detection probabilities can be estimated with an accuracy of 0.001-0.003, with an estimated systematic error of less than 0.005. The two-dimensional pre-sampled point-spread function (PSF^0) is determined using a single Gaussian approximation and a sum of two Gaussian approximations. The results obtained for the pre-sampled PSF^0 show that the single Gaussian approximation is not appropriate, and the sum of two Gaussian approximations providing the best fit predicts the existence of a large (≈50%) narrow component. Support for this observation can be found in the recent simulation study of columnar indirect digital detectors by Badano et al. The sampled two-dimensional PSF is determined using Monte Carlo simulation for the L-shaped, uniformly distributed acceptance function for different fill-factor values. The calculation of the pre-sampled modulation transfer function based on the estimated PSF^0 shows that the observed data can be reproduced only by the single Gaussian approximation, and that when the sum of two Gaussians is used, significantly larger values are apparent in the higher-frequency region for images from both detection devices. The proposed method does not require a precisely constructed tool. It is insensitive to beam collimation and to system physical size and may be indispensable in cases where thin

  1. Simultaneous pixel detection probabilities and spatial resolution estimation of pixelized detectors by means of correlation measurements

    Science.gov (United States)

    Grabski, V.

    2008-02-01

    On the basis of the determination of statistical correlations between neighboring detector pixels, a novel method of estimating the simultaneous detection probability of pixels and the spatial resolution of pixelized detectors is proposed. The correlations are determined using noise variance measurement for isolated pixels and for the difference between neighboring pixels. The method is validated using images from two image-acquisition devices, a General Electric Senographe 2000D and a SD mammographic unit. The pixelized detector is irradiated with X-rays over its entire surface. It is shown that the simultaneous pixel detection probabilities can be estimated with an accuracy of 0.001-0.003, with an estimated systematic error of less than 0.005. The two-dimensional pre-sampled point-spread function (PSF^0) is determined using a single Gaussian approximation and a sum of two Gaussian approximations. The results obtained for the pre-sampled PSF^0 show that the single Gaussian approximation is not appropriate, and the sum of two Gaussian approximations providing the best fit predicts the existence of a large (≈50%) narrow component. Support for this observation can be found in the recent simulation study of columnar indirect digital detectors by Badano et al. The sampled two-dimensional PSF is determined using Monte Carlo simulation for the L-shaped, uniformly distributed acceptance function for different fill-factor values. The calculation of the pre-sampled modulation transfer function based on the estimated PSF^0 shows that the observed data can be reproduced only by the single Gaussian approximation, and that when the sum of two Gaussians is used, significantly larger values are apparent in the higher-frequency region for images from both detection devices. The proposed method does not require a precisely constructed tool. It is insensitive to beam collimation and to system physical size and may be indispensable in cases where thin absorption slits or edges are

  2. Physical Activity Improves Verbal and Spatial Memory in Older Adults with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Lindsay S. Nagamatsu

    2013-01-01

    We report secondary findings from a randomized controlled trial on the effects of exercise on memory in older adults with probable MCI. We randomized 86 women aged 70–80 years with subjective memory complaints into one of three groups: resistance training, aerobic training, or balance and tone (control). All participants exercised twice per week for six months. We measured verbal memory and learning using the Rey Auditory Verbal Learning Test (RAVLT) and spatial memory using a computerized test, before and after trial completion. We found that the aerobic training group remembered significantly more items in the loss after interference condition of the RAVLT compared with the control group after six months of training. In addition, both experimental groups showed improved spatial memory performance in the most difficult condition where they were required to memorize the spatial location of three items, compared with the control group. Lastly, we found a significant correlation between spatial memory performance and overall physical capacity after intervention in the aerobic training group. Taken together, our results provide support for the prevailing notion that exercise can positively impact cognitive functioning and may represent an effective strategy to improve memory in those who have begun to experience cognitive decline.

  3. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables are covered.

  4. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  5. Experimental measurement-device-independent quantum random-number generation

    Science.gov (United States)

    Nie, You-Qi; Guan, Jian-Yu; Zhou, Hongyi; Zhang, Qiang; Ma, Xiongfeng; Zhang, Jun; Pan, Jian-Wei

    2016-12-01

    The randomness from a quantum random-number generator (QRNG) relies on the accurate characterization of its devices. However, device imperfections and inaccurate characterizations can result in wrong entropy estimation and bias in practice, which highly affects the genuine randomness generation and may even induce the disappearance of quantum randomness in an extreme case. Here we experimentally demonstrate a measurement-device-independent (MDI) QRNG based on time-bin encoding to achieve certified quantum randomness even when the measurement devices are uncharacterized and untrusted. The MDI-QRNG is randomly switched between the regular randomness generation mode and a test mode, in which four quantum states are randomly prepared to perform measurement tomography in real time. With a clock rate of 25 MHz, the MDI-QRNG generates a final random bit rate of 5.7 kbps. Such implementation with an all-fiber setup provides an approach to construct a fully integrated MDI-QRNG with trusted but error-prone devices in practice.

  6. Profiting from Probability; Combining Low and High Probability Isotopes as a Tool Extending the Dynamic Range of an Assay Measuring Amphetamine and Methamphetamine in Urine.

    Science.gov (United States)

    Miller, Anna M; Goggin, Melissa M; Nguyen, An; Gozum, Stephanie D; Janis, Gregory C

    2017-06-01

    A wide range of concentrations are frequently observed when measuring drugs of abuse in urine toxicology samples; this is especially true for amphetamine and methamphetamine. Routine liquid chromatography-tandem mass spectrometry confirmatory methods commonly anchored at a 50 ng/mL lower limit of quantitation can span approximately a 100-fold concentration range before regions of non-linearity are reached, deteriorating quantitative accuracy and qualitative assessments. In our experience, approximately a quarter of amphetamine and methamphetamine positive samples are above a 5,000 ng/mL upper limit of quantitation and thus require reanalysis with dilution for accurate quantitative and acceptable qualitative results. We present here the development of an analytical method capable of accurately quantifying samples with concentrations spanning several orders of magnitude without the need for sample dilution and reanalysis. For each analyte the major isotopes were monitored for analysis through the lower concentration ranges (50-5,000 ng/mL), and the naturally occurring, low-probability 13C2 isotopes were monitored for the analysis of the high concentration samples (5,000-100,000 ng/mL amphetamine and 5,000-200,000 ng/mL methamphetamine). The method simultaneously monitors transitions for the molecules containing only 12C and 13C2 isotopologues, eliminating the need for re-extraction and reanalysis of high concentration samples.
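    The usefulness of the 13C2 isotopologue rests on its low natural probability. As a rough illustration (assuming a 13C natural abundance of about 1.07% and the nine carbon atoms of amphetamine, C9H13N, both of which are assumptions outside the abstract), the fraction of molecules carrying exactly two 13C atoms follows a binomial calculation:

```python
# A rough illustration of why the 13C2 isotopologue is rare: binomial
# probability of exactly two 13C atoms among the nine carbons of amphetamine
# (C9H13N), assuming a natural 13C abundance of about 1.07%.
from math import comb

def isotopologue_prob(n_carbons, k_heavy, p13c=0.0107):
    """Binomial probability of exactly k_heavy 13C atoms among n_carbons."""
    return comb(n_carbons, k_heavy) * p13c**k_heavy * (1 - p13c)**(n_carbons - k_heavy)

print(isotopologue_prob(9, 0))                    # all-12C isotopologue, ~0.91
print(isotopologue_prob(9, 2))                    # 13C2 isotopologue, ~0.004
```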

  7. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  8. THE PARTIAL SUMS OF THE LEAST SQUARES RESIDUALS OF SPATIAL OBSERVATIONS SAMPLED ACCORDING TO A PROBABILITY MEASURE

    Directory of Open Access Journals (Sweden)

    Wayan Somayasa

    2013-05-01

    A functional central limit theorem for a sequence of partial sums processes of the least squares residuals of a spatial linear regression model in which the observations are sampled according to a probability measure is established. Under mild assumptions on the model, the limit of the sequence of the least squares residual partial sums processes is explicitly derived. It is shown that the limit process, which is a function of the Brownian sheet, depends on the regression functions and the probability measure under which the design is constructed. Several examples of the limit processes when the model is true are presented. Lower and upper bounds for boundary crossing probabilities of signal plus noise models when the noises come from the residual partial sums processes are also investigated.

  9. Measurement of vacancy transfer probability from K to L shell using ...

    Indian Academy of Sciences (India)

    The vacancy transfer probabilities from K to L shell through radiative decay, ηKL, have been deduced for the elements in the range 19 ≤ Z ≤ 58 using K-shell fluorescence yields. The targets were irradiated with γ photons at 59.5 keV from a 75 mCi 241Am annular source. The K X-rays from different targets were detected ...

  10. Measurement of vacancy transfer probability from K to L shell using ...

    Indian Academy of Sciences (India)

    The vacancy transfer probabilities from K to L shell through radiative decay, ηKL, have been deduced for the elements in the range 19 ≤ Z ≤ 58 using K-shell fluorescence yields. The targets were irradiated with γ photons at 59.5 keV from a 75 mCi 241Am annular source. The K X-rays from different targets were ...

  11. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  12. Interval estimation of the risk difference in non-compliance randomized trials with repeated binary measurements.

    Science.gov (United States)

    Lui, Kung-Jong

    2007-07-20

    In a randomized clinical trial (RCT), we often encounter non-compliance with the treatment protocol for a subset of patients. The intention-to-treat (ITT) analysis is probably the most commonly used method in an RCT with non-compliance. However, the ITT analysis estimates 'the programmatic effectiveness' rather than 'the biological efficacy'. In this paper, we focus attention on the latter index and consider use of the risk difference (RD) to measure the effect of a treatment. Based on a simple additive risk model proposed elsewhere, we develop four asymptotic interval estimators of the RD for repeated binary measurements in an RCT with non-compliance. We apply Monte Carlo simulation to evaluate and compare the finite-sample performance of these interval estimators in a variety of situations. We find that all interval estimators considered here can perform well with respect to the coverage probability. We further find that the interval estimator using a tanh⁻¹(x) transformation is probably more precise than the others, while the interval estimator derived from a randomization-based approach may cause a slight loss of precision. When the number of patients per treatment is large and the probability of compliance to an assigned treatment is high, we find that all interval estimators discussed here are essentially equivalent. Finally, we illustrate use of these interval estimators with data simulated from a trial of using macrophage colony-stimulating factor to reduce febrile neutropenia incidence in acute myeloid leukaemia patients.
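    The tanh⁻¹ transformation mentioned above keeps a Wald-type interval for the risk difference inside (-1, 1). The sketch below shows the generic idea for two independent proportions via the delta method; it is only an illustration of the transformation, not the paper's estimator for repeated binary measurements under non-compliance.

```python
# A generic sketch of the tanh^-1 (Fisher z) idea for a risk-difference
# confidence interval between two independent proportions, via the delta
# method; not the paper's estimator.
import numpy as np
from scipy.stats import norm

def rd_ci_atanh(x1, n1, x0, n0, level=0.95):
    p1, p0 = x1 / n1, x0 / n0
    rd = p1 - p0
    var_rd = p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0
    se_z = np.sqrt(var_rd) / (1 - rd**2)          # delta method on atanh(RD)
    z = norm.ppf(0.5 + level / 2)
    lo = np.tanh(np.arctanh(rd) - z * se_z)       # back-transform keeps the
    hi = np.tanh(np.arctanh(rd) + z * se_z)       # interval inside (-1, 1)
    return rd, (lo, hi)

print(rd_ci_atanh(x1=45, n1=100, x0=30, n0=100))
```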

  13. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    Science.gov (United States)

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  14. Measurement of spin flip probabilities for ultracold neutrons on guide materials

    Science.gov (United States)

    Tang, Zhaowen; Clayton, Steven; Currie, Scott; Ito, Takeyasu; Makela, Mark; Morris, Christopher; Pattie, Robert; Ramsey, John; Saunders, Alexander; Wei, Wanchun; Adamek, Evan; Callahan, Nathan; Salvat, Daniel; Brandt, Aaron; Young, Albert; Lanl EDM Collaboration

    2015-10-01

    Ultracold neutrons (UCNs) are defined as neutrons with kinetic energy sufficiently low so that they can be confined in a material bottle. UCN sources are used in many facilities worldwide to pursue some of the most profound questions in fundamental physics. UCN guides, which transport UCNs from the source to experiments, play a crucial role in achieving high UCN density in an experimental apparatus. In some cases, UCN guides are also required to transport spin-polarized UCNs, and therefore the probability of spin flip upon UCN interaction is an important property characterizing UCN guide materials. We have studied the depolarization property of a new nickel-based UCN guide coating material. In this talk, the purpose, method, and results of the experiment will be presented and the implication of the results on the depolarization mechanism will be discussed. LANL LDRD Grant #20140015DR.

  15. Maximal-entropy random walk unifies centrality measures

    OpenAIRE

    Ochab, J. K.

    2012-01-01

    In this paper analogies between different (dis)similarity matrices are derived. These matrices, which are connected to path enumeration and random walks, are used in community detection methods or in computation of centrality measures for complex networks. The focus is on a number of known centrality measures, which inherit the connections established for similarity matrices. These measures are based on the principal eigenvector of the adjacency matrix, path enumeration, as well as on the sta...

  16. Concentration inequalities for functions of Gibbs fields with application to diffraction and random Gibbs measures

    CERN Document Server

    Külske, C

    2003-01-01

    We derive useful general concentration inequalities for functions of Gibbs fields in the uniqueness regime. We also consider expectations of random Gibbs measures that depend on an additional disorder field, and prove concentration w.r.t. the disorder field. Both fields are assumed to be in the uniqueness regime, allowing in particular for a non-independent disorder field. The modification of the bounds compared to the case of an independent field can be expressed in terms of constants that resemble the Dobrushin contraction coefficient, and are explicitly computable. On the basis of these inequalities, we obtain bounds on the deviation of a diffraction pattern created by random scatterers located on a general discrete point set in the Euclidean space, restricted to a finite volume. Here we also allow for thermal dislocations of the scatterers around their equilibrium positions. Extending recent results for independent scatterers, we give a universal upper bound on the probability of a deviation of the random sc...

  17. Distance expanding random mappings, thermodynamical formalism, Gibbs measures and fractal geometry

    CERN Document Server

    Mayer, Volker; Skorulski, Bartlomiej

    2011-01-01

    The theory of random dynamical systems originated from stochastic differential equations. It is intended to provide a framework and techniques to describe and analyze the evolution of dynamical systems when the input and output data are known only approximately, according to some probability distribution. The development of this field, in both the theory and applications, has gone in many directions. In this manuscript we introduce measurable expanding random dynamical systems, develop the thermodynamical formalism and establish, in particular, the exponential decay of correlations and analyticity of the expected pressure although the spectral gap property does not hold. This theory is then used to investigate fractal properties of conformal random systems. We prove a Bowen’s formula and develop the multifractal formalism of the Gibbs states. Depending on the behavior of the Birkhoff sums of the pressure function we arrive at a natural classification of the systems into two classes: quasi-deterministic syst...

  18. Transfer probability measurements in the superfluid 116Sn+60Ni system

    Directory of Open Access Journals (Sweden)

    Montanari D.

    2014-03-01

    We measured excitation functions for the main transfer channels in the 116Sn+60Ni reaction from above to well below the Coulomb barrier. The experiment has been performed in inverse kinematics, detecting the lighter (target-like) ions with the magnetic spectrometer PRISMA at very forward angles. The comparison between the data and microscopic calculations for the present case and for the previously measured 96Zr+40Ca system, namely superfluid and near closed shells nuclei, should significantly improve our understanding of nucleon-nucleon correlation properties in multinucleon transfer processes.

  19. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    Science.gov (United States)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance $r_n$ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
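    The quantity studied above can also be checked numerically. The following minimal sketch estimates the mean distance to the nth neighbour for points uniform in the unit square (D = 2), with the reference point placed at the centre to sidestep edge effects; it is an illustration, not the paper's analytical treatment.

```python
# A crude Monte Carlo check of the mean distance <r_n> from a reference point
# to its n-th nearest neighbour, for points uniform in the unit square (D = 2).
import numpy as np

def mean_nth_neighbour_distance(n, n_points=1000, n_trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    ref = np.array([0.5, 0.5])
    dists = np.empty(n_trials)
    for t in range(n_trials):
        pts = rng.random((n_points, 2))
        r = np.sort(np.linalg.norm(pts - ref, axis=1))
        dists[t] = r[n - 1]                       # distance to the n-th neighbour
    return dists.mean()

for n in (1, 2, 5):
    print(n, mean_nth_neighbour_distance(n))
```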

  20. A mechanical model for predicting the probability of osteoporotic hip fractures based in DXA measurements and finite element simulation

    Directory of Open Access Journals (Sweden)

    López Enrique

    2012-11-01

    Abstract Background Osteoporotic hip fractures represent a major cause of disability, loss of quality of life and even mortality among the elderly population. Decisions on drug therapy are based on the assessment of risk factors for fracture, from BMD measurements. The combination of biomechanical models with clinical studies could better estimate bone strength and support the specialists in their decision. Methods A model to assess the probability of fracture, based on Damage and Fracture Mechanics, has been developed, evaluating the mechanical magnitudes involved in the fracture process from clinical BMD measurements. The model is intended to simulate the degenerative process in the skeleton, with the consequent loss of bone mass and hence the decrease of its mechanical resistance, which makes fracture possible under different traumatisms. Clinical studies were chosen, both in non-treatment conditions and receiving drug therapy, and fitted to specific patients according to their actual BMD measures. The predictive model is applied in a FE simulation of the proximal femur. The fracture zone is determined according to the loading scenario (sideways fall, impact, accidental loads, etc.), using the mechanical properties of bone obtained from the evolutionary model corresponding to the considered time. Results BMD evolution in untreated patients and in those under different treatments was analyzed. Evolutionary curves of fracture probability were obtained from the evolution of mechanical damage. The evolutionary curve of the untreated group of patients presented a marked increase of the fracture probability, while the curves of patients under drug treatment showed variable decreased risks, depending on the therapy type. Conclusion The FE model allowed detailed maps of damage and fracture probability to be obtained, identifying high-risk local zones at the femoral neck and the intertrochanteric and subtrochanteric areas, which are the typical locations of osteoporotic hip fractures.

  1. A Secure LFSR Based Random Measurement Matrix for Compressive Sensing

    Science.gov (United States)

    George, Sudhish N.; Pattathil, Deepthi P.

    2014-11-01

    In this paper, a novel approach for generating the secure measurement matrix for compressive sensing (CS) based on a linear feedback shift register (LFSR) is presented. The basic idea is to select the different states of the LFSR as the random entries of the measurement matrix and normalize these values to get independent and identically distributed (i.i.d.) random variables with zero mean and variance 1/N, where N is the number of input samples. The initial seed for the LFSR system acts as the key to the user to provide security. Since the measurement matrix is generated from the LFSR system, the memory overhead of storing the measurement matrix is avoided in the proposed system. Moreover, the proposed system can provide security while maintaining the robustness to noise of the CS system. The proposed system is validated through different block-based CS techniques for images. To enhance security, the different blocks of images are measured with different measurement matrices so that the proposed encryption system can withstand a known-plaintext attack. A modulo division circuit is used to reseed the LFSR system to generate multiple random measurement matrices, whereby after each fundamental period of the LFSR, the feedback polynomial of the modulo circuit is modified in terms of a chaotic value. The proposed secure robust CS paradigm for images is subjected to several forms of attack and is shown to be resistant against them. Experimental analysis shows that the proposed system provides better performance than its counterparts.
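    The basic construction described above can be sketched in a few lines: run a Fibonacci LFSR from a secret seed and normalize its successive states into zero-mean matrix entries with variance 1/N. The tap polynomial, the exact normalization, and the chaotic reseeding of the published scheme are assumptions in this illustration.

```python
# A minimal sketch of the basic idea: run a Fibonacci LFSR from a secret seed
# and map its successive states to zero-mean entries of an M x N measurement
# matrix scaled to variance 1/N. Taps, normalization and the chaotic reseeding
# of the published scheme are assumptions here.
import numpy as np

def lfsr_states(seed, taps=(16, 14, 13, 11), nbits=16, count=1024):
    """Fibonacci LFSR (taps for x^16 + x^14 + x^13 + x^11 + 1); yields states."""
    mask = (1 << nbits) - 1
    state = seed & mask
    for _ in range(count):
        bit = 0
        for t in taps:
            bit ^= (state >> (t - 1)) & 1         # XOR of the tapped bits
        state = ((state << 1) | bit) & mask
        yield state

def measurement_matrix(seed, M, N, nbits=16):
    states = np.fromiter(lfsr_states(seed, nbits=nbits, count=M * N), dtype=float)
    u = states / (2**nbits - 1)                   # map states into (0, 1]
    phi = u - u.mean()                            # zero mean
    phi /= np.sqrt(N) * phi.std()                 # empirical variance 1/N
    return phi.reshape(M, N)

Phi = measurement_matrix(seed=0xACE1, M=32, N=128)
print(Phi.shape, round(Phi.mean(), 6), round(Phi.var() * 128, 3))
```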

  2. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  3. Identifying sources of a conservative groundwater contaminant using backward probabilities conditioned on measured concentrations

    National Research Council Canada - National Science Library

    Roseanna M. Neupauer; Ranhao Lin

    2006-01-01

    ... a loss of information about the contaminant's past; therefore, a complete reconstruction of the source characteristics is not possible. The reconstruction is further complicated by errors in measured concentrations, and uncertainty and variability in the flow and transport parameters. In this paper, we focus on identifying the location or release time of an instantaneous point source of contamination. Because the source characteristics cannot be known exactly, we represent the source location and source release ...

  4. Quantifying Similarity and Distance Measures for Vector-Based Datasets: Histograms, Signals, and Probability Distribution Functions

    Science.gov (United States)

    2017-02-01

    In this note, a number of different measures implemented as functions in both MATLAB and Python are used to quantify similarity/distance between 2 vector-based datasets. The scripts are attached as appendixes, as is a description of their execution. Keywords: Python, MATLAB, similarity, distance, X-ray diffraction.
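    A few of the vector-based similarity/distance measures such a report typically compares can be reproduced with standard SciPy routines; the sketch below is illustrative only and is not the report's attached scripts.

```python
# A small illustration of vector-based similarity/distance measures applied to
# two normalized histograms, using standard SciPy routines.
import numpy as np
from scipy.spatial.distance import cosine, euclidean, jensenshannon

rng = np.random.default_rng(3)
h1 = rng.random(32); h1 /= h1.sum()               # two example histograms,
h2 = rng.random(32); h2 /= h2.sum()               # each normalized to sum to 1

print("Euclidean       :", euclidean(h1, h2))
print("Cosine distance :", cosine(h1, h2))
print("Jensen-Shannon  :", jensenshannon(h1, h2))
```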

  5. Observer bias in randomized clinical trials with measurement scale outcomes

    DEFF Research Database (Denmark)

    Hróbjartsson, Asbjørn; Thomsen, Ann Sofia Skou; Emanuelsson, Frida

    2013-01-01

    We conducted a systematic review of randomized clinical trials with both blinded and nonblinded assessment of the same measurement scale outcome. We searched PubMed, EMBASE, PsycINFO, CINAHL, Cochrane Central Register of Controlled Trials, HighWire Press and Google Scholar for relevant studies. Two

  6. Two connections between random systems and non-Gibbsian measures

    NARCIS (Netherlands)

    van Enter, A.C.D.; Kulske, C.

    2007-01-01

    In this contribution we discuss the role disordered (or random) systems have played in the study of non-Gibbsian measures. This role has two main aspects, the distinction between which has not always been fully clear: 1) From disordered systems: Disordered systems can be used as a tool; analogies

  7. Influence of sampling intake position on suspended solid measurements in sewers: two probability/time-series-based approaches.

    Science.gov (United States)

    Sandoval, Santiago; Bertrand-Krajewski, Jean-Luc

    2016-06-01

    Total suspended solid (TSS) measurements in urban drainage systems are required for several reasons. Aiming to assess uncertainties in the mean TSS concentration due to the influence of sampling intake vertical position and vertical concentration gradients in a sewer pipe, two methods are proposed: a simplified method based on a theoretical vertical concentration profile (SM) and a time series grouping method (TSM). SM is based on flow rate and water depth time series. TSM requires additional TSS time series as input data. All time series are from the Chassieu urban catchment in Lyon, France (time series from 2007 with 2-min time step, 89 rainfall events). The probability of measuring a TSS value lower than the mean TSS along the vertical cross section (TSS underestimation) is about 0.88 with SM and about 0.64 with TSM. TSM shows more realistic TSS underestimation values (about 39 %) than SM (about 269 %). Interquartile ranges (IQR) over the probability values indicate that SM is more uncertain (IQR = 0.08) than TSM (IQR = 0.02). Differences between the two methods are mainly due to simplifications in SM (absence of TSS measurements). SM assumes a significant asymmetry of the TSS concentration profile along the vertical axis in the cross section. This is compatible with the distribution of TSS measurements found in the TSM approach. The methods provide insights towards an indicator of the measurement performance and representativeness for a TSS sampling protocol.

  8. MEASURING MODEL FOR BAD LOANS IN BANKS. THE DEFAULT PROBABILITY MODEL.

    Directory of Open Access Journals (Sweden)

    SOCOL ADELA

    2010-12-01

    Full Text Available The banking sectors of the transition countries have progressed remarkably in the last 20 years. In fact, banking in most transition countries has largely shaken off the traumas of the transition era. At the start of the 21st century, banks in these countries look very much like banks elsewhere. That is, they are by no means problem free, but they are struggling with the same issues as banks in other emerging market countries under financial crisis conditions. The institutional environment differs considerably among the countries. The goal we set with this article is to examine, in terms of methodology, the most important assessment criteria of a measuring model for bad loans.

  9. Multi-channel Dual Clocks three-dimensional probability Random Multiple Access protocol for Wireless Public Bus Networks based on RTS/CTS mechanism

    Directory of Open Access Journals (Sweden)

    Zhou Sheng Jie

    2016-01-01

    Full Text Available A MAC protocol for public bus networks, called the Bus MAC protocol, is designed to provide high-quality Internet service for bus passengers. The paper proposes a multi-channel dual clocks three-dimensional probability random multiple access protocol based on the RTS/CTS mechanism, decreasing collisions caused by multiple access from multiple passengers. Using the RTS/CTS mechanism increases the reliability and stability of the system, reduces the collision probability of information packets to a certain extent, and improves channel utilization; using the multi-channel mechanism not only enables channel load balancing but also solves the hidden-terminal and exposed-terminal problems. Using the dual clocks mechanism reduces the system idle time. Finally, different selections of the three-dimensional probabilities can make the system throughput adapt to the network load, which realizes the maximum system throughput.

  10. Probability density of spatially distributed soil moisture inferred from crosshole georadar traveltime measurements

    Science.gov (United States)

    Linde, N.; Vrugt, J. A.

    2009-04-01

    Geophysical models are increasingly used in hydrological simulations and inversions, where they are typically treated as an artificial data source with known uncorrelated "data errors". The model appraisal problem in classical deterministic linear and non-linear inversion approaches based on linearization is often addressed by calculating model resolution and model covariance matrices. These measures offer only a limited potential to assign a more appropriate "data covariance matrix" for future hydrological applications, simply because the regularization operators used to construct a stable inverse solution bear a strong imprint on such estimates and because the non-linearity of the geophysical inverse problem is not explored. We present a parallelized Markov Chain Monte Carlo (MCMC) scheme to efficiently derive the posterior spatially distributed radar slowness and water content between boreholes given first-arrival traveltimes. This method is called DiffeRential Evolution Adaptive Metropolis (DREAM_ZS) with snooker updater and sampling from past states. Our inverse scheme does not impose any smoothness on the final solution, and uses uniform prior ranges of the parameters. The posterior distribution of radar slowness is converted into spatially distributed soil moisture values using a petrophysical relationship. To benchmark the performance of DREAM_ZS, we first apply our inverse method to a synthetic two-dimensional infiltration experiment using 9421 traveltimes contaminated with Gaussian errors and 80 different model parameters, corresponding to a model discretization of 0.3 m × 0.3 m. After this, the method is applied to field data acquired in the vadose zone during snowmelt. This work demonstrates that fully non-linear stochastic inversion can be applied with few limiting assumptions to a range of common two-dimensional tomographic geophysical problems. The main advantage of DREAM_ZS is that it provides a full view of the posterior distribution of spatially
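    For readers unfamiliar with MCMC, the core accept/reject mechanism can be sketched as a plain random-walk Metropolis sampler with uniform prior ranges (DREAM_ZS itself uses multiple interacting chains, snooker updates, and sampling from past states, none of which are reproduced here; the forward model and noise level below are toy assumptions):

        import numpy as np

        def metropolis(log_post, lo, hi, n_steps=5000, step=0.05, seed=0):
            """Random-walk Metropolis within a uniform prior box [lo, hi]."""
            rng = np.random.default_rng(seed)
            x = rng.uniform(lo, hi)
            lp = log_post(x)
            chain = []
            for _ in range(n_steps):
                prop = x + step * rng.standard_normal(x.shape)
                if np.all(prop >= lo) and np.all(prop <= hi):    # stay in prior support
                    lp_prop = log_post(prop)
                    if np.log(rng.random()) < lp_prop - lp:      # Metropolis acceptance
                        x, lp = prop, lp_prop
                chain.append(x.copy())
            return np.array(chain)

        # toy example: two "slowness" parameters seen through a linear forward model
        rng = np.random.default_rng(1)
        true_m = np.array([0.4, 0.6])
        A = np.array([[1.0, 2.0], [3.0, 1.0]])
        data = A @ true_m + 0.01 * rng.standard_normal(2)
        log_post = lambda m: -0.5 * np.sum((data - A @ m) ** 2) / 0.01 ** 2
        samples = metropolis(log_post, lo=np.zeros(2), hi=np.ones(2))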

  11. Investigation of the surrogate-reaction method via the simultaneous measurement of gamma-emission and fission probabilities

    Directory of Open Access Journals (Sweden)

    Jurado B.

    2017-01-01

    Full Text Available We present the results of two experiments where we have measured for the first time simultaneously the fission and gamma-decay probabilities induced by different surrogate reactions. In particular, we have investigated the 238U(d,p), 238U(3He,t) and 238U(3He,4He) reactions as surrogates for the neutron-induced n + 238U, n + 237Np and n + 236U reactions, respectively. In the region where gamma emission, neutron emission and fission compete, our results for the fission probabilities agree fairly well with the neutron-induced data, whereas our gamma-decay probabilities are significantly higher than the neutron-induced data. The interpretation of these results is not obvious and is discussed within the framework of the statistical model with preliminary results for calculated spin-parity distributions populated in surrogate reactions. We also present future plans for surrogate-reaction studies in inverse kinematics with radioactive-ion beams at storage rings.

  12. Investigation of the surrogate-reaction method via the simultaneous measurement of gamma-emission and fission probabilities

    Science.gov (United States)

    Jurado, B.; Marini, P.; Mathieu, L.; Aiche, M.; Czajkowski, S.; Tsekhanovich, I.; Audouin, L.; Boutoux, G.; Denis-Petit, D.; Guttormsen, M.; Kessedjian, G.; Lebois, M.; Méot, V.; Oberstedt, A.; Oberstedt, S.; Roig, O.; Sérot, O.; Tassan-Got, L.; Wilson, J. N.

    2017-09-01

    We present the results of two experiments where we have measured for the first time simultaneously the fission and gamma-decay probabilities induced by different surrogate reactions. In particular, we have investigated the 238U(d,p), 238U(3He,t) and 238U(3He,4He) reactions as surrogates for the neutron-induced n + 238U, n + 237Np and n + 236U reactions, respectively. In the region where gamma emission, neutron emission and fission compete, our results for the fission probabilities agree fairly well with the neutron-induced data, whereas our gamma-decay probabilities are significantly higher than the neutron-induced data. The interpretation of these results is not obvious and is discussed within the framework of the statistical model with preliminary results for calculated spin-parity distributions populated in surrogate reactions. We also present future plans for surrogate-reaction studies in inverse kinematics with radioactive-ion beams at storage rings.

  13. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  14. Random sampler M-estimator algorithm with sequential probability ratio test for robust function approximation via feed-forward neural networks.

    Science.gov (United States)

    El-Melegy, Moumen T

    2013-07-01

    This paper addresses the problem of fitting a functional model to data corrupted with outliers using a multilayered feed-forward neural network. Although it is of high importance in practical applications, this problem has not received careful attention from the neural network research community. One recent approach to solving this problem is to use a neural network training algorithm based on the random sample consensus (RANSAC) framework. This paper proposes a new algorithm that offers two enhancements over the original RANSAC algorithm. The first one improves the algorithm accuracy and robustness by employing an M-estimator cost function to decide on the best estimated model from the randomly selected samples. The other one improves the time performance of the algorithm by utilizing a statistical pretest based on Wald's sequential probability ratio test. The proposed algorithm is successfully evaluated on synthetic and real data, contaminated with varying degrees of outliers, and compared with existing neural network training algorithms.
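    The RANSAC backbone that the algorithm builds on can be sketched for a simple linear model (this is the generic consensus loop only; the paper's M-estimator cost and Wald sequential probability ratio pretest are not shown):

        import numpy as np

        def ransac_line(x, y, n_iter=200, thresh=0.5, seed=0):
            """Fit y = a*x + b robustly: keep the minimal-sample fit with most inliers."""
            rng = np.random.default_rng(seed)
            best, best_inliers = None, 0
            for _ in range(n_iter):
                i, j = rng.choice(len(x), size=2, replace=False)
                if x[i] == x[j]:
                    continue
                a = (y[j] - y[i]) / (x[j] - x[i])
                b = y[i] - a * x[i]
                inliers = int(np.sum(np.abs(y - (a * x + b)) < thresh))
                if inliers > best_inliers:
                    best, best_inliers = (a, b), inliers
            return best

        rng = np.random.default_rng(1)
        x = np.linspace(0, 10, 100)
        y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(100)
        y[::10] += 20.0 * rng.standard_normal(10)     # inject gross outliers
        print(ransac_line(x, y))                      # close to (2.0, 1.0)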

  15. Measuring symmetry, asymmetry and randomness in neural network connectivity.

    Directory of Open Access Journals (Sweden)

    Umberto Esposito

    Full Text Available Cognitive functions are stored in the connectome, the wiring diagram of the brain, which exhibits non-random features, so-called motifs. In this work, we focus on bidirectional, symmetric motifs, i.e. two neurons that project to each other via connections of equal strength, and unidirectional, non-symmetric motifs, i.e. within a pair of neurons only one neuron projects to the other. We hypothesise that such motifs have been shaped via activity dependent synaptic plasticity processes. As a consequence, learning moves the distribution of the synaptic connections away from randomness. Our aim is to provide a global, macroscopic, single parameter characterisation of the statistical occurrence of bidirectional and unidirectional motifs. To this end we define a symmetry measure that does not require any a priori thresholding of the weights or knowledge of their maximal value. We calculate its mean and variance for random uniform or Gaussian distributions, which allows us to introduce a confidence measure of how significantly symmetric or asymmetric a specific configuration is, i.e. how likely it is that the configuration is the result of chance. We demonstrate the discriminatory power of our symmetry measure by inspecting the eigenvalues of different types of connectivity matrices. We show that a Gaussian weight distribution biases the connectivity motifs to more symmetric configurations than a uniform distribution and that introducing a random synaptic pruning, mimicking developmental regulation in synaptogenesis, biases the connectivity motifs to more asymmetric configurations, regardless of the distribution. We expect that our work will benefit the computational modelling community, by providing a systematic way to characterise symmetry and asymmetry in network structures. Further, our symmetry measure will be of use to electrophysiologists that investigate symmetry of network connectivity.
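    A thresholding-free symmetry index in the spirit described above can be written down directly (an illustrative variant, not necessarily the authors' exact definition or its confidence measure):

        import numpy as np

        def symmetry_index(W):
            """+1 for fully symmetric, -1 for fully antisymmetric connectivity."""
            S = 0.5 * (W + W.T)                  # symmetric (bidirectional) part
            A = 0.5 * (W - W.T)                  # antisymmetric (unidirectional) part
            s2, a2 = np.sum(S ** 2), np.sum(A ** 2)
            return (s2 - a2) / (s2 + a2)

        rng = np.random.default_rng(0)
        W = rng.random((100, 100))               # uniform random weights
        W_sym = 0.5 * (W + W.T)                  # enforce bidirectional motifs
        print(symmetry_index(W), symmetry_index(W_sym))   # the latter equals 1.0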

  16. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  17. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  18. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    Science.gov (United States)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

    first phase of the work addressed the identification of the spatial relationships between the landslide locations and the 13 related factors by using the Frequency Ratio bivariate statistical method. The analysis was then carried out by adopting a multivariate statistical approach, according to the Logistic Regression technique and the Random Forests technique, which gave the best results in terms of AUC. The models were performed and evaluated with different sample sizes, also taking into account the temporal variation of input variables such as areas burned by wildfire. The most significant outcomes of this work are the relevant influence of the sample size on the model results and the strong importance of some environmental factors (e.g. land use and wildfires) for the identification of the depletion zones of extremely rapid shallow landslides.

  19. Zirconium and Yttrium (p, d) Surrogate Nuclear Reactions: Measurement and determination of gamma-ray probabilities: Experimental Physics Report

    Energy Technology Data Exchange (ETDEWEB)

    Burke, J. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hughes, R. O. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Escher, J. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scielzo, N. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Casperson, R. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ressler, J. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Saastamoinen, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ota, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Park, H. I. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ross, T. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCleskey, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Austin, R. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rapisarda, G. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-21

    This technical report documents the surrogate reaction method and experimental results used to determine the desired neutron induced cross sections of 87Y(n,g) and the known 90Zr(n,g) cross section. This experiment was performed at the STARLiTeR apparatus located at Texas A&M Cyclotron Institute using the K150 Cyclotron which produced a 28.56 MeV proton beam. The proton beam impinged on Y and Zr targets to produce the nuclear reactions 89Y(p,d)88Y and 92Zr(p,d)91Zr. Both particle singles data and particle-gamma ray coincident data were measured during the experiment. This data was used to determine the γ-ray probability as a function of energy for these reactions. The results for the γ-ray probabilities as a function of energy for both these nuclei are documented here. For completeness, extensive tabulated and graphical results are provided in the appendices.

  20. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
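    The central point, that estimating the threshold from data inflates the realized failure frequency, is easy to reproduce by simulation (an illustrative Monte Carlo check with a normal risk factor, not the paper's analytical derivation):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        nominal = 0.01                      # target failure probability
        n = 20                              # sample size used to estimate parameters
        z = stats.norm.ppf(1 - nominal)
        trials, failures = 100_000, 0
        for _ in range(trials):
            sample = rng.normal(0.0, 1.0, n)                  # data for setting the control
            threshold = sample.mean() + z * sample.std(ddof=1)
            failures += rng.normal() > threshold              # new draw of the risk factor
        print(failures / trials)            # noticeably above 0.01 for small n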

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
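    A standard special case makes the construction concrete: for the multinomial logit model, the log-sum function acts as a CPGF and its gradient returns the choice probabilities (a textbook illustration, not the paper's general result):

        G(V_1,\dots,V_J) = \log \sum_{j=1}^{J} e^{V_j},
        \qquad
        P_i = \frac{\partial G}{\partial V_i}
            = \frac{e^{V_i}}{\sum_{j=1}^{J} e^{V_j}} .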

  3. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  4. Probability distribution of intersymbol distances in random symbolic sequences: Applications to improving detection of keywords in texts and of amino acid clustering in proteins.

    Science.gov (United States)

    Carpena, Pedro; Bernaola-Galván, Pedro A; Carretero-Campos, Concepción; Coronado, Ana V

    2016-11-01

    Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.
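    The comparison underlying the analysis can be illustrated with a short simulation (an independent sketch; the paper's exact finite-size distribution and clustering measure are not reproduced):

        import numpy as np

        def intersymbol_distances(seq, symbol):
            positions = np.flatnonzero(seq == symbol)
            return np.diff(positions)

        rng = np.random.default_rng(0)
        seq = rng.choice(list("ACGT"), size=10_000)        # random symbolic sequence
        d = intersymbol_distances(seq, "A")
        p = 0.25                                           # symbol frequency
        for k in range(1, 6):                              # empirical vs geometric law
            print(k, round(np.mean(d == k), 4), round(p * (1 - p) ** (k - 1), 4))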

  5. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the ${n \choose k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
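    The channel-aware greedy strategy can be sketched with the exact (non-asymptotic) A-optimality error measure (illustrative only; the paper's contribution is the random-matrix-theory approximation of such measures, which is not reproduced here):

        import numpy as np

        def greedy_selection(H, k, ridge=1e-6):
            """Greedily pick k of the n rows of H minimizing trace((H_S^T H_S)^{-1})."""
            n, m = H.shape
            selected = []
            for _ in range(k):
                best_i, best_err = None, np.inf
                for i in range(n):
                    if i in selected:
                        continue
                    S = H[selected + [i], :]
                    err = np.trace(np.linalg.inv(S.T @ S + ridge * np.eye(m)))
                    if err < best_err:
                        best_i, best_err = i, err
                selected.append(best_i)
            return selected

        rng = np.random.default_rng(0)
        H = rng.standard_normal((50, 5))      # 50 candidate measurements, 5 parameters
        print(greedy_selection(H, k=8))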

  6. Statistical analysis of compressive low rank tomography with random measurements

    Science.gov (United States)

    Acharya, Anirudh; Guţă, Mădălin

    2017-05-01

    We consider the statistical problem of ‘compressive’ estimation of low rank states (r ≪ d) with random basis measurements, where r, d are the rank and dimension of the state respectively. We investigate whether for a fixed sample size N, the estimation error associated with a ‘compressive’ measurement setup is ‘close’ to that of the setting where a large number of bases are measured. We generalise and extend previous results, and show that the mean square error (MSE) associated with the Frobenius norm attains the optimal rate rd/N with only O(r log d) random basis measurements for all states. An important tool in the analysis is the concentration of the Fisher information matrix (FIM). We demonstrate that although a concentration of the MSE follows from a concentration of the FIM for most states, the FIM fails to concentrate for states with eigenvalues close to zero. We analyse this phenomenon in the case of a single qubit and demonstrate a concentration of the MSE about its optimal value despite a lack of concentration of the FIM for states close to the boundary of the Bloch sphere. We also consider the estimation error in terms of a different metric, the quantum infidelity. We show that a concentration in the mean infidelity (MINF) does not exist uniformly over all states, highlighting the importance of loss function choice. Specifically, we show that for states that are nearly pure, the MINF scales as 1/√N but the constant converges to zero as the number of settings is increased. This demonstrates a lack of ‘compressive’ recovery for nearly pure states in this metric.

  7. An improved lesion detection approach based on similarity measurement between fuzzy intensity segmentation and spatial probability maps.

    Science.gov (United States)

    Shen, Shan; Szameitat, Andre J; Sterr, Annette

    2010-02-01

    The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that a low similarity should be found in the regions of lesions when the likeness between an intensity based fuzzy segmentation and a location based tissue probabilities is measured. The usage of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only the majority of lesions were well detected but also normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and had less false positives around the ventricle and the edge of the brain. Copyright 2010 Elsevier Inc. All rights reserved.

  8. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  9. Random walks in the quarter-plane: invariant measures and performance bounds

    NARCIS (Netherlands)

    Chen, Y.

    2015-01-01

    This monograph focuses on random walks in the quarter-plane. Such random walks are frequently used to model queueing systems and the invariant measure of a random walk is of major importance in studying the performance of these systems. In special cases the invariant measure of a random walk can be

  10. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  11. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  14. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own sake but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principle areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  15. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  16. Confidence Intervals for the Probability of Superiority Effect Size Measure and the Area under a Receiver Operating Characteristic Curve

    Science.gov (United States)

    Ruscio, John; Mullen, Tara

    2012-01-01

    It is good scientific practice to report an appropriate estimate of effect size and a confidence interval (CI) to indicate the precision with which a population effect was estimated. For comparisons of 2 independent groups, a probability-based effect size estimator (A) that is equal to the area under a receiver operating characteristic curve…
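    The estimator itself is simple to compute, and a bootstrap gives one way to form the CI (a sketch consistent with the general approach, not necessarily the article's exact procedure):

        import numpy as np

        def prob_superiority(x, y):
            """A = P(X > Y) + 0.5 P(X = Y); equals the area under the ROC curve."""
            x, y = np.asarray(x), np.asarray(y)
            greater = (x[:, None] > y[None, :]).sum()
            ties = (x[:, None] == y[None, :]).sum()
            return (greater + 0.5 * ties) / (len(x) * len(y))

        rng = np.random.default_rng(0)
        g1 = rng.normal(1.0, 1.0, 40)
        g2 = rng.normal(0.0, 1.0, 40)
        a_hat = prob_superiority(g1, g2)
        boot = [prob_superiority(rng.choice(g1, 40), rng.choice(g2, 40))
                for _ in range(2000)]
        print(a_hat, np.percentile(boot, [2.5, 97.5]))     # estimate and 95% CI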

  17. Heat capacity and sticking probability measurements of 4He submonolayers adsorbed on evaporated Ag films: Bose statistics in two dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Kenny, T.W.; Richards, P.L. (Department of Physics, University of California, Berkeley, Berkeley, CA (USA) Materials and Chemical Sciences Division, Lawrence Berkeley Laboratories, Berkeley, CA (USA))

    1990-05-14

    We have measured the heat capacity of submonolayers of {sup 4}He adsorbed on Ag films between 1.7 and 3.3 K. Good fits to the results are obtained with a model of a noninteracting two-dimensional Bose gas. The sticking probability for room-temperature {sup 4}He atoms on cold Ag has been measured as a function of substrate temperature and {sup 4}He coverage. The sticking probability is 4% at low coverage, and abruptly drops to 1% for coverages above 0.5 monolayer.

  18. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye

  19. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    Energy Technology Data Exchange (ETDEWEB)

    Lehua Pan; G.S. Bodvarsson

    2001-10-22

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions.

  20. Measuring milk fat content by random laser emission

    Science.gov (United States)

    Abegão, Luis M. G.; Pagani, Alessandra A. C.; Zílio, Sérgio C.; Alencar, Márcio A. R. C.; Rodrigues, José J.

    2016-10-01

    The luminescence spectra of milk containing rhodamine 6G are shown to exhibit typical signatures of random lasing when excited with 532 nm laser pulses. Experiments carried out on whole and skim forms of two commercial brands of UHT milk, with fat volume concentrations ranging from 0 to 4%, presented lasing threshold values dependent on the fat concentration, suggesting that a random laser technique can be developed to monitor such an important parameter.

  1. Multi-bit quantum random number generation by measuring positions of arrival photons.

    Science.gov (United States)

    Yan, Qiurong; Zhao, Baosheng; Liao, Qinghong; Zhou, Nanrun

    2014-10-01

    We report upon the realization of a novel multi-bit optical quantum random number generator obtained by continuously measuring the arrival positions of photons emitted from an LED using an MCP-based WSA photon counting imaging detector. A spatial encoding method is proposed to extract multi-bit random numbers from the position coordinates of each detected photon. The randomness of the bit sequence relies on the intrinsic randomness of the quantum physical processes of photonic emission and subsequent photoelectric conversion. A prototype has been built and the random bit generation rate reaches 8 Mbit/s, with a random bit generation efficiency of 16 bits per detected photon. An FPGA implementation of Huffman coding is proposed to reduce the bias of the raw extracted random bits. The random numbers passed all tests for physical random number generators.

  2. INSTRUCTIONAL CONFERENCE ON THE THEORY OF STOCHASTIC PROCESSES: New criteria of relative compactness of sequences of probability measures

    Science.gov (United States)

    Grigelionis, B. I.; Lebedev, V. A.

    1982-12-01

    CONTENTS: Introduction; § 1. New criteria of relative compactness of sequences of measures in the Skorokhod space; § 2. Conditions of relative compactness of sequences of measures corresponding to semimartingales; § 3. Conditions of relative compactness of sequences of measures corresponding to point processes; References.

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  4. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  5. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  6. High-speed quantum-random number generation by continuous measurement of arrival time of photons.

    Science.gov (United States)

    Yan, Qiurong; Zhao, Baosheng; Hua, Zhang; Liao, Qinghong; Yang, Hao

    2015-07-01

    We demonstrate a novel high speed and multi-bit optical quantum random number generator by continuously measuring arrival time of photons with a common starting point. To obtain the unbiased and post-processing free random bits, the measured photon arrival time is converted into the sum of integral multiple of a fixed period and a phase time. Theoretical and experimental results show that the phase time is an independent and uniform random variable. A random bit extraction method by encoding the phase time is proposed. An experimental setup has been built and the unbiased random bit generation rate could reach 128 Mb/s, with random bit generation efficiency of 8 bits per detected photon. The random numbers passed all tests in the statistical test suite.
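    The phase-time encoding can be illustrated with simulated photon arrivals (the parameter values below are assumptions, not the experimental settings):

        import numpy as np

        rng = np.random.default_rng(0)
        period = 1e-6                        # fixed period T0 (assumed value), seconds
        bits_per_photon = 8
        # Poisson-like photon arrivals with mean spacing much larger than the period
        arrivals = np.cumsum(rng.exponential(5e-6, size=1000))
        phase = np.mod(arrivals, period)     # phase time, close to uniform on [0, T0)
        codes = np.floor(phase / period * 2 ** bits_per_photon).astype(int)
        bits = ((codes[:, None] >> np.arange(bits_per_photon)) & 1).ravel()
        print(bits[:32], round(bits.mean(), 3))   # mean near 0.5 for unbiased bits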

  7. Random Number Simulations Reveal How Random Noise Affects the Measurements and Graphical Portrayals of Self-Assessed Competency

    Directory of Open Access Journals (Sweden)

    Edward Nuhfer

    2016-01-01

    Full Text Available Self-assessment measures of competency are blends of an authentic self-assessment signal that researchers seek to measure and random disorder or "noise" that accompanies that signal. In this study, we use random number simulations to explore how random noise affects critical aspects of self-assessment investigations: reliability, correlation, critical sample size, and the graphical representations of self-assessment data. We show that graphical conventions common in the self-assessment literature introduce artifacts that invite misinterpretation. Troublesome conventions include: (y minus x) vs. (x) scatterplots; (y minus x) vs. (x) column graphs aggregated as quantiles; line charts that display data aggregated as quantiles; and some histograms. Graphical conventions that generate minimal artifacts include scatterplots with a best-fit line that depict (y) vs. (x) measures (self-assessed competence vs. measured competence) plotted by individual participant scores, and (y) vs. (x) scatterplots of collective average measures of all participants plotted item-by-item. This last graphic convention attenuates noise and improves the definition of the signal. To provide relevant comparisons across varied graphical conventions, we use a single dataset derived from paired measures of 1154 participants' self-assessed competence and demonstrated competence in science literacy. Our results show that different numerical approaches employed in investigating and describing self-assessment accuracy are not equally valid. By modeling this dataset with random numbers, we show how recognizing the varied expressions of randomness in self-assessment data can improve the validity of numeracy-based descriptions of self-assessment.
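    The graphing artifact the authors describe is easy to reproduce with pure random numbers (a minimal sketch of the simulation idea, not the study's full analysis):

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 100, 1154)        # "measured competence": pure noise
        y = rng.uniform(0, 100, 1154)        # "self-assessed competence": pure noise
        print(round(np.corrcoef(x, y)[0, 1], 3))       # near 0: no real relationship
        print(round(np.corrcoef(x, y - x)[0, 1], 3))   # near -0.71: artifact of (y - x) vs. (x)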

  8. Completely random measures for modelling block-structured sparse networks

    DEFF Research Database (Denmark)

    Herlau, Tue; Schmidt, Mikkel Nørgaard; Mørup, Morten

    2016-01-01

    Many statistical methods for network data parameterize the edge-probability by attributing latent traits to the vertices such as block structure and assume exchangeability in the sense of the Aldous-Hoover representation theorem. Empirical studies of networks indicate that many real-world networks ... [2014] proposed the use of a different notion of exchangeability due to Kallenberg [2006] and obtained a network model which admits power-law behaviour while retaining desirable statistical properties; however, this model does not capture latent vertex traits such as block-structure. In this work we re-introduce the use of block-structure for network models obeying Kallenberg's notion of exchangeability and thereby obtain a model which admits the inference of block-structure and edge inhomogeneity. We derive a simple expression for the likelihood and an efficient sampling method. The obtained model

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  10. An explicit semantic relatedness measure based on random walk

    Directory of Open Access Journals (Sweden)

    HU Sihui

    2016-10-01

    Full Text Available The calculation of semantic relatedness in an open-domain knowledge network is a significant issue. In this paper, a pheromone strategy is drawn from the idea of the ant colony algorithm and is integrated into random walk, which is taken as the basic framework for calculating the semantic relatedness degree. The pheromone distribution is taken as a criterion for determining the tightness of semantic relatedness. A method of calculating the semantic relatedness degree based on random walk is proposed, and the exploration process of calculating the semantic relatedness degree is presented explicitly. The method mainly contains a Path Select Model (PSM) and a Semantic Relatedness Computing Model (SRCM). PSM is used to simulate the path selection of ants and pheromone release. SRCM is used to calculate the semantic relatedness by utilizing the information returned by ants. The results indicate that the method can complete the semantic relatedness calculation in linear complexity and extends the feasible strategies for semantic relatedness calculation.

  11. Quantization of Prior Probabilities for Hypothesis Testing

    OpenAIRE

    Varshney, Kush R.; Varshney, Lav R.

    2008-01-01

    Bayesian hypothesis testing is investigated when the prior probabilities of the hypotheses, taken as a random vector, are quantized. Nearest neighbor and centroid conditions are derived using mean Bayes risk error as a distortion measure for quantization. A high-resolution approximation to the distortion-rate function is also obtained. Human decision making in segregated populations is studied assuming Bayesian hypothesis testing with quantized priors.

  12. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  13. The Use of PCs, Smartphones, and Tablets in a Probability-Based Panel Survey : Effects on Survey Measurement Error

    NARCIS (Netherlands)

    Lugtig, Peter; Toepoel, Vera

    2016-01-01

    Respondents in an Internet panel survey can often choose which device they use to complete questionnaires: a traditional PC, laptop, tablet computer, or a smartphone. Because all these devices have different screen sizes and modes of data entry, measurement errors may differ between devices. Using

  14. MTRAP: Pairwise sequence alignment algorithm by a new measure based on transition probability between two consecutive pairs of residues

    Directory of Open Access Journals (Sweden)

    Ohya Masanori

    2010-05-01

    Full Text Available Abstract Background Sequence alignment is one of the most important techniques to analyze biological systems. It is also true that the alignment is not complete and we have to develop it to look for a more accurate method. In particular, an alignment for homologous sequences with low sequence similarity is not at a satisfactory level. Usual methods for aligning protein sequences in recent years use an empirically determined measure. As an example, a measure is usually defined by a combination of the two quantities (1) and (2) below: (1) the sum of substitutions between two residue segments, (2) the sum of gap penalties in the insertion/deletion region. Such a measure is determined on the assumption that there is no intersite correlation in the sequences. In this paper, we improve the alignment by taking into account the correlation of consecutive residues. Results We introduce a new method of alignment, called MTRAP, by introducing a metric defined on compound systems of two sequences. In the benchmark tests by PREFAB 4.0 and HOMSTRAD, our pairwise alignment method gives higher accuracy than other methods such as ClustalW2, TCoffee, and MAFFT. Especially for the sequences with sequence identity less than 15%, our method improves the alignment accuracy significantly. Moreover, we also show that our algorithm works well together with a consistency-based progressive multiple alignment by modifying TCoffee to use our measure. Conclusions We indicate that our method leads to a significant increase in alignment accuracy compared with other methods. Our improvement is especially clear in the low identity range of sequences. The source code is available at our web page, whose address is found in the section "Availability and requirements".

  15. Methodology for assessing the probability of corrosion in concrete structures on the basis of half-cell potential and concrete resistivity measurements.

    Science.gov (United States)

    Sadowski, Lukasz

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.

  16. Method for measurement of transition probabilities by laser-induced breakdown spectroscopy based on CSigma graphs - Application to Ca II spectral lines

    Science.gov (United States)

    Aguilera, J. A.; Aragón, C.; Manrique, J.

    2015-07-01

    We propose a method for determination of transition probabilities by laser-induced breakdown spectroscopy that avoids the error due to self-absorption. The method relies on CSigma graphs, a generalization of curves of growth which allows including several lines of various elements in the same ionization state. CSigma graphs are constructed including reference lines of an emitting species with well-known transition probabilities, together with the lines of interest, both in the same ionization state. The samples are fused glass disks prepared from small concentrations of compounds. When the method is applied, the concentration of the element of interest in the sample must be controlled to avoid the failure of the homogeneous plasma model. To test the method, the transition probabilities of 9 Ca II lines arising from the 4d, 5s, 5d and 6s configurations are measured using Fe II reference lines. The data for 5 of the studied lines, mainly from the 5d and 6s configurations, had not been measured previously.

  17. Methodology for Assessing the Probability of Corrosion in Concrete Structures on the Basis of Half-Cell Potential and Concrete Resistivity Measurements

    Directory of Open Access Journals (Sweden)

    Lukasz Sadowski

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
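
    A minimal sketch of how the two measurements can be combined into a qualitative corrosion-probability class at each grid point is given below. The threshold bands are assumptions for illustration (ASTM C876-style half-cell potential bands against a Cu/CuSO4 electrode and commonly quoted resistivity bands); they are not the criteria derived in the paper.

        # Hypothetical classification of corrosion probability from half-cell
        # potential and concrete resistivity at each grid point. Threshold
        # bands are illustrative assumptions, not the paper's criteria.
        def potential_class(e_corr_mv):
            if e_corr_mv > -200:
                return "low"        # low probability of active corrosion
            if e_corr_mv >= -350:
                return "uncertain"
            return "high"           # high probability of active corrosion

        def resistivity_class(rho_kohm_cm):
            if rho_kohm_cm > 50:
                return "low"
            if rho_kohm_cm >= 10:
                return "moderate"
            return "high"           # low resistivity favors fast corrosion propagation

        grid = [(-150, 80), (-300, 25), (-420, 7)]   # (Ecorr [mV], rho [kOhm*cm]) readings
        for e_corr, rho in grid:
            print(e_corr, rho, potential_class(e_corr), resistivity_class(rho))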

  18. Measurement of the radiative vacancy transfer probabilities from the $L_{3}$ to M and to N shells for W, Re and Pb using synchrotron radiation

    CERN Document Server

    Bonzi, E V

    2006-01-01

    The radiative vacancy transfer probabilities from the L3 to the M shell, ηL3-M(R), and from the L3 to the N shell, ηL3-N(R), have been determined for W, Re and Pb. The pure-element samples were excited by monochromatic synchrotron radiation. The X-rays were generated by excitation of the L3 edge and measured using a high-resolution Si(Li) detector. The experimentally determined radiative vacancy transfer probabilities were compared with theoretical values deduced using radiative X-ray emission rates based on the relativistic Dirac-Hartree-Slater (RDHS) model. In the case of Pb, the experimental data were also compared with the experimental values of Simsek. In both cases, good agreement was found between the datasets.

  19. Bayesian randomized item response modeling for sensitive measurements

    NARCIS (Netherlands)

    Avetisyan, Marianna

    2012-01-01

    In behavioral, health, and social sciences, any endeavor involving measurement is directed at accurate representation of the latent concept with the manifest observation. However, when sensitive topics, such as substance abuse, tax evasion, or felony, are inquired, substantial distortion of reported

  20. Weak convergence to isotropic complex SαS random measure

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2017-09-01

    In this paper, we prove that an isotropic complex symmetric α-stable (SαS) random measure (0 < α < 2) can be approximated by a complex process constructed from integrals based on the Poisson process with random intensity.

  1. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
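
    A minimal sketch of the idea on synthetic data is shown below: a random forest is used as a nonparametric regression of the 0/1 outcome on the covariates, so its predictions estimate the individual probabilities P(Y=1|X). The paper's own sample code is provided in R packages; this Python/scikit-learn version and its data-generating model are illustrative assumptions.

        # Estimating individual probabilities P(Y=1|X) for a binary response by
        # treating the problem as nonparametric regression with a random forest.
        # Synthetic data; the paper's sample code is in R.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 3))
        true_p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))  # true P(Y=1|X)
        y = rng.binomial(1, true_p)

        # Regressing the 0/1 outcome gives consistent estimates of the probabilities.
        rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=10, random_state=0)
        rf.fit(X, y)
        p_hat = rf.predict(X)

        print("mean absolute error vs true probabilities:",
              round(float(np.mean(np.abs(p_hat - true_p))), 3))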

  2. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  3. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  4. Comparing entropy with tests for randomness as a measure of complexity in time series

    CERN Document Server

    Gan, Chee Chun

    2015-01-01

    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or the degree of regularity, in a time series, e.g. a heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine whether the variation arises from a random process. Unfortunately, most current entropy measures are unable to perform the latter differentiation. Thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely the degree of regularity in a time series. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...
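
    The comparison the abstract describes can be sketched by placing one entropy-type measure and one classical randomness test side by side. The pair below (normalized permutation entropy and the Wald-Wolfowitz runs test about the median) is an illustrative assumption, not necessarily the set of measures examined in the paper.

        # One entropy measure and one classical randomness test applied to a
        # noisy and a regular series; both mainly grade how "random" the series is.
        import math
        from itertools import permutations
        import numpy as np

        def permutation_entropy(x, order=3):
            counts = {p: 0 for p in permutations(range(order))}
            for i in range(len(x) - order + 1):
                counts[tuple(np.argsort(x[i:i + order]))] += 1
            p = np.array([c for c in counts.values() if c > 0], dtype=float)
            p /= p.sum()
            return float(-(p * np.log(p)).sum() / math.log(math.factorial(order)))

        def runs_test_z(x):
            above = np.asarray(x) > np.median(x)
            n1, n2 = int(above.sum()), int((~above).sum())
            runs = 1 + int(np.count_nonzero(above[1:] != above[:-1]))
            mu = 2 * n1 * n2 / (n1 + n2) + 1
            var = 2 * n1 * n2 * (2 * n1 * n2 - n1 - n2) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
            return (runs - mu) / math.sqrt(var)

        rng = np.random.default_rng(1)
        noise = rng.normal(size=1000)                        # irregular series
        sine = np.sin(np.linspace(0, 40 * np.pi, 1000))      # highly regular series
        for name, series in (("noise", noise), ("sine", sine)):
            print(name, round(permutation_entropy(series), 3), round(runs_test_z(series), 2))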

  5. Probability, statistics, and reliability for engineers and scientists

    CERN Document Server

    Ayyub, Bilal M

    2012-01-01

    Introduction: Introduction; Knowledge, Information, and Opinions; Ignorance and Uncertainty; Aleatory and Epistemic Uncertainties in System Abstraction; Characterizing and Modeling Uncertainty; Simulation for Uncertainty Analysis and Propagation; Simulation Projects. Data Description and Treatment: Introduction; Classification of Data; Graphical Description of Data; Histograms and Frequency Diagrams; Descriptive Measures; Applications; Analysis of Simulated Data; Simulation Projects. Fundamentals of Probability: Introduction; Sets, Sample Spaces, and Events; Mathematics of Probability; Random Variables and Their Proba

  6. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  7. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  8. A random matrix/transition state theory for the probability distribution of state-specific unimolecular decay rates: Generalization to include total angular momentum conservation and other dynamical symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, R.; Miller, W.H.; Moore, C.B. (Department of Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley Laboratory, Berkeley, California 94720 (United States)); Polik, W.F. (Department of Chemistry, Hope College, Holland, Michigan 49423 (United States))

    1993-07-15

    A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D2CO → D2 + CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states.

  9. Probability density of quantum expectation values

    Science.gov (United States)

    Campos Venuti, L.; Zanardi, P.

    2013-10-01

    We consider the quantum expectation value A = ⟨ψ|A|ψ⟩ of an observable A over the state |ψ⟩. We derive the exact probability distribution of A seen as a random variable when |ψ⟩ varies over the set of all pure states equipped with the Haar-induced measure. To illustrate our results we compare the exact predictions for a few concrete examples with the concentration bounds obtained using Lévy's lemma. We also comment on the relevance of the central limit theorem and finally draw some conclusions about an alternative statistical mechanics based on the uniform measure on the energy shell.
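
    The setting can be reproduced numerically: draw |ψ⟩ from the Haar-induced measure (a normalized complex Gaussian vector) and sample ⟨ψ|A|ψ⟩. The sketch below does only this empirical sampling for an arbitrary Hermitian observable; the exact distribution derived in the paper is not reproduced here.

        # Empirical distribution of A = <psi|A|psi> over Haar-random pure states.
        # The observable is an arbitrary Hermitian matrix chosen for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        d = 8                                        # Hilbert-space dimension
        G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        A = (G + G.conj().T) / 2                     # Hermitian observable

        def haar_state(dim):
            v = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # isotropic complex Gaussian
            return v / np.linalg.norm(v)                           # normalization gives the Haar measure

        samples = np.array([np.vdot(psi, A @ psi).real
                            for psi in (haar_state(d) for _ in range(20000))])
        print("sample mean:", round(samples.mean(), 3),
              " tr(A)/d:", round(np.trace(A).real / d, 3))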

  10. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies--this is a problem of frequentist probability theory; using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  13. Large Signal Excitation Measurement Techniques for Random Telegraph Signal Noise in MOSFETs

    NARCIS (Netherlands)

    Hoekstra, E.

    2005-01-01

    This paper introduces large signal excitation measurement techniques to analyze random telegraph signal (RTS) noise originating from oxide-traps in MOSFETs. The paper concentrates on the trap-occupancy, which relates directly to the generated noise. The proposed measurement technique makes

  14. Large Signal Excitation Measurement Techniques for Random Telegraph Signal Noise in MOSFETs

    NARCIS (Netherlands)

    Hoekstra, E.; Kolhatkar, J.S.; van der Wel, A.P.; Salm, Cora; Klumperink, Eric A.M.

    2005-01-01

    This paper introduces large signal excitation measurement techniques to analyze Random Telegraph Signal (RTS) noise originating from oxide-traps in MOSFETs. The paper concentrates on the trap-occupancy, which relates directly to the generated noise. The proposed measurement technique makes

  15. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  16. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  17. Randomness Representation of Turbulence in Canopy Flows Using Kolmogorov Complexity Measures

    Directory of Open Access Journals (Sweden)

    Dragutin Mihailović

    2017-09-01

    Turbulence is often expressed in terms of either irregular or random fluid flows, without quantification. In this paper, a methodology to evaluate the randomness of turbulence using measures based on the Kolmogorov complexity (KC) is proposed. This methodology is applied to experimental data from a turbulent flow developing in a laboratory channel with a canopy of three different densities. The methodology is also compared with the traditional approach based on classical turbulence statistics.
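
    Kolmogorov complexity itself is uncomputable, so KC-based randomness measures are in practice approximated, commonly by the Lempel-Ziv (LZ76) complexity of a binarized series. The sketch below implements that kind of score as an illustrative stand-in; it is not necessarily the exact KC measure used in the paper.

        # Normalized Lempel-Ziv (LZ76) complexity of a binarized time series:
        # close to 1 for white noise, close to 0 for a very regular signal.
        import math
        import numpy as np

        def lz76_phrases(bits):
            s = "".join("1" if b else "0" for b in bits)
            i, count, n = 0, 0, len(s)
            while i < n:
                length = 1
                # extend the phrase while it already occurs in the preceding text
                while i + length <= n and s[i:i + length] in s[:i + length - 1]:
                    length += 1
                count += 1
                i += length
            return count

        def normalized_complexity(x):
            bits = np.asarray(x) > np.mean(x)                # binarize about the mean
            n = len(bits)
            return lz76_phrases(bits) * math.log2(n) / n     # ~n/log2(n) phrases for random bits

        rng = np.random.default_rng(0)
        print("white noise:", round(normalized_complexity(rng.normal(size=2000)), 3))
        print("sine wave  :", round(normalized_complexity(np.sin(np.linspace(0, 60, 2000))), 3))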

  18. Measurement of the radiative L3-M vacancy transfer probabilities of some 4f elements and compounds using Indus-2 synchrotron radiation

    Science.gov (United States)

    Krishnananda; Mirji, Santosh; Badiger, N. M.; Tiwari, M. K.

    2016-08-01

    The L X-ray intensity ratios (ILα/ILl, ILα/ILβ, ILα/ILγ) and the radiative L3-M vacancy transfer probabilities (ηL3-M) of some 4f elements (Gd, Tb, Ho) and compounds (Pr2O3, Pr2(CO3)3·8H2O, Nd2O3, Sm2O3, Sm2(CO3)3·2.85H2O, Sm2(SO4)3·8H2O, Gd2(CO3)3, Tb2O3, Dy2(SO4)3, Ho2O3 and HoF3) have been measured using Indus-2 synchrotron radiation. The elements and compounds were excited by synchrotron radiation and the emitted characteristic L X-ray photons were measured with a high-resolution silicon drift detector. The measured intensity ratios of the compounds are not influenced by the chemical environment. However, the ηL3-M values of the compound targets indicate an effect of the crystal structure.

  19. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  20. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  1. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  2. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
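
    One classic instance (not necessarily among the three treated in the article) is the derangement problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e as n grows. A quick simulation:

        # Probability that a random permutation has no fixed point -> 1/e.
        import math
        import random

        def has_no_fixed_point(n):
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        n, trials = 20, 200_000
        hits = sum(has_no_fixed_point(n) for _ in range(trials))
        print("simulated:", hits / trials, " 1/e:", 1 / math.e)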

  3. Random measurement error: Why worry? An example of cardiovascular risk factors.

    Science.gov (United States)

    Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H

    2018-01-01

    With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.
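
    The point that random error does not always attenuate can be reproduced with a small simulation: when a positive confounder is measured with classical error, adjustment for it is incomplete and the exposure-outcome estimate is pushed upward rather than toward zero. The data-generating values below are arbitrary assumptions, not taken from the three example studies.

        # Classical measurement error in a confounder can overestimate, not
        # attenuate, the adjusted exposure effect. Arbitrary synthetic values.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        C = rng.normal(size=n)                        # true confounder
        X = 0.8 * C + rng.normal(size=n)              # exposure depends on the confounder
        Y = 0.3 * X + 0.6 * C + rng.normal(size=n)    # true exposure effect = 0.3
        C_noisy = C + rng.normal(size=n)              # confounder measured with random error

        def adjusted_effect(x, c, y):
            design = np.column_stack([np.ones_like(x), x, c])   # OLS of Y on [1, X, C]
            beta, *_ = np.linalg.lstsq(design, y, rcond=None)
            return beta[1]                                       # coefficient of X

        print("true effect:            0.3")
        print("C measured perfectly:  ", round(adjusted_effect(X, C, Y), 3))
        print("C measured with error: ", round(adjusted_effect(X, C_noisy, Y), 3))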

  4. A generalized Jonckheere test against ordered alternatives for repeated measures in randomized blocks.

    Science.gov (United States)

    Zhang, Ying; Cabilio, Paul

    2013-05-10

    Focusing on statistical methods in patient-reported outcomes, we propose and develop a generalized Jonckheere test against ordered alternatives for repeated measures in a randomized block design. We derive its asymptotic null distribution properties and describe methods for estimating the null distribution for testing the hypothesis. We present a numerical example to illustrate the test procedure. Copyright © 2012 John Wiley & Sons, Ltd.

  5. Error Bounds Due to Random Noise in Cylindrical Near-Field Measurements

    OpenAIRE

    Romeu Robert, Jordi; Jofre Roca, Lluís

    1991-01-01

    The far-field errors due to near-field random noise are statistically bounded when performing the cylindrical near-field to far-field transform. In this communication, the far-field noise variance is expressed as a function of the measurement parameters and the near-field noise variance.

  6. Fault Detection of Aircraft System with Random Forest Algorithm and Similarity Measure

    Directory of Open Access Journals (Sweden)

    Sanghyuk Lee

    2014-01-01

    A fault detection algorithm was developed based on a similarity measure and the random forest algorithm. The algorithm was applied to an unmanned aerial vehicle (UAV) prepared by the authors. The similarity measure was designed with the help of distance information, and its usefulness was verified by proof. Fault decisions were carried out by calculating a weighted similarity measure. Twelve available coefficients from the healthy- and faulty-status data groups were used to determine the decision. The similarity measure weighting was obtained through the random forest algorithm (RFA), which provides data priority. To obtain a fast decision response, a limited number of coefficients was also considered. The relation between the detection rate and the amount of feature data was analyzed and illustrated. By repeated trials of the similarity calculation, a useful data amount was obtained.

  7. Online evolution reconstruction from a single measurement record with random time intervals for quantum communication

    Science.gov (United States)

    Zhou, Hua; Su, Yang; Wang, Rong; Zhu, Yong; Shen, Huiping; Pu, Tao; Wu, Chuanxin; Zhao, Jiyong; Zhang, Baofu; Xu, Zhiyong

    2017-10-01

    Online reconstruction of a time-variant quantum state from the encoding/decoding results of quantum communication is addressed by developing a method of evolution reconstruction from a single measurement record with random time intervals. A time-variant two-dimensional state is reconstructed on the basis of recovering its expectation value functions of three nonorthogonal projectors from a random single measurement record, which is composed of the discarded qubits of the six-state protocol. Simulation results show that our method is robust to typical metro quantum channels. Our work extends the Fourier-based method of evolution reconstruction from the version for a regular single measurement record with equal time intervals to a unified one, which can be applied to arbitrary single measurement records. The proposed protocol of evolution reconstruction runs concurrently with the one of quantum communication, which can facilitate online quantum tomography.

  8. Cluster-randomized trial of a German leisure-based alcohol peer education measure

    OpenAIRE

    Bühler, Anneke; Thrul, Johannes; Ströber, Evelin; Orth, Boris

    2015-01-01

    Because of scarce research, the effectiveness of substance abuse prevention in leisure settings remains unclear. In this study, we evaluated the effectiveness of a peer-led educational prevention measure with adolescent groups in unstructured leisure settings, which is a component of the complex German nationwide "Na Toll!" campaign. Using a cluster-randomized two-group post-test-only design, we tested whether the measure influenced component-specific goals, namely risk and protective factors...

  9. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    OpenAIRE

    Chelouche, Doron; Nuñez, Francisco Pozo; Zucker, Shay

    2017-01-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found ...
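
    The core idea of a randomness-based lag estimator can be sketched as follows: for each trial lag, merge the two light curves (one shifted in time), sort by time, and compute the von Neumann mean-square successive difference of the combined series; the lag that makes the combined curve smoothest (smallest statistic) is the estimate. The sketch below omits flux scaling, error weighting, and uncertainty estimation, and uses toy light curves.

        # Time-lag estimation by minimizing the von Neumann mean-square
        # successive difference of the combined, time-shifted light curves.
        import numpy as np

        def von_neumann(t, f):
            f = f[np.argsort(t)]
            return np.mean(np.diff(f) ** 2)       # small when the combined curve is smooth

        def estimate_lag(t1, f1, t2, f2, lags):
            z1 = (f1 - f1.mean()) / f1.std()      # put both curves on a comparable scale
            z2 = (f2 - f2.mean()) / f2.std()
            scores = [von_neumann(np.concatenate([t1, t2 - lag]),
                                  np.concatenate([z1, z2])) for lag in lags]
            return lags[int(np.argmin(scores))]

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 200, 150))     # irregular sampling
        signal = lambda x: np.sin(0.15 * x) + 0.5 * np.sin(0.05 * x)
        true_lag = 12.0
        f1 = signal(t) + 0.05 * rng.normal(size=t.size)
        f2 = signal(t - true_lag) + 0.05 * rng.normal(size=t.size)   # delayed, echoed curve
        print("estimated lag:", estimate_lag(t, f1, t, f2, np.arange(0.0, 30.0, 0.5)))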

  10. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  11. Necessary conditions for the invariant measure of a random walk to be a sum of geometric terms

    NARCIS (Netherlands)

    Chen, Y.; Boucherie, Richardus J.; Goseling, Jasper

    We consider the invariant measure of homogeneous random walks in the quarter-plane. In particular, we consider measures that can be expressed as an infinite sum of geometric terms. We present necessary conditions for the invariant measure of a random walk to be a sum of geometric terms. We

  12. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  13. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  14. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  15. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...

  16. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    2014-01-01

    In this new edition of this classic text, much of the material has been rearranged and revised for pedagogical reasons. Many classic inequalities and proofs are now incorporated into the text, and many citations have been added.

  17. Distributed Fusion Filtering in Networked Systems with Random Measurement Matrices and Correlated Noises

    Directory of Open Access Journals (Sweden)

    Raquel Caballero-Águila

    2015-01-01

    The distributed fusion state estimation problem is addressed for sensor network systems with a random state transition matrix and random measurement matrices, which provide a unified framework to consider some network-induced random phenomena. The process noise and all the sensor measurement noises are assumed to be one-step autocorrelated and different sensor noises are one-step cross-correlated; also, the process noise and each sensor measurement noise are two-step cross-correlated. These correlation assumptions cover many practical situations, where the classical independence hypothesis is not realistic. Using an innovation methodology, local least-squares linear filtering estimators are recursively obtained at each sensor. The distributed fusion method is then used to form the optimal matrix-weighted sum of these local filters according to the mean squared error criterion. A numerical simulation example shows the accuracy of the proposed distributed fusion filtering algorithm and illustrates some of the network-induced stochastic uncertainties that can be dealt with in the current system model, such as sensor gain degradation, missing measurements, and multiplicative noise.

  18. Accuracy and uncertainty in random speckle modulation transfer function measurement of infrared focal plane arrays

    Science.gov (United States)

    Barnard, Kenneth J.; Jacobs, Eddie L.; Plummer, Philip J.

    2016-12-01

    This paper expands upon a previously reported random speckle technique for measuring the modulation transfer function of midwave infrared focal plane arrays by considering a number of factors that impact the accuracy of the estimated modulation transfer function. These factors arise from assumptions in the theoretical derivation and bias in the estimation procedure. Each factor is examined and guidelines are determined to maintain accuracy within 2% of the true value. The uncertainty of the measurement is found by applying a one-factor ANOVA analysis and confidence intervals are established for the results. The small magnitude of the confidence intervals indicates a very robust technique capable of distinguishing differences in modulation transfer function among focal plane arrays on the order of a few percent. This analysis directly indicates the high quality of the random speckle modulation transfer function measurement technique. The methodology is applied to a focal plane array and results are presented that emphasize the need for generating independent random speckle realizations to accurately assess measured values.

  19. A relative entropy method to measure non-exponential random data

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Yingjie; Chen, Wen, E-mail: chenwen@hhu.edu.cn

    2015-01-23

    This paper develops a relative entropy method to measure non-exponential random data in conjunction with the fractional order moment, logarithmic moment and tail statistics of the Mittag–Leffler distribution. The distribution of non-exponential random data follows neither the exponential distribution nor exponential decay. The proposed strategy is validated by analyzing experimental data generated by the Monte Carlo method using the Mittag–Leffler distribution. Compared with the traditional Shannon entropy, the relative entropy method is simple to implement, and its corresponding relative entropies, approximated by the fractional order moment, logarithmic moment and tail statistics, can easily and accurately detect non-exponential random data. - Highlights: • A relative entropy method is developed to measure non-exponential random data. • The fractional order moment, logarithmic moment and tail statistics are employed. • The three strategies of the Mittag–Leffler distribution can be accurately established. • Compared with Shannon entropy, the relative entropy method is easy to implement.

  20. Probably not: future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  1. A Procedure for Accurately Measuring the Shaker Overturning Moment During Random Vibration Tests

    Science.gov (United States)

    Nayeri, Reza D.

    2011-01-01

    Motivation: For large system-level random vibration tests, there may be some concerns about the shaker's capability for the overturning moment. It is the test conductor's responsibility to predict and monitor the overturning moment during random vibration tests. If the predicted moment is close to the shaker's capability, the test conductor must measure the instantaneous moment at low levels and extrapolate to higher levels. That data will be used to decide whether it is safe to proceed to the next test level. Challenge: The Kistler analog formulation for computing the real-time moment is only applicable to very limited cases in which 3 or 4 load cells are installed at the shaker interface with the hardware. Approach: To overcome that limitation, a simple procedure was developed for computing the overturning moment time histories using the measured time histories of the individual load cells.
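
    The basic computation behind the approach can be sketched directly: with the (x, y) position of each load cell relative to the shaker center and its measured axial force time history, the overturning moment time histories are weighted sums over the cells. The geometry, sign convention, channel count, and sample data below are hypothetical, not the fixture of the report.

        # Overturning moment time histories from individual load-cell force
        # time histories and their positions. Geometry and data are hypothetical.
        import numpy as np

        positions = np.array([[ 0.5,  0.5],      # (x, y) of each load cell [m]
                              [-0.5,  0.5],
                              [-0.5, -0.5],
                              [ 0.5, -0.5]])

        t = np.linspace(0.0, 1.0, 2048)
        forces = np.vstack([1000.0 * np.random.default_rng(i).normal(size=t.size)
                            for i in range(len(positions))])   # axial force per cell [N]

        Mx = positions[:, 1] @ forces        # sum_i  y_i * Fz_i(t)   [N*m]
        My = -(positions[:, 0] @ forces)     # sum_i -x_i * Fz_i(t)   [N*m]
        print("peak |Mx|:", round(float(np.abs(Mx).max()), 1), "N*m",
              " peak |My|:", round(float(np.abs(My).max()), 1), "N*m")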

  2. Three-dimensional shape measurement using color random binary encoding pattern projection

    Science.gov (United States)

    Zhou, Pei; Zhu, Jiangping; Su, Xianyu; Jing, Hailong; Zhang, Xing

    2017-10-01

    Acquiring the three-dimensional (3-D) surface geometry of objects at full-frame resolution is of great concern in many applications. This paper reports a 3-D measurement scheme based on single-frame pattern projection that combines random binary encoding and color encoding. Three computer-generated random binary encoding patterns embedded in the three channels of a color pattern form a color binary encoding pattern. Two color cameras in a stereo-vision arrangement simultaneously capture the measured scene under the proposed encoded structured illumination. From the captured images, three encoding images are extracted and analyzed using the extended spatial-temporal correlation algorithm for 3-D reconstruction. A theoretical explanation and analysis of the encoding principle and reconstruction algorithm, followed by experiments reconstructing the 3-D geometry of stationary and dynamic scenes, show the feasibility and practicality of the proposed method.
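
    Generating the projected pattern itself is straightforward: three independent random binary patterns are placed in the R, G and B channels of one image. The sketch below does only this step, with an arbitrary pattern and speckle (block) size; the stereo correlation and 3-D reconstruction are not shown.

        # Color random binary encoding pattern: three independent binary
        # patterns, one per color channel. Sizes are arbitrary assumptions.
        import numpy as np

        def color_random_binary_pattern(height=768, width=1024, block=4, seed=0):
            rng = np.random.default_rng(seed)
            h, w = height // block, width // block
            channels = rng.integers(0, 2, size=(3, h, w), dtype=np.uint8) * 255
            channels = channels.repeat(block, axis=1).repeat(block, axis=2)    # enlarge cells to blocks
            return np.stack([channels[0], channels[1], channels[2]], axis=-1)  # (height, width, 3)

        pattern = color_random_binary_pattern()
        print(pattern.shape, pattern.dtype, np.unique(pattern))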

  3. Echocardiographic methods, quality review, and measurement accuracy in a randomized multicenter clinical trial of Marfan syndrome.

    Science.gov (United States)

    Selamet Tierney, Elif Seda; Levine, Jami C; Chen, Shan; Bradley, Timothy J; Pearson, Gail D; Colan, Steven D; Sleeper, Lynn A; Campbell, M Jay; Cohen, Meryl S; De Backer, Julie; Guey, Lin T; Heydarian, Haleh; Lai, Wyman W; Lewin, Mark B; Marcus, Edward; Mart, Christopher R; Pignatelli, Ricardo H; Printz, Beth F; Sharkey, Angela M; Shirali, Girish S; Srivastava, Shubhika; Lacro, Ronald V

    2013-06-01

    The Pediatric Heart Network is conducting a large international randomized trial to compare aortic root growth and other cardiovascular outcomes in 608 subjects with Marfan syndrome randomized to receive atenolol or losartan for 3 years. The authors report here the echocardiographic methods and baseline echocardiographic characteristics of the randomized subjects, describe the interobserver agreement of aortic measurements, and identify factors influencing agreement. Individuals aged 6 months to 25 years who met the original Ghent criteria and had body surface area-adjusted maximum aortic root diameter (ROOTmax) Z scores > 3 were eligible for inclusion. The primary outcome measure for the trial is the change over time in ROOTmax Z score. A detailed echocardiographic protocol was established and implemented across 22 centers, with an extensive training and quality review process. Interobserver agreement for the aortic measurements was excellent, with intraclass correlation coefficients ranging from 0.921 to 0.989. Lower interobserver percentage error in ROOTmax measurements was independently associated (model R² = 0.15) with better image quality (P = .002) and later study reading date (P < .001). Echocardiographic characteristics of the randomized subjects did not differ by treatment arm. Subjects with ROOTmax Z scores ≥ 4.5 (36%) were more likely to have mitral valve prolapse and dilation of the main pulmonary artery and left ventricle, but there were no differences in aortic regurgitation, aortic stiffness indices, mitral regurgitation, or left ventricular function compared with subjects with ROOTmax Z scores < 4.5. The echocardiographic methodology, training, and quality review process resulted in a robust evaluation of aortic root dimensions, with excellent reproducibility. Copyright © 2013 American Society of Echocardiography. Published by Mosby, Inc. All rights reserved.

  4. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  5. Nonprobability Web surveys to measure sexual behaviors and attitudes in the general population: a comparison with a probability sample interview survey.

    Science.gov (United States)

    Erens, Bob; Burkill, Sarah; Couper, Mick P; Conrad, Frederick; Clifton, Soazig; Tanton, Clare; Phelps, Andrew; Datta, Jessica; Mercer, Catherine H; Sonnenberg, Pam; Prah, Philip; Mitchell, Kirstin R; Wellings, Kaye; Johnson, Anne M; Copas, Andrew J

    2014-12-08

    Nonprobability Web surveys using volunteer panels can provide a relatively cheap and quick alternative to traditional health and epidemiological surveys. However, concerns have been raised about their representativeness. The aim was to compare results from different Web panels with a population-based probability sample survey (n=8969 aged 18-44 years) that used computer-assisted self-interview (CASI) for sensitive behaviors, the third British National Survey of Sexual Attitudes and Lifestyles (Natsal-3). Natsal-3 questions were included on 4 nonprobability Web panel surveys (n=2000 to 2099), 2 using basic quotas based on age and sex, and 2 using modified quotas based on additional variables related to key estimates. Results for sociodemographic characteristics were compared with external benchmarks and for sexual behaviors and opinions with Natsal-3. Odds ratios (ORs) were used to express differences between the benchmark data and each survey for each variable of interest. A summary measure of survey performance was the average absolute OR across variables. Another summary measure was the number of key estimates for which the survey differed significantly (at the 5% level) from the benchmarks. For sociodemographic variables, the Web surveys were less representative of the general population than Natsal-3. For example, for men, the average absolute OR for Natsal-3 was 1.14, whereas for the Web surveys the average absolute ORs ranged from 1.86 to 2.30. For all Web surveys, approximately two-thirds of the key estimates of sexual behaviors were different from Natsal-3 and the average absolute ORs ranged from 1.32 to 1.98. Differences were appreciable even for questions asked by CASI in Natsal-3. No single Web survey performed consistently better than any other did. Modified quotas slightly improved results for men, but not for women. Consistent with studies from other countries on less sensitive topics, volunteer Web panels provided appreciably biased estimates. The

  6. Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey

    Science.gov (United States)

    Burkill, Sarah; Couper, Mick P; Conrad, Frederick; Clifton, Soazig; Tanton, Clare; Phelps, Andrew; Datta, Jessica; Mercer, Catherine H; Sonnenberg, Pam; Prah, Philip; Mitchell, Kirstin R; Wellings, Kaye; Johnson, Anne M; Copas, Andrew J

    2014-01-01

    Background Nonprobability Web surveys using volunteer panels can provide a relatively cheap and quick alternative to traditional health and epidemiological surveys. However, concerns have been raised about their representativeness. Objective The aim was to compare results from different Web panels with a population-based probability sample survey (n=8969 aged 18-44 years) that used computer-assisted self-interview (CASI) for sensitive behaviors, the third British National Survey of Sexual Attitudes and Lifestyles (Natsal-3). Methods Natsal-3 questions were included on 4 nonprobability Web panel surveys (n=2000 to 2099), 2 using basic quotas based on age and sex, and 2 using modified quotas based on additional variables related to key estimates. Results for sociodemographic characteristics were compared with external benchmarks and for sexual behaviors and opinions with Natsal-3. Odds ratios (ORs) were used to express differences between the benchmark data and each survey for each variable of interest. A summary measure of survey performance was the average absolute OR across variables. Another summary measure was the number of key estimates for which the survey differed significantly (at the 5% level) from the benchmarks. Results For sociodemographic variables, the Web surveys were less representative of the general population than Natsal-3. For example, for men, the average absolute OR for Natsal-3 was 1.14, whereas for the Web surveys the average absolute ORs ranged from 1.86 to 2.30. For all Web surveys, approximately two-thirds of the key estimates of sexual behaviors were different from Natsal-3 and the average absolute ORs ranged from 1.32 to 1.98. Differences were appreciable even for questions asked by CASI in Natsal-3. No single Web survey performed consistently better than any other did. Modified quotas slightly improved results for men, but not for women. Conclusions Consistent with studies from other countries on less sensitive topics, volunteer Web

  7. Variances as order parameter and complexity measure for random Boolean networks

    Energy Technology Data Exchange (ETDEWEB)

    Luque, Bartolo [Departamento de Matemática Aplicada y Estadística, Escuela Superior de Ingenieros Aeronáuticos, Universidad Politécnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Ballesteros, Fernando J [Observatori Astronòmic, Universitat de València, Ed. Instituts d'Investigació, Pol. La Coma s/n, E-46980 Paterna, Valencia (Spain); Fernandez, Manuel [Departamento de Matemática Aplicada y Estadística, Escuela Superior de Ingenieros Aeronáuticos, Universidad Politécnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain)

    2005-02-04

    Several order parameters have been considered to predict and characterize the transition between the ordered and disordered phases in random Boolean networks, such as the Hamming distance between replicas or the stable core, both of which have been used successfully. In this work, we propose a natural and clear new order parameter: the temporal variance. We compute its value analytically and compare it with the results of numerical experiments. Finally, we propose a complexity measure based on the compromise between temporal and spatial variances. This new order parameter and its related complexity measure can be easily applied to other complex systems.
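
    The order parameter itself is easy to reproduce numerically: simulate a random NK Boolean network, discard a transient, and average the variance of each node's 0/1 time series over the nodes. The network size, connectivity values, and run lengths below are arbitrary choices for illustration.

        # Temporal variance of node states in a random NK Boolean network,
        # averaged over nodes; parameters are illustrative.
        import numpy as np

        def mean_temporal_variance(n=200, k=2, t_transient=500, t_measure=500, seed=0):
            rng = np.random.default_rng(seed)
            inputs = rng.integers(0, n, size=(n, k))        # K random inputs per node
            tables = rng.integers(0, 2, size=(n, 2 ** k))   # random Boolean functions
            state = rng.integers(0, 2, size=n)
            powers = 2 ** np.arange(k)

            history = []
            for t in range(t_transient + t_measure):
                idx = state[inputs] @ powers                # encode each node's input pattern
                state = tables[np.arange(n), idx]
                if t >= t_transient:
                    history.append(state.copy())
            history = np.array(history)                     # shape (t_measure, n)
            return history.var(axis=0).mean()

        for k in (1, 2, 3, 4):                              # K = 2 is the classical critical connectivity
            print("K =", k, " mean temporal variance =", round(float(mean_temporal_variance(k=k)), 4))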

  8. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Szõcs Attila

    2014-07-01

    Our objective is to provide a framework for measuring brand equity, that is, the added value endowed to the product by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand-side effect) on brand value (supply-side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is more suitable from the perspective of the structural link between brand equity and brand value. Our model is based on recent developments in random coefficients model applications.

  9. Generalizations of SRB Measures to Nonautonomous, Random, and Infinite Dimensional Systems

    Science.gov (United States)

    Young, Lai-Sang

    2017-02-01

    We review some developments that are direct outgrowths of, or closely related to, the idea of SRB measures as introduced by Sinai, Ruelle and Bowen in the 1970s. These new directions of research include the emergence of strange attractors in periodically forced dynamical systems, random attractors in systems defined by stochastic differential equations, SRB measures for infinite dimensional systems including those defined by large classes of dissipative PDEs, quasi-static distributions for slowly varying time-dependent systems, and surviving distributions in leaky dynamical systems.

  10. Probability Analysis of a Quantum Computer

    OpenAIRE

    Einarsson, Göran

    2003-01-01

    The quantum computer algorithm by Peter Shor for the factorization of integers is studied. The quantum nature of a quantum computer makes its outcome random. The output probability distribution is investigated and the chances of a successful operation are determined.

  11. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  12. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  13. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  14. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  15. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  16. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  17. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  18. Gendist: An R Package for Generated Probability Distribution Models.

    Directory of Open Access Journals (Sweden)

    Shaiful Anuar Abu Bakar

    Full Text Available In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.

  19. Gendist: An R Package for Generated Probability Distribution Models.

    Science.gov (United States)

    Abu Bakar, Shaiful Anuar; Nadarajah, Saralees; Absl Kamarul Adzhar, Zahrul Azmir; Mohamed, Ibrahim

    2016-01-01

    In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
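
    gendist itself is an R package; as a rough illustration of the kind of generated model it handles, the following Python sketch builds a two-component mixture and exposes the pdf, cdf, quantile and random-generation functions (the component choices and mixture weight are arbitrary assumptions):

```python
# Illustration only: a two-component mixture with pdf, cdf, numeric quantile
# and random generation, analogous in spirit to the mixture model of gendist.
import numpy as np
from scipy import stats, optimize

w = 0.7                               # mixture weight (assumption)
c1 = stats.gamma(a=2.0, scale=1.0)    # first component
c2 = stats.lognorm(s=0.8, scale=3.0)  # second component

pdf = lambda x: w * c1.pdf(x) + (1 - w) * c2.pdf(x)
cdf = lambda x: w * c1.cdf(x) + (1 - w) * c2.cdf(x)

def quantile(q):
    # invert the cdf numerically (no closed form for a mixture)
    return optimize.brentq(lambda x: cdf(x) - q, 1e-9, 1e3)

def rvs(n, rng=np.random.default_rng(0)):
    pick = rng.random(n) < w          # latent component indicator
    return np.where(pick, c1.rvs(n, random_state=rng), c2.rvs(n, random_state=rng))

x = rvs(10_000)
print(quantile(0.99), np.quantile(x, 0.99))   # tail quantile: analytic vs simulated
```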

  20. Measuring edge importance: a quantitative analysis of the stochastic shielding approximation for random processes on graphs.

    Science.gov (United States)

    Schmidt, Deena R; Thomas, Peter J

    2014-04-17

    Mathematical models of cellular physiological mechanisms often involve random walks on graphs representing transitions within networks of functional states. Schmandt and Galán recently introduced a novel stochastic shielding approximation as a fast, accurate method for generating approximate sample paths from a finite state Markov process in which only a subset of states are observable. For example, in ion-channel models, such as the Hodgkin-Huxley or other conductance-based neural models, a nerve cell has a population of ion channels whose states comprise the nodes of a graph, only some of which allow a transmembrane current to pass. The stochastic shielding approximation consists of neglecting fluctuations in the dynamics associated with edges in the graph not directly affecting the observable states. We consider the problem of finding the optimal complexity reducing mapping from a stochastic process on a graph to an approximate process on a smaller sample space, as determined by the choice of a particular linear measurement functional on the graph. The partitioning of ion-channel states into conducting versus nonconducting states provides a case in point. In addition to establishing that Schmandt and Galán's approximation is in fact optimal in a specific sense, we use recent results from random matrix theory to provide heuristic error estimates for the accuracy of the stochastic shielding approximation for an ensemble of random graphs. Moreover, we provide a novel quantitative measure of the contribution of individual transitions within the reaction graph to the accuracy of the approximate process.

  1. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  2. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  3. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  4. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  5. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
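
    The paper derives exact simultaneous 1-α intervals; the sketch below only illustrates the idea with a crude Monte Carlo envelope for the order statistics of standard-normal samples (the search grid and the standardisation of the observed sample are assumptions, not the authors' construction):

```python
# Monte Carlo sketch of a simultaneous band for a normal probability plot.
import numpy as np

rng = np.random.default_rng(1)

def simultaneous_envelope(n, alpha=0.05, nsim=20_000):
    """Simultaneous band for the order statistics of N(0,1) samples of size n."""
    sims = np.sort(rng.standard_normal((nsim, n)), axis=1)
    # start from the conservative Bonferroni band, then widen the pointwise
    # level until simultaneous coverage falls below 1 - alpha
    lo_best = np.quantile(sims, alpha / (2 * n), axis=0)
    hi_best = np.quantile(sims, 1 - alpha / (2 * n), axis=0)
    for a in np.linspace(alpha / n, alpha, 100):
        lo = np.quantile(sims, a / 2, axis=0)
        hi = np.quantile(sims, 1 - a / 2, axis=0)
        coverage = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
        if coverage < 1 - alpha:
            break
        lo_best, hi_best = lo, hi
    return lo_best, hi_best

x = np.sort(rng.standard_normal(50))      # observed sample (here truly normal)
z = (x - x.mean()) / x.std(ddof=1)        # crude standardisation
lo, hi = simultaneous_envelope(len(x))
print("all points inside the simultaneous band:",
      bool(np.all((z >= lo) & (z <= hi))))
```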

  6. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  7. The CU 2-D-MAX-DOAS instrument – Part 2: Raman scattering probability measurements and retrieval of aerosol optical properties

    Energy Technology Data Exchange (ETDEWEB)

    Ortega, Ivan; Coburn, Sean; Berg, Larry K.; Lantz, Kathy; Michalsky, Joseph; Ferrare, Richard A.; Hair, Johnathan W.; Hostetler, Chris A.; Volkamer, Rainer

    2016-01-01

    The multiannual global mean of aerosol optical depth at 550 nm (AOD550) over land is ~0.19, and that over oceans is ~0.13. About 45 % of the Earth surface shows AOD550 smaller than 0.1. There is a need for measurement techniques that are optimized to measure aerosol optical properties under low AOD conditions. We present an inherently calibrated retrieval (i.e., no need for radiance calibration) to simultaneously measure AOD and the aerosol phase function parameter, g, based on measurements of azimuth distributions of the Raman scattering probability (RSP), the near-absolute rotational Raman scattering (RRS) intensity. We employ radiative transfer model simulations to show that for solar azimuth RSP measurements at solar elevation and solar zenith angle (SZA) smaller than 80°, RSP is insensitive to the vertical distribution of aerosols and maximally sensitive to changes in AOD and g under near-molecular scattering conditions. The University of Colorado two-dimensional Multi-AXis Differential Optical Absorption Spectroscopy (CU 2-D-MAX-DOAS) instrument was deployed as part of the Two Column Aerosol Project (TCAP) at Cape Cod, MA, during the summer of 2012 to measure direct sun spectra and RSP from scattered light spectra at solar relative azimuth angles (SRAAs) between 5 and 170°. During two case study days with (1) high aerosol load (17 July, 0.3 < AOD430 < 0.6) and (2) near-molecular scattering conditions (22 July, AOD430 < 0.13) we compare RSP-based retrievals of AOD430 and g with data from a co-located CIMEL sun photometer, Multi-Filter Rotating Shadowband Radiometer (MFRSR), and an airborne High Spectral Resolution Lidar (HSRL-2). The average difference (relative to DOAS) for AOD430 is +0.012 ± 0.023 (CIMEL), -0.012 ± 0.024 (MFRSR), -0.011 ± 0.014 (HSRL-2), and +0.023 ± 0.013 (CIMEL AOD - MFRSR AOD) and yields the following

  8. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an on-going study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes measured using time-resolved laser-induced fluorescence on a slow atom beam with branching fractions measured from high resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We will report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5% and the transition probabilities range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.

  9. Impedance measurement using a two-microphone, random-excitation method

    Science.gov (United States)

    Seybert, A. F.; Parrott, T. L.

    1978-01-01

    The feasibility of using a two-microphone, random-excitation technique for the measurement of acoustic impedance was studied. Equations were developed, including the effect of mean flow, which show that acoustic impedance is related to the pressure ratio and phase difference between two points in a duct carrying plane waves only. The impedances of a honeycomb ceramic specimen and a Helmholtz resonator were measured and compared with impedances obtained using the conventional standing-wave method. Agreement between the two methods was generally good. A sensitivity analysis was performed to pinpoint possible error sources and recommendations were made for future study. The two-microphone approach evaluated in this study appears to have some advantages over other impedance measuring techniques.
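
    The abstract summarises the relation between impedance and the pressure ratio at two points without giving the equations; the sketch below uses the standard two-microphone transfer-function form of that relation (no mean flow, with synthetic geometry and reflection coefficient), which may differ in detail from the cross-spectral formulation used in the paper:

```python
# Hedged sketch of the two-microphone transfer-function relation for normal-
# incidence surface impedance; the geometry and frequency are made up.
import numpy as np

c = 343.0                      # speed of sound, m/s
f = 1000.0                     # analysis frequency, Hz
k = 2 * np.pi * f / c          # wavenumber (no mean flow)
x1, x2 = 0.10, 0.06            # microphone distances from the sample surface, m
s = x1 - x2                    # microphone spacing

def normalized_impedance(H12):
    """H12 = P2/P1: complex pressure ratio between the two microphone positions."""
    R = (H12 - np.exp(-1j * k * s)) / (np.exp(1j * k * s) - H12) * np.exp(2j * k * x1)
    return (1 + R) / (1 - R)   # Z / (rho * c) under this sign convention

# synthetic check: build the standing-wave field from a known reflection coefficient
R_true = 0.5 * np.exp(1j * 0.3)
p = lambda x: np.exp(1j * k * x) + R_true * np.exp(-1j * k * x)
print(normalized_impedance(p(x2) / p(x1)))     # recovered from the pressure ratio
print((1 + R_true) / (1 - R_true))             # should match
```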

  10. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling.

    Science.gov (United States)

    Deglint, Jason; Kazemzadeh, Farnoud; Cho, Daniel; Clausi, David A; Wong, Alexander

    2016-06-27

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor measurements from light spectra hitting the sensor is constructed based on a comprehensive spectral characterization of the sensor. A numerical demultiplexer is then learned via non-linear random forest modeling based on the forward model. Given the learned numerical demultiplexer, one can then demultiplex simultaneously-acquired measurements made by the color image sensor into reflectance intensities at discrete selectable wavelengths, resulting in a higher resolution reflectance spectrum. Experimental results demonstrate the feasibility of such a method for the purpose of simultaneous multispectral imaging.

  11. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling

    Science.gov (United States)

    Deglint, Jason; Kazemzadeh, Farnoud; Cho, Daniel; Clausi, David A.; Wong, Alexander

    2016-06-01

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor measurements from light spectra hitting the sensor is constructed based on a comprehensive spectral characterization of the sensor. A numerical demultiplexer is then learned via non-linear random forest modeling based on the forward model. Given the learned numerical demultiplexer, one can then demultiplex simultaneously-acquired measurements made by the color image sensor into reflectance intensities at discrete selectable wavelengths, resulting in a higher resolution reflectance spectrum. Experimental results demonstrate the feasibility of such a method for the purpose of simultaneous multispectral imaging.
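
    As a hedged illustration of the general idea (synthetic toy filter responses and spectra, not the paper's sensor characterisation), the following sketch trains a random forest to map three-channel sensor measurements back to a multi-band spectrum:

```python
# Toy numerical "demultiplexer": learn the inverse of a synthetic forward model
# mapping spectra to 3-channel sensor values, using random forest regression.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
W = 16                                            # number of spectral bands
wl = np.linspace(400, 700, W)                     # wavelengths in nm

# toy forward model: three Gaussian-shaped filter responses (B, G, R)
centers = np.array([450.0, 550.0, 650.0])
F = np.exp(-0.5 * ((wl[None, :] - centers[:, None]) / 40.0) ** 2)   # shape 3 x W

def random_spectra(n):
    # smooth random reflectance spectra built from a few random bumps
    amps = rng.random((n, 4))
    mus = rng.uniform(400, 700, (n, 4))
    s = (amps[:, :, None]
         * np.exp(-0.5 * ((wl[None, None, :] - mus[:, :, None]) / 60.0) ** 2)).sum(1)
    return s / s.max(axis=1, keepdims=True)

spectra = random_spectra(5000)                    # training "scenes"
measurements = spectra @ F.T + 0.01 * rng.standard_normal((5000, 3))

demux = RandomForestRegressor(n_estimators=100, random_state=0)
demux.fit(measurements, spectra)                  # sensor values -> spectrum

test = random_spectra(5)
recovered = demux.predict(test @ F.T)
print(np.abs(recovered - test).mean())            # mean reconstruction error
```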

  12. Random walk-based similarity measure method for patterns in complex object

    Directory of Open Access Journals (Sweden)

    Liu Shihu

    2017-04-01

    Full Text Available This paper discusses the similarity of the patterns in complex objects. A complex object is composed both of the attribute information of patterns and the relational information between patterns. Bearing in mind the specificity of complex objects, a random walk-based similarity measurement method for patterns is constructed. In this method, the reachability of any two patterns with respect to the relational information is fully studied, so that the similarity of patterns with respect to the relational information can be calculated. On this basis, an integrated similarity measurement method is proposed, and Algorithms 1 and 2 show the calculation procedure. One can see that this method makes full use of both the attribute information and the relational information. Finally, a synthetic example validates the proposed similarity measurement method.
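
    A hedged sketch of the general idea (not the paper's Algorithms 1 and 2): combine an attribute-based similarity with a relational similarity derived from random-walk reachability on the pattern graph, using an arbitrary mixing weight:

```python
# Toy integrated similarity: attribute similarity plus random-walk reachability.
import numpy as np

rng = np.random.default_rng(0)
n = 6
X = rng.random((n, 4))                         # attribute information of patterns
A = (rng.random((n, n)) < 0.4).astype(float)   # relational information (adjacency)
np.fill_diagonal(A, 0)

# random-walk transition matrix and discounted multi-step reachability
P = A / np.clip(A.sum(axis=1, keepdims=True), 1, None)
reach = sum(0.5 ** t * np.linalg.matrix_power(P, t) for t in range(1, 6))

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def similarity(i, j, alpha=0.5):
    attr = cosine(X[i], X[j])                  # attribute similarity
    rel = cosine(reach[i], reach[j])           # relational (reachability) similarity
    return alpha * attr + (1 - alpha) * rel    # integrated similarity

print(similarity(0, 1), similarity(0, 2))
```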

  13. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = {u(t,x)} that solves a non-linear system of stochastic wave equations in spatial dimensions k ∈ {1,2,3}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent β. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of R^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap

  14. Robust state estimation for double pantographs with random missing measurements in high-speed railway

    DEFF Research Database (Denmark)

    Lu, Xiaobing; Liu, Zhigang; Wang, Yanbo

    2016-01-01

    Active control of the pantograph can be performed to decrease the fluctuation in pantograph-catenary contact force (PCCF) in high-speed railways. However, it is difficult to obtain the states of the pantograph when state feedback control is implemented, and the measurements may be randomly missing due ..., the RRSEM is introduced to estimate the pantograph states based on the dynamic model. The simulation results indicate that the proposed RRSEM is able to accurately estimate the states of the leading pantograph (LP) and the trailing pantograph (TP).

  15. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  16. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  17. Invariant measures and error bounds for random walks in the quarter-plane based on sums of geometric terms

    NARCIS (Netherlands)

    Chen, Y.; Boucherie, Richardus J.; Goseling, Jasper

    2016-01-01

    We consider homogeneous random walks in the quarter-plane. The necessary conditions which characterize random walks of which the invariant measure is a sum of geometric terms are provided in Chen et al. (arXiv:1304.3316, 2013, Probab Eng Informational Sci 29(02):233–251, 2015). Based on these

  18. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  19. Measuring Edge Importance: A Quantitative Analysis of the Stochastic Shielding Approximation for Random Processes on Graphs

    Science.gov (United States)

    2014-01-01

    Mathematical models of cellular physiological mechanisms often involve random walks on graphs representing transitions within networks of functional states. Schmandt and Galán recently introduced a novel stochastic shielding approximation as a fast, accurate method for generating approximate sample paths from a finite state Markov process in which only a subset of states are observable. For example, in ion-channel models, such as the Hodgkin–Huxley or other conductance-based neural models, a nerve cell has a population of ion channels whose states comprise the nodes of a graph, only some of which allow a transmembrane current to pass. The stochastic shielding approximation consists of neglecting fluctuations in the dynamics associated with edges in the graph not directly affecting the observable states. We consider the problem of finding the optimal complexity reducing mapping from a stochastic process on a graph to an approximate process on a smaller sample space, as determined by the choice of a particular linear measurement functional on the graph. The partitioning of ion-channel states into conducting versus nonconducting states provides a case in point. In addition to establishing that Schmandt and Galán’s approximation is in fact optimal in a specific sense, we use recent results from random matrix theory to provide heuristic error estimates for the accuracy of the stochastic shielding approximation for an ensemble of random graphs. Moreover, we provide a novel quantitative measure of the contribution of individual transitions within the reaction graph to the accuracy of the approximate process. PMID:24742077

  20. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  1. Random walk hierarchy measure: What is more hierarchical, a chain, a tree or a star?

    CERN Document Server

    Czégel, Dániel

    2015-01-01

    Signs of hierarchy are prevalent in a wide range of systems in nature and society. One of the key problems is quantifying the importance of hierarchical organisation in the structure of the network representing the interactions or connections between the fundamental units of the studied system. Although a number of notable methods are already available, the vast majority of them treat all directed acyclic graphs as already maximally hierarchical. Here we propose a hierarchy measure based on random walks on the network. The novelty of our approach is that directed trees corresponding to multi-level pyramidal structures obtain higher hierarchy scores than directed chains and directed stars. Furthermore, in the thermodynamic limit the hierarchy measure of regular trees converges to a well-defined limit depending only on the branching number. When applied to real networks, our method is computationally very effective, as the result can be evaluated with arbitrary precision by subsequent multiplications...

  2. Errors due to random noise in velocity measurement using incoherent-scatter radar

    Directory of Open Access Journals (Sweden)

    P. J. S. Williams

    Full Text Available The random-noise errors involved in measuring the Doppler shift of an 'incoherent-scatter' spectrum are predicted theoretically for all values of Te/Ti from 1.0 to 3.0. After correction has been made for the effects of convolution during transmission and reception and the additional errors introduced by subtracting the average of the background gates, the rms errors can be expressed by a simple semi-empirical formula. The observed errors are determined from a comparison of simultaneous EISCAT measurements using an identical pulse code on several adjacent frequencies. The plot of observed versus predicted error has a slope of 0.991 and a correlation coefficient of 99.3%. The prediction also agrees well with the mean of the error distribution reported by the standard EISCAT analysis programme.

  3. Errors due to random noise in velocity measurement using incoherent-scatter radar

    Directory of Open Access Journals (Sweden)

    P. J. S. Williams

    1996-12-01

    Full Text Available The random-noise errors involved in measuring the Doppler shift of an 'incoherent-scatter' spectrum are predicted theoretically for all values of Te/Ti from 1.0 to 3.0. After correction has been made for the effects of convolution during transmission and reception and the additional errors introduced by subtracting the average of the background gates, the rms errors can be expressed by a simple semi-empirical formula. The observed errors are determined from a comparison of simultaneous EISCAT measurements using an identical pulse code on several adjacent frequencies. The plot of observed versus predicted error has a slope of 0.991 and a correlation coefficient of 99.3%. The prediction also agrees well with the mean of the error distribution reported by the standard EISCAT analysis programme.

  4. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering a dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This is in contradistinction to a sparse system, in which the number of balls is smaller than the number of boxes (e.g. the energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
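
    The claim about the dense system can be checked directly with stars-and-bars counting; the short calculation below (parameter values are arbitrary) shows that the occupancy of a fixed box is most probably zero when all configurations are equally likely:

```python
# Stars-and-bars check: with all configurations of P indistinguishable balls in
# L distinguishable boxes equally likely, a fixed box most probably holds 0 balls.
from math import comb

P, L = 1000, 10                       # dense system: P >> L
total = comb(P + L - 1, L - 1)        # total number of configurations

def prob_box_has(k):
    # configurations in which a fixed box holds exactly k balls
    return comb(P - k + L - 2, L - 2) / total

print([prob_box_has(k) for k in range(6)])           # decreasing: k = 0 is most likely
print(sum(prob_box_has(k) for k in range(P + 1)))    # sanity check: sums to 1
```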

  5. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean Geometry or Peano Arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.

  6. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  7. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  8. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  10. Noninvasive Techniques for Blood Pressure Measurement Are Not a Reliable Alternative to Direct Measurement: A Randomized Crossover Trial in ICU

    Directory of Open Access Journals (Sweden)

    Sara Ribezzo

    2014-01-01

    Full Text Available Introduction. Noninvasive blood pressure (NIBP) monitoring methods are widely used in critically ill patients despite poor evidence of their accuracy. The erroneous interpretations of blood pressure (BP) may lead to clinical errors. Objectives. To test the accuracy and reliability of aneroid (ABP) and oscillometric (OBP) devices compared to invasive BP (IBP) monitoring in an ICU population. Materials and Methods. Fifty adult patients (200 comparisons) were included in a randomized crossover trial. BP was recorded simultaneously by IBP and either by ABP or by OBP, taking IBP as gold standard. Results. Compared with ABP, IBP systolic values were significantly higher (mean difference ± standard deviation 9.74±13.8; P<0.0001). Both diastolic (-5.13±7.1; P<0.0001) and mean (-2.14±7.1; P=0.0033) IBP were instead lower. Compared with OBP, systolic (10.80±14.9; P<0.0001) and mean (5.36±7.1; P<0.0001) IBP were higher, while diastolic IBP (-3.62±6.0; P<0.0001) was lower. Bland-Altman plots showed wide limits of agreement in both NIBP-IBP comparisons. Conclusions. BP measurements with different devices produced significantly different results. Since in critically ill patients the importance of BP readings is often crucial, noninvasive techniques cannot be regarded as reliable alternatives to direct measurements.

  11. Probability density of quantum expectation values

    Energy Technology Data Exchange (ETDEWEB)

    Campos Venuti, L., E-mail: lcamposv@usc.edu; Zanardi, P.

    2013-10-30

    We consider the quantum expectation value 〈A〉 = 〈ψ|A|ψ〉 of an observable A over the state |ψ〉. We derive the exact probability distribution of 〈A〉 seen as a random variable when |ψ〉 varies over the set of all pure states equipped with the Haar-induced measure. To illustrate our results we compare the exact predictions for a few concrete examples with the concentration bounds obtained using Levy's lemma. We also comment on the relevance of the central limit theorem and finally draw some results on an alternative statistical mechanics based on the uniform measure on the energy shell. - Highlights: • We compute the probability distribution of quantum expectation values for states sampled uniformly. • As a special case we consider in some detail the degenerate case where A is a one-dimensional projector. • We compare the concentration results obtained using Levy's lemma with the exact values obtained using our exact formulae. • We comment on the possibility of a Central Limit Theorem and show the approach to Gaussian for a few physical operators. • Some implications of our results for the so-called “Quantum Microcanonical Equilibration” (Refs. [5–9]) are derived.
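
    For the special case mentioned in the highlights, where A is a one-dimensional projector, the distribution can be checked numerically by sampling Haar-random pure states (the dimension and sample size below are arbitrary choices, and the closed form quoted in the comment is the standard Beta(1, d-1) result rather than a formula taken from the paper):

```python
# Sample Haar-random pure states and look at <psi|A|psi> for A = |0><0|.
import numpy as np

rng = np.random.default_rng(0)
d, nsamp = 8, 200_000

def haar_states(d, n):
    z = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
    return z / np.linalg.norm(z, axis=1, keepdims=True)   # uniform on the unit sphere

psi = haar_states(d, nsamp)
a = np.abs(psi[:, 0]) ** 2          # expectation value of the rank-1 projector

# for a rank-1 projector the law is Beta(1, d-1): P(a > x) = (1 - x)^(d - 1)
x = 0.3
print(np.mean(a > x), (1 - x) ** (d - 1))
```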

  12. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  13. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  14. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  15. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  16. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  17. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  18. Statistics of adaptive optics speckles: From probability cloud to probability density function

    OpenAIRE

    Yaitskova, Natalia; Gladysz, Szymon

    2016-01-01

    The complex amplitude in the focal plane of an adaptive optics system is modelled as an elliptical complex random variable. The geometrical properties of the probability density function of such a variable relate directly to the statistics of the residual phase. Building solely on the two-dimensional geometry, the expression for the probability density function of speckle intensity is derived.

  19. Cluster-randomized trial of a German leisure-based alcohol peer education measure.

    Science.gov (United States)

    Bühler, Anneke; Thrul, Johannes; Strüber, Evelin; Orth, Boris

    2016-06-01

    Because of scarce research, the effectiveness of substance abuse prevention in leisure settings remains unclear. In this study, we evaluated the effectiveness of a peer-led educational prevention measure with adolescent groups in unstructured leisure settings, which is a component of the complex German nationwide 'Na Toll!' campaign. Using a cluster-randomized two-group post-test-only design, we tested whether the measure influenced component-specific goals, namely risk and protective factors of alcohol use such as risk perception, group communication and resistance self-efficacy. The sample consisted of 738 adolescents aged 12-20 years who were recruited at recreational locations and completed an online questionnaire 1 week after the peer education or recruitment event. Sixty-three percent of the sample participated in the 3-month follow-up assessment. Data analysis revealed post-test effects on risk perception, perceived norm of alcohol communication in the peer group and resistance self-efficacy. Follow-up effects were not observed, with the exception of a significant effect on risk perception. In conclusion, the peer-led education measure in leisure settings might have supported the adolescents in this study to perceive alcohol-related risks, to feel accepted to talk about alcohol problems with their friends and to be more assertive in resisting alcohol use in the short term. © The Author 2015. Published by Oxford University Press.

  20. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  1. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  2. Wave localization of linear gravity waves in shallow water: Global measurements and agreement between random matrix theory and experiments

    Science.gov (United States)

    Schmessane, Andrea; Laboratory of matter out equilibrium Team

    2012-11-01

    Wave localization explains how a perturbation is trapped by the randomness present in a propagation medium. As it propagates, the localized wave amplitude decreases strongly through multiple internal reflections from randomly positioned scatterers, effectively trapping the perturbation inside the random region. The characteristic length over which a localized wave propagates before being extinguished by randomness is called the localization length. We carried out experiments in a quasi-one-dimensional channel with a random bottom in the shallow-water regime for surface gravity waves, using a Fourier Transform Profilometry method, which enables us to obtain global surface measurements. We discuss key aspects of the control of variables, the experimental setup and the implementation of the measurement method. Thus, we can control, measure and evaluate fundamental variables present in the localization phenomenon, such as the type of randomness, scattering intensity and sample length, which allows us to characterize wave localization. We use the scattering matrix method to compare the experimental measurements with theoretical and numerical predictions, using the Lyapunov exponent of the scattering matrix, and discuss their agreement. Conicyt

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Quantum randomness and unpredictability

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Gregg [Quantum Communication and Measurement Laboratory, Department of Electrical and Computer Engineering and Division of Natural Science and Mathematics, Boston University, Boston, MA (United States)

    2017-06-15

    Quantum mechanics is a physical theory supplying probabilities corresponding to expectation values for measurement outcomes. Indeed, its formalism can be constructed with measurement as a fundamental process, as was done by Schwinger, provided that individual measurement outcomes occur in a random way. The randomness appearing in quantum mechanics, as with other forms of randomness, has often been considered equivalent to a form of indeterminism. Here, it is argued that quantum randomness should instead be understood as a form of unpredictability because, amongst other things, indeterminism is not a necessary condition for randomness. For concreteness, an explication of the randomness of quantum mechanics as the unpredictability of quantum measurement outcomes is provided. Finally, it is shown how this view can be combined with the recently introduced view that the very appearance of individual quantum measurement outcomes can be grounded in the Plenitude principle of Leibniz, a principle variants of which have been utilized in physics by Dirac and Gell-Mann in relation to the fundamental processes. This move provides further support to Schwinger's "symbolic" derivation of quantum mechanics from measurement. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
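
    As a small numerical illustration of the definition used here, the rate function I(x) = sup_t [t·x − log M(t)] can be computed by direct optimisation and checked against the known Gaussian answer (the example distribution is an arbitrary choice, not one of the author's cases):

```python
# Legendre-Fenchel transform of the log moment generating function of N(mu, sigma^2),
# compared with the closed form I(x) = (x - mu)^2 / (2 sigma^2).
import numpy as np
from scipy.optimize import minimize_scalar

mu, sigma = 1.0, 2.0
log_mgf = lambda t: mu * t + 0.5 * (sigma * t) ** 2

def rate_function(x):
    # maximise t*x - log M(t) over t (minimise its negative)
    res = minimize_scalar(lambda t: -(t * x - log_mgf(t)))
    return -res.fun

for x in (0.0, 1.0, 3.0):
    print(x, rate_function(x), (x - mu) ** 2 / (2 * sigma ** 2))
```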

  6. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    Energy Technology Data Exchange (ETDEWEB)

    Chelouche, Doron; Pozo-Nuñez, Francisco [Department of Physics, Faculty of Natural Sciences, University of Haifa, Haifa 3498838 (Israel); Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il [Department of Geosciences, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Tel Aviv 6997801 (Israel)

    2017-08-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.

  7. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    Science.gov (United States)

    Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay

    2017-08-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size-luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
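
    A hedged sketch of the von Neumann scheme described above (synthetic light curves, a simple grid search, and plain mean subtraction rather than the paper's exact flux normalisation): shift the second series by a trial lag, merge the two series in time order, and minimise the mean-square successive difference:

```python
# Time-lag estimation by minimising von Neumann's mean-square successive
# difference of the merged, time-ordered light curves.
import numpy as np

rng = np.random.default_rng(0)

# synthetic driver: a smooth random walk sampled irregularly, echoed with lag 20
t_all = np.sort(rng.uniform(0, 400, 400))
f_all = np.cumsum(rng.standard_normal(400)) * 0.1
t1, f1 = t_all[::2], f_all[::2]                        # "continuum" light curve
true_lag = 20.0
t2 = t_all[1::2]
f2 = np.interp(t2 - true_lag, t_all, f_all) + 0.02 * rng.standard_normal(len(t2))

def von_neumann(t, f):
    order = np.argsort(t)
    return np.mean(np.diff(f[order]) ** 2)

def estimate_lag(lags):
    scores = []
    for lag in lags:
        t = np.concatenate([t1, t2 - lag])             # shift the echo back in time
        f = np.concatenate([f1 - f1.mean(), f2 - f2.mean()])
        scores.append(von_neumann(t, f))
    return lags[int(np.argmin(scores))]

lags = np.linspace(0, 60, 121)
print(estimate_lag(lags))                              # should land close to 20
```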

  8. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  9. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  10. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  11. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  12. Aggregate and Individual Replication Probability within an Explicit Model of the Research Process

    Science.gov (United States)

    Miller, Jeff; Schwarz, Wolf

    2011-01-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…

  13. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  14. Randomized Item Response Theory Models

    NARCIS (Netherlands)

    Fox, Gerardus J.A.

    2005-01-01

    The randomized response (RR) technique is often used to obtain answers to sensitive questions. A new method is developed to measure latent variables using the RR technique because direct questioning leads to biased results. Within the RR technique, the probability of the true response is modeled by

  15. Ethanol, not detectably metabolized in brain, significantly reduces brain metabolism, probably via action at specific GABA(A) receptors and has measureable metabolic effects at very low concentrations.

    Science.gov (United States)

    Rae, Caroline D; Davidson, Joanne E; Maher, Anthony D; Rowlands, Benjamin D; Kashem, Mohammed A; Nasrallah, Fatima A; Rallapalli, Sundari K; Cook, James M; Balcar, Vladimir J

    2014-04-01

    Ethanol is a known neuromodulatory agent with reported actions at a range of neurotransmitter receptors. Here, we measured the effect of alcohol on metabolism of [3-¹³C]pyruvate in the adult guinea pig brain cortical tissue slice and compared the outcomes to those from a library of ligands active in the GABAergic system as well as studying the metabolic fate of [1,2-¹³C]ethanol. Analyses of metabolic profile clusters suggest that the significant reductions in metabolism induced by ethanol (10, 30 and 60 mM) are via action at neurotransmitter receptors, particularly α4β3δ receptors, whereas very low concentrations of ethanol may produce metabolic responses owing to release of GABA via GABA transporter 1 (GAT1) and the subsequent interaction of this GABA with local α5- or α1-containing GABA(A)R. There was no measurable metabolism of [1,2-¹³C]ethanol, with no significant incorporation of ¹³C from [1,2-¹³C]ethanol into any measured metabolite above natural abundance, although there were measurable effects on total metabolite sizes similar to those seen with unlabelled ethanol. © 2013 International Society for Neurochemistry.

  16. Calculation of the ultracold neutron upscattering loss probability in fluid walled storage bottles using experimental measurements of the liquid thermomechanical properties of fomblin

    Science.gov (United States)

    Lamoreaux, S. K.; Golub, R.

    2002-10-01

    Presently, the most accurate values of the free neutron beta-decay lifetime result from measurements using fluid-coated ultracold neutron (UCN) storage bottles. The purpose of this work is to investigate the temperature-dependent UCN loss rate from these storage systems. To verify that the surface properties of fomblin films are the same as the bulk properties, we present experimental measurements of the properties of a liquid ``fomblin'' surface obtained by the quasielastic scattering of laser light. The properties include the surface tension and viscosity as functions of temperature. The results are compared to measurements of the bulk fluid properties. We then calculate the upscattering rate of UCNs from thermally excited surface capillary waves on the liquid surface and compare the results to experimental measurements of the UCN lifetime in fomblin-fluid-walled UCN storage bottles, and show that the excess storage loss rate for UCN energies near the fomblin potential can be explained. The rapid temperature dependence of the fomblin storage lifetime is explained by our analysis.

  17. Matrix-Specific Method Validation of an Automated Most-Probable-Number System for Use in Measuring Bacteriological Quality of Grade "A" Milk Products.

    Science.gov (United States)

    Lindemann, Samantha; Kmet, Matthew; Reddy, Ravinder; Uhlig, Steffen

    2016-11-01

    The U.S. Food and Drug Administration (FDA) oversees a long-standing cooperative federal and state milk sanitation program that uses the grade "A" Pasteurized Milk Ordinance standards to maintain the safety of grade "A" milk sold in the United States. The Pasteurized Milk Ordinance requires that grade "A" milk samples be tested using validated total aerobic bacterial and coliform count methods. The objective of this project was to conduct an interlaboratory method validation study to compare performance of a film plate method with an automated most-probable-number method for total aerobic bacterial and coliform counts, using statistical approaches from international data standards. The matrix-specific validation study was administered concurrently with the FDA's annual milk proficiency test to compare method performance in five milk types. Eighteen analysts from nine laboratories analyzed test portions from 12 samples in triplicate. Statistics, including mean bias and matrix standard deviation, were calculated. Sample-specific bias of the alternative method for total aerobic count suggests that there are no large deviations within the population of samples considered. Based on analysis of 648 data points, mean bias of the alternative method across milk samples for total aerobic count was 0.013 log CFU/ml and the confidence interval for mean deviation was -0.066 to 0.009 log CFU/ml. These results indicate that the mean difference between the selected methods is small and not statistically significant. Matrix standard deviation was 0.077 log CFU/ml, showing that there is a low risk for large sample-specific bias based on milk matrix. Mean bias of the alternative method was -0.160 log CFU/ml for coliform count data. The 95% confidence interval was -0.210 to -0.100 log CFU/ml, indicating that mean deviation is significantly different from zero. The standard deviation of the sample-specific bias for coliform data was 0.033 log CFU/ml, indicating no significant effect of

  18. Probability Theory as Logic: Data Assimilation for Multiple Source Reconstruction

    Science.gov (United States)

    Yee, Eugene

    2012-03-01

    Probability theory as logic (or Bayesian probability theory) is a rational inferential methodology that provides a natural and logically consistent framework for source reconstruction. This methodology fully utilizes the information provided by a limited number of noisy concentration data obtained from a network of sensors and combines it in a consistent manner with the available prior knowledge (mathematical representation of relevant physical laws), hence providing a rigorous basis for the assimilation of this data into models of atmospheric dispersion for the purpose of contaminant source reconstruction. This paper addresses the application of this framework to the reconstruction of contaminant source distributions consisting of an unknown number of localized sources, using concentration measurements obtained from a sensor array. To this purpose, Bayesian probability theory is used to formulate the full joint posterior probability density function for the parameters of the unknown source distribution. A simulated annealing algorithm, applied in conjunction with a reversible-jump Markov chain Monte Carlo technique, is used to draw random samples of source distribution models from the posterior probability density function. The methodology is validated against a real (full-scale) atmospheric dispersion experiment involving a multiple point source release.
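
    A full reversible-jump sampler is beyond a short example, but the fixed-dimension core of such a posterior exploration can be sketched as below: a random-walk Metropolis sampler for a single source with parameters (x, y, strength), assuming a Gaussian measurement-noise likelihood and a user-supplied forward dispersion model. All names and the forward-model interface are assumptions for illustration, not the paper's implementation.

      import numpy as np

      def metropolis_single_source(sensors, y, forward, sigma, n_iter=20000, step=0.1, seed=0):
          # Random-walk Metropolis over theta = (x, y, strength) for one source.
          # 'forward(theta, sensors)' must return predicted concentrations; the
          # paper instead uses reversible-jump MCMC for an unknown source count.
          rng = np.random.default_rng(seed)

          def log_post(theta):
              if theta[2] <= 0.0:          # source strength must be positive
                  return -np.inf
              resid = y - forward(theta, sensors)
              return -0.5 * np.sum(resid ** 2) / sigma ** 2

          theta = np.array([0.0, 0.0, 1.0])
          lp = log_post(theta)
          samples = []
          for _ in range(n_iter):
              prop = theta + step * rng.normal(size=3)
              lp_prop = log_post(prop)
              if np.log(rng.random()) < lp_prop - lp:
                  theta, lp = prop, lp_prop
              samples.append(theta.copy())
          return np.array(samples)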

  19. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  20. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  1. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  2. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  3. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  4. The Theory of Probability

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 3; Issue 4. The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics Volume 3 Issue 4 April 1998 pp 103-112. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112. Author Affiliations.

  5. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. Probability Theory Without Tears! S Ramasubramanian. Book Review Volume 1 Issue 2 February 1996 pp 115-116. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116 ...

  6. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  7. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ..... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  8. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical; for example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  9. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  10. Essays on probability elicitation scoring rules

    Science.gov (United States)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and to also consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.
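
    As a hedged illustration of the contrast being drawn, the sketch below scores a discrete elicited distribution f both against the indicator of an observed value (a classical quadratic, Brier-type rule) and against an assumed distribution g that models the randomness of Θ (an adherence-type rule). The quadratic form is only one simple choice; the paper's actual formulas and entropy-like variants may differ, and the numbers are invented.

      import numpy as np

      def brier_score(f, observed_index):
          # Classical quadratic score against the zero-one indicator d of the
          # observed outcome: small values mean the expert's pmf f was sharp
          # and well placed.
          d = np.zeros_like(f)
          d[observed_index] = 1.0
          return np.sum((f - d) ** 2)

      def adherence_score(f, g):
          # Goodness-of-fit style score for an aleatory unknown: discrepancy
          # between the expert's pmf f and the pmf g modelling the randomness.
          return np.sum((f - g) ** 2)

      f = np.array([0.2, 0.5, 0.3])      # elicited distribution (illustrative)
      g = np.array([0.25, 0.45, 0.30])   # assumed random mechanism (illustrative)
      print(brier_score(f, observed_index=1), adherence_score(f, g))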

  11. Probability learning and Piagetian probability conceptions in children 5 to 12 years old.

    Science.gov (United States)

    Kreitler, S; Zigler, E; Kreitler, H

    1989-11-01

    This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.

  12. Pseudo-random Spray Release to Measure World-wide Transfer Functions of Cloud Albedo Control.

    Science.gov (United States)

    Salter, Stephen

    2010-05-01

    Institute for Energy Systems, School of Engineering, University of Edinburgh. S.Salter@ed.ac.uk Previous climate models of Latham's proposal to reverse global warming by using sub-micron sea spray to increase cloud albedo have used a variety of spray patterns. Kettles forced CCN concentration to be 375/cm3 everywhere. Rasch et al. used the 20% and 70% most susceptible regions. Bala and Caldeira used an even spread. Jones et al. concentrated spray in the 3.3% of oceans with the highest susceptibility. All used the same rate through the year. We want to choose a scheme for a climate-modelling experiment designed to identify simultaneously the effects of cloud albedo control at various seasons of the year from spray at all regions of the world on climates of all other regions of the world. In particular we want to know seasons and spray places which might have an undesirable effect on precipitation. The spray systems in various regions of a numerical climate model will be modulated on and off with different but known pseudo-random sequences and a selection of seasons. The mean value of the resulting weather records of the parameters of interest, mainly temperature and water run-off, at each region will be subtracted from each value of the record so as to give just the alternating component with an average value of zero. This will be correlated with each of the chosen pseudo-random sequences to give the magnitude and polarity of the effect of a treatment at each input area and selected seasons of the year with the resulting effects on all regions. By doing a time-shifted correlation we can account for phase-shift and time delay. The signal-to-noise ratio should improve with the square root of the analysis time and so we may be able to measure the transfer function with quite a small stimulus. The results of a Mathcad simulation of the process with statistical distributions approximating natural variations of temperature and precipitation show that a single run of a climate
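
    The correlation step described above can be mimicked in a few lines: each region's spray schedule is an independent pseudo-random on/off sequence, the mean is removed from the simulated weather record, and the record is correlated with each known sequence to recover that region's gain. The numbers below (four regions, the gains, the noise level) are arbitrary illustrative choices, not values from the abstract.

      import numpy as np

      rng = np.random.default_rng(0)
      n_regions, T = 4, 5000
      schedules = rng.choice([0.0, 1.0], size=(n_regions, T))   # pseudo-random on/off spraying
      true_gain = np.array([0.8, -0.3, 0.0, 0.5])               # assumed regional responses
      record = true_gain @ schedules + rng.normal(0.0, 2.0, T)  # weather record plus natural noise

      resid = record - record.mean()                            # keep only the alternating component
      for i in range(n_regions):
          s = schedules[i] - schedules[i].mean()
          gain_hat = np.dot(resid, s) / np.dot(s, s)            # correlate with the known sequence
          print(f"region {i}: estimated gain {gain_hat:+.2f} (true {true_gain[i]:+.2f})")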

  13. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  14. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures......Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...... there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations...

  15. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  16. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  17. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ^2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  18. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale invariant probability distribution.
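
    One of the two constraints, the cycling probability, is simple enough to compute directly from recorded trajectories; a small sketch is given below, where an order-2 cycle means a word that returns after exactly two transitions. The toy trajectories are invented for illustration.

      def order2_cycle_fraction(trajectories):
          # Fraction of transitions w -> x -> w, i.e. returns after two steps,
          # over all positions where a two-step look-ahead exists.
          cycles, total = 0, 0
          for traj in trajectories:
              for i in range(len(traj) - 2):
                  total += 1
                  if traj[i] == traj[i + 2]:
                      cycles += 1
          return cycles / total if total else 0.0

      trajectories = [["dog", "cat", "dog", "bone", "food"],
                      ["sun", "moon", "star", "night", "star"]]
      print(order2_cycle_fraction(trajectories))   # 2 cycles out of 6 opportunities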

  19. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  20. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    eligible voters who support a particular political party. A random sample of size n is selected from this population and suppose k voters support this party. What is a good estimate of the required proportion? How do we obtain a probability model for the experiment just conducted? Let us examine the following simple example.

  1. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.

  2. Absolute activity measurement and gamma-ray emission probability for decay of I-126; Medida absoluta da atividade e determinacao da taxa de emissao gama por decaimento do {sup 126} I

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Katia Aparecida

    1997-07-01

    The accurate knowledge of the gamma-ray emission probability per decay of radionuclides is important in several applications. In the case of {sup 126} I, its importance lies mainly in fast neutron dosimetry as well as in the production of {sup 125} I, where {sup 126} I appears as an impurity. In the present work the gamma-ray emission probabilities per decay for the 388 and 666 keV transitions of {sup 126} I have been measured. This radionuclide was obtained by means of the {sup 127} I(n, 2n){sup 126} I reaction in a fast neutron flux at the IPEN 2 MW research reactor. The methodology for the primary standardization of {sup 126} I is described. For this purpose, two different coincidence systems were used due to the complex decay scheme of this radionuclide. The {beta} branch measurement was carried out in a 4 {pi}(PC){beta}-{gamma} coincidence system consisting of a proportional counter coupled to a pair of 3'x3' NaI(Tl) crystals. The electron capture branch was measured in an X-{gamma} coincidence system using two NaI(Tl) crystals. The gamma-ray measurements were performed in a HPGe system, previously calibrated by means of standard sources supplied by the International Atomic Energy Agency. All the uncertainties involved were treated rigorously, by means of covariance analysis. (author)
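
    For orientation, the idealized 4{pi} beta-gamma coincidence relation behind such a primary standardization is N0 = N_beta x N_gamma / N_c, where N_beta, N_gamma and N_c are the beta, gamma and coincidence counting rates. The sketch below uses that bare formula with invented rates; a real analysis, as in this work, also involves efficiency extrapolation, background and dead-time corrections, and covariance-based uncertainty treatment.

      def coincidence_activity(n_beta, n_gamma, n_coinc):
          # Idealized 4pi(beta)-gamma coincidence formula: source activity
          # N0 = N_beta * N_gamma / N_c (all rates assumed background-corrected).
          return n_beta * n_gamma / n_coinc

      print(coincidence_activity(n_beta=5200.0, n_gamma=3100.0, n_coinc=950.0))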

  3. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  4. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  5. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    gambling problems in 18th century Europe. ... (random) phenomena, especially those evolving over time. The study of motion of physical objects over time by Newton led to his famous three laws of motion as well as many important developments in the theory of ordinary differential equations. Similarly, the construction ...

  6. A formula for crossing probabilities of critical systems inside polygons

    Science.gov (United States)

    Flores, S. M.; Simmons, J. J. H.; Kleban, P.; Ziff, R. M.

    2017-02-01

    In this article, we use our results from Flores and Kleban (2015 Commun. Math. Phys. 333 389-434, 2015 Commun. Math. Phys. 333 435-81, 2015 Commun. Math. Phys. 333 597-667, 2015 Commun. Math. Phys. 333 669-715) to generalize known formulas for crossing probabilities. Prior crossing results date back to Cardy's prediction of a formula for the probability that a percolation cluster in two dimensions connects the left and right sides of a rectangle at the percolation critical point in the continuum limit (Cardy 1992 J. Phys. A: Math. Gen. 25 L201-6). Here, we predict a new formula for crossing probabilities of a continuum limit loop-gas model on a planar lattice inside a 2N-sided polygon. In this model, boundary loops exit and then re-enter the polygon through its vertices, with exactly one loop passing once through each vertex, and these loops join the vertices pairwise in some specified connectivity through the polygon's exterior. The boundary loops also connect the vertices through the interior, which we regard as a crossing event. For particular values of the loop fugacity, this formula specializes to FK cluster (resp. spin cluster) crossing probabilities of a critical Q-state random cluster (resp. Potts) model on a lattice inside the polygon in the continuum limit. This includes critical percolation as the Q = 1 random cluster model. These latter crossing probabilities are conditioned on a particular side-alternating free/fixed (resp. fluctuating/fixed) boundary condition on the polygon's perimeter, related to how the boundary loops join the polygon's vertices pairwise through the polygon's exterior in the associated loop-gas model. For Q ∈ {2, 3, 4}, we compare our predictions of these random cluster (resp. Potts) model crossing probabilities in a rectangle (N = 2) and in a hexagon (N = 3) with high-precision computer simulation measurements. We find that the measurements agree with our predictions very
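
    The exact formulas are beyond a short example, but the simulation side of such a comparison is easy to sketch: the snippet below estimates the left-right crossing probability of critical site percolation (the Q = 1 case) on a rectangular patch of the square lattice by Monte Carlo, using the standard site-percolation threshold p ≈ 0.5927. Lattice size, trial count and the nearest-neighbour connectivity are illustrative choices, not the paper's setup.

      import numpy as np
      from scipy import ndimage

      def crossing_probability(height, width, p=0.5927, trials=2000, seed=1):
          # Estimate the probability that occupied sites connect the left and
          # right edges of a height x width rectangle (4-neighbour clusters).
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(trials):
              occupied = rng.random((height, width)) < p
              labels, _ = ndimage.label(occupied)
              left = set(labels[:, 0]) - {0}
              right = set(labels[:, -1]) - {0}
              if left & right:
                  hits += 1
          return hits / trials

      print(crossing_probability(64, 64))   # close to 1/2 for a square at criticality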

  7. Promoting the use of outcome measures by an educational programme for physiotherapists in stroke rehabilitation : A pilot randomized controlled trial

    NARCIS (Netherlands)

    Peppen, R.P.S. van; Schuurmans, M.J.; Stutterheim, E.C.; Lindeman, E.; Meeteren, N.L.U. van

    2009-01-01

    Objective: To determine the influence of tutor expertise on the uptake of a physiotherapists' educational programme intended to promote the use of outcome measures in the management of patients with stroke. Design: Pilot randomized controlled trial. Methods: Thirty physiotherapists involved in

  8. Circulating beta-hydroxybutyrate concentration may be a predictive measurement for young cows that have a greater probability to conceive at a fixed-time artificial insemination.

    Science.gov (United States)

    Hobbs, J D; Edwards, S R; Cope, E R; McFarlane, Z D; Pohler, K G; Mulliniks, J T

    2017-04-01

    .01). Results from this study indicate that only the young, postpartum beef cows during early lactation were susceptible to the measured metabolic dysfunctions of elevated blood BHB concentrations, which may have caused a delay in the timing of pregnancy.

  9. Correlation of Eigenvector Centrality to Other Centrality Measures : Random, Small-World and Real-World Networks

    OpenAIRE

    Xiaojia He; Natarajan Meghanathan

    2016-01-01

    In this paper, we thoroughly investigate correlations of eigenvector centrality to five centrality measures, including degree centrality, betweenness centrality, clustering coefficient centrality, closeness centrality, and farness centrality, of various types of network (random network, small world network, and real-world network). For each network, we compute those six centrality measures, from which the correlation coefficient is determined. Our analysis suggests that the degree centrali...
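
    A minimal version of such an analysis can be put together with networkx: compute the centrality measures on a random graph and report the Pearson correlation of each against eigenvector centrality. The graph model, size and edge probability below are arbitrary choices; farness is omitted since it is essentially the reciprocal of closeness.

      import numpy as np
      import networkx as nx

      G = nx.erdos_renyi_graph(200, 0.05, seed=42)          # one example random network
      nodes = list(G.nodes())
      eig = nx.eigenvector_centrality_numpy(G)
      others = {
          "degree": nx.degree_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "closeness": nx.closeness_centrality(G),
          "clustering": nx.clustering(G),
      }
      x = np.array([eig[v] for v in nodes])
      for name, vals in others.items():
          y = np.array([vals[v] for v in nodes])
          print(f"eigenvector vs {name}: r = {np.corrcoef(x, y)[0, 1]:+.2f}")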

  10. Measurement of the t anti-t Production Cross Section in p anti-p collisions at s**(1/2) = 1.96-TeV using Lepton + Jets Events with Jet Probability b-tagging

    Energy Technology Data Exchange (ETDEWEB)

    Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, T.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst.

    2006-07-01

    The authors present a measurement of the t anti-t production cross section using events with one charged lepton and jets from p anti-p collisions at a center-of-mass energy of 1.96 TeV. A b-tagging algorithm based on the probability of displaced tracks coming from the event interaction vertex is applied to identify b quarks from top decay. Using 318 pb^-1 of data collected with the CDF II detector, they measure the t anti-t production cross section in events with at least one restrictive (tight) b-tagged jet and obtain 8.9 +1.0/-1.0 (stat.) +1.1/-1.0 (syst.) pb. The cross section value assumes a top quark mass of m_t, as presented in the paper. This result is consistent with other CDF measurements of the t anti-t cross section using different samples and analysis techniques, and has similar systematic uncertainties. They have also performed consistency checks by using the b-tagging probability function to vary the signal to background ratio and also using events that have at least two b-tagged jets.

  11. Statistics, Probability and Chaos

    OpenAIRE

    Berliner, L. Mark

    1992-01-01

    The study of chaotic behavior has received substantial attention in many disciplines. Although often based on deterministic models, chaos is associated with complex, "random" behavior and forms of unpredictability. Mathematical models and definitions associated with chaos are reviewed. The relationship between the mathematics of chaos and probabilistic notions, including ergodic theory and uncertainty modeling, are emphasized. Popular data analytic methods appearing in the literature are disc...

  12. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...... to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...

  13. A permutation test to analyse systematic bias and random measurement errors of medical devices via boosting location and scale models.

    Science.gov (United States)

    Mayr, Andreas; Schmid, Matthias; Pfahlberg, Annette; Uter, Wolfgang; Gefeller, Olaf

    2017-06-01

    Measurement errors of medico-technical devices can be separated into systematic bias and random error. We propose a new method to address both simultaneously via generalized additive models for location, scale and shape (GAMLSS) in combination with permutation tests. More precisely, we extend a recently proposed boosting algorithm for GAMLSS to provide a test procedure to analyse potential device effects on the measurements. We carried out a large-scale simulation study to provide empirical evidence that our method is able to identify possible sources of systematic bias as well as random error under different conditions. Finally, we apply our approach to compare measurements of skin pigmentation from two different devices in an epidemiological study.
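
    The boosted GAMLSS machinery itself is involved, but the permutation idea can be illustrated with a plain two-device comparison: permute device labels and locate the observed location difference (systematic bias) and scale difference (random error) within the permutation distribution. This simplified stand-in ignores covariates and the additive model structure used in the paper; all data below are simulated.

      import numpy as np

      def permutation_pvalue(x, y, stat, n_perm=5000, seed=0):
          # Two-sample permutation p-value for an arbitrary statistic stat(x, y).
          rng = np.random.default_rng(seed)
          pooled = np.concatenate([x, y])
          observed = abs(stat(x, y))
          exceed = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              if abs(stat(pooled[:len(x)], pooled[len(x):])) >= observed:
                  exceed += 1
          return (exceed + 1) / (n_perm + 1)

      location = lambda a, b: a.mean() - b.mean()            # systematic bias
      scale = lambda a, b: np.log(a.var() / b.var())         # random measurement error

      rng = np.random.default_rng(1)
      device_a = rng.normal(50.0, 5.0, 200)                  # simulated readings, device A
      device_b = rng.normal(51.0, 7.0, 200)                  # simulated readings, device B
      print(permutation_pvalue(device_a, device_b, location),
            permutation_pvalue(device_a, device_b, scale))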

  14. Innovative measurement of parallelism for parallel transparent plate based on optical scanning holography by using a random-phase pupil.

    Science.gov (United States)

    Luo-Zhi, Zhang; Jian-Ping, Hu; Dao-Ming, Wan; Xing, Zeng; Chun-Miao, Li; Xin, Zhou

    2015-03-20

    A potential method is proposed to measure the parallelism of a parallel transparent plate, with an improved lower limit and a convenient procedure, by optical scanning holography (OSH) using a random-phase pupil, which is largely distinct from traditional methods. As a new possible application of OSH, this promising method is demonstrated theoretically and numerical simulations are carried out on a 2 cm × 2 cm parallel plate. Discussions are also made on the quality of the reconstructed image as well as the local mean square error (MSE), which are closely related to the parallelism of the sample. These quantities may serve as the judgments of parallelism, whereas in most interference methods the judgment is the spacing between two interference fringes. In addition, the randomness of the random-phase pupil also affects the quality of the reconstructed image and the local MSE. According to the simulation results, high parallelism usually brings about distinguishable reconstructed information and suppressed local MSE.

  15. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  16. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to basis VII of the international seismic scale of intensity, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously exposed is the probable frequency expected for the 50-year period. On each active small square we have made the corresponding computation, and so we have drawn Map No. 1, in percentage. The epicenters with intensity from X to XI are plotted in Map No. 2, in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them with others computed from the first up to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form to express the seismic contingent phenomenon, and it improves the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given side.

  17. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  18. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoulli law.
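
    For reference, the classical Wald SPRT that the paper modifies can be sketched for a Bernoulli stream as below, with thresholds log((1-beta)/alpha) and log(beta/(1-alpha)) set from the nominal error probabilities; the paper's procedure instead constrains the sum of the error probabilities. Parameter values are illustrative.

      import numpy as np

      def sprt_bernoulli(data, p0, p1, alpha=0.05, beta=0.05):
          # Classical Wald sequential probability ratio test of H0: p = p0
          # against H1: p = p1 on a stream of 0/1 observations.
          upper = np.log((1 - beta) / alpha)
          lower = np.log(beta / (1 - alpha))
          llr = 0.0
          for n, x in enumerate(data, start=1):
              llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
          return "no decision", len(data)

      rng = np.random.default_rng(3)
      print(sprt_bernoulli(rng.random(500) < 0.65, p0=0.5, p1=0.7))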

  19. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of ""The Unfair Subway""? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall

  20. Return probability: Exponential versus Gaussian decay

    Energy Technology Data Exchange (ETDEWEB)

    Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)

    2006-02-13

    We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes, one of which is characterized by the Gaussian decay of the return probability, and another one is the well-known regime of the exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.
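
    A generic numerical illustration of the quantity being studied is straightforward: diagonalize a random-matrix Hamiltonian, expand an initial basis state in its eigenbasis, and evaluate the return (survival) probability P(t) = |sum_k |c_k|^2 exp(-i E_k t)|^2. The GOE-like matrix below is only a stand-in for the paper's two models with random interaction; size and times are arbitrary.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 400
      H = rng.normal(size=(N, N))
      H = (H + H.T) / np.sqrt(2.0 * N)            # GOE-like random Hamiltonian
      E, V = np.linalg.eigh(H)

      psi0 = np.zeros(N)
      psi0[0] = 1.0                               # initial basis state
      weights = np.abs(V.T @ psi0) ** 2           # |c_k|^2, overlaps with eigenstates
      for t in (0.5, 1.0, 2.0, 5.0, 10.0):
          amplitude = np.sum(weights * np.exp(-1j * E * t))
          print(f"t = {t:5.1f}   P(t) = {np.abs(amplitude) ** 2:.4f}")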

  1. A Novel Probability Model for Suppressing Multipath Ghosts in GPR and TWI Imaging: A Numerical Study

    Directory of Open Access Journals (Sweden)

    Tan Yun-hua

    2015-10-01

    Full Text Available A novel concept for suppressing the problem of multipath ghosts in Ground Penetrating Radar (GPR) and Through-Wall Imaging (TWI) is presented. Ghosts (i.e., false targets) mainly arise from the use of the Born or single-scattering approximations that lead to linearized imaging algorithms; however, these approximations neglect the effect of multiple scattering (or multipath) between the electromagnetic wavefield and the object under investigation. In contrast to existing methods of suppressing multipath ghosts, the proposed method models for the first time the reflectivity of the probed objects as a probability function up to a normalization factor and introduces the concept of the random subaperture by randomly picking up measurement locations from the entire aperture. Thus, the final radar image is a joint probability distribution that corresponds to radar images derived from multiple random subapertures. Finally, numerical experiments are used to demonstrate the performance of the proposed methodology in GPR and TWI imaging.

  2. Comparison of blood pressure measurements between an automated oscillometric device and a Hawksley random-zero sphygmomanometer in the northern Sweden MONICA study.

    Science.gov (United States)

    Eriksson, Marie; Carlberg, Bo; Jansson, Jan-Håkan

    2012-08-01

    The Hawksley random-zero sphygmomanometer (random-zero) has been used widely in epidemiological observation studies. This study compares blood pressure measurements using the random-zero with measurements using an automated oscillometric device and suggests a correction of the automated oscillometric measurements to enable comparisons of blood pressure levels over time. The northern Sweden MONICA population survey 2009 included 1729 participants, 853 men and 876 women, 25-74 years old. Blood pressure was measured using both random-zero and an automated oscillometric device in all participants. The Omron M7 digital blood pressure monitor was used for automated oscillometric measurements. A linear mixed model was used to derive a formula to adjust the automated oscillometric readings. Automated oscillometric measurements of systolic blood pressure were generally lower than random-zero measurements in women [oscillometric mean 122.1 mmHg (95% confidence interval: 121.0-123.2) versus random-zero mean 124.4 mmHg (123.5-125.5)], whereas automated oscillometric measurements of systolic blood pressure were generally higher than random-zero measurements in men [oscillometric 131.1 mmHg (130.0-132.2) versus random-zero 129.0 mmHg (127.9-130.1)]. For diastolic blood pressure, automated oscillometric measurements were higher in both women [oscillometric 79.9 mmHg (79.2-80.5) versus random-zero 76.7 mmHg (76.0-77.4)] and men [oscillometric 83.1 mmHg (82.4-83.8) vs. random-zero 81.2 mmHg (80.6-81.9)]. The difference also varied with age and order of measurement. Adjustment of the automated oscillometric measurements using mixed model regression coefficients produced estimates of blood pressure that were close to the random-zero measurements. Blood pressure measurements using an automated oscillometric device differ from those with random-zero, but the oscillometric measurements can be adjusted, on the basis of sex, age and measurement order, to be similar to the random
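
    A sketch of the adjustment step, under assumed column names and a simplified model form (the paper's exact mixed-model specification is not reproduced here), could look as follows with statsmodels: regress the random-zero reading on the oscillometric reading plus sex, age and measurement order, with participant as the grouping factor, then map oscillometric readings onto the random-zero scale. The file name and variables are hypothetical.

      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical data file and column names; one row per measurement,
      # repeated measurements grouped by participant_id.
      df = pd.read_csv("monica_bp.csv")
      model = smf.mixedlm("random_zero_sbp ~ oscillometric_sbp + sex + age + order",
                          df, groups=df["participant_id"])
      fit = model.fit()
      df["adjusted_sbp"] = fit.predict(df)   # oscillometric readings on the random-zero scale
      print(fit.summary())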

  3. Multiple decomposability of probabilities on contractible locally ...

    Indian Academy of Sciences (India)

    Definition 3.1). As mentioned before, μ is n-times τ-decomposable iff μ has a representation as an (n + 1)-times iterated convolution product. To be allowed to ..... Then the classical version of the equivalence theorem holds: If ν_i, i ≥ 0, and ν are probabilities and X_i, i ≥ 0, and Y are independent G-valued random variables with ...

  4. Tailored Random Graph Ensembles

    Science.gov (United States)

    Roberts, E. S.; Annibale, A.; Coolen, A. C. C.

    2013-02-01

    Tailored graph ensembles are a developing bridge between biological networks and statistical mechanics. The aim is to use this concept to generate a suite of rigorous tools that can be used to quantify and compare the topology of cellular signalling networks, such as protein-protein interaction networks and gene regulation networks. We calculate exact and explicit formulae for the leading orders in the system size of the Shannon entropies of random graph ensembles constrained with degree distribution and degree-degree correlation. We also construct an ergodic detailed balance Markov chain with non-trivial acceptance probabilities which converges to a strictly uniform measure and is based on edge swaps that conserve all degrees. The acceptance probabilities can be generalized to define Markov chains that target any alternative desired measure on the space of directed or undirected graphs, in order to generate graphs with more sophisticated topological features.
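
    The degree-conserving move set can be sketched quickly; the snippet below performs plain edge swaps on an undirected simple graph, rejecting swaps that would create self-loops or multi-edges. It deliberately omits the non-trivial acceptance probabilities derived in the paper, which are what make the chain converge to a strictly uniform (or otherwise targeted) measure.

      import random

      def edge_swap_chain(edges, n_steps, seed=0):
          # Repeatedly pick two edges (a, b) and (c, d) and rewire them to
          # (a, d) and (c, b); this conserves every vertex degree. Swaps that
          # would create self-loops or multi-edges are skipped.
          rng = random.Random(seed)
          edge_set = {frozenset(e) for e in edges}
          edge_list = list(edge_set)
          for _ in range(n_steps):
              e1, e2 = rng.sample(edge_list, 2)
              a, b = tuple(e1)
              c, d = tuple(e2)
              if len({a, b, c, d}) < 4:
                  continue                                  # would create a self-loop
              new1, new2 = frozenset((a, d)), frozenset((c, b))
              if new1 in edge_set or new2 in edge_set:
                  continue                                  # would create a multi-edge
              edge_set -= {e1, e2}
              edge_set |= {new1, new2}
              edge_list = list(edge_set)
          return [tuple(e) for e in edge_list]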

  5. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  6. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
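
    The inverted-S shape referred to above is commonly parameterized by a one-parameter weighting function such as the Tversky-Kahneman (1992) form sketched below; the specific functional form and parameters fitted in the paper may differ, and gamma = 0.6 is just an illustrative value.

      import numpy as np

      def weight_inverted_s(p, gamma=0.6):
          # One-parameter inverted-S probability weighting function:
          # overweights small probabilities, underweights large ones.
          return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

      for p in (0.05, 0.25, 0.50, 0.75, 0.95):
          print(f"p = {p:.2f}  ->  w(p) = {weight_inverted_s(p):.2f}")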

  7. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  8. Test of the X(5) symmetry in {sup 156}Dy and {sup 178}Os by measurement of electromagnetic transition probabilities; Test der X(5)-Symmetrie in {sup 156}Dy und {sup 178}Os durch Messung elektromagnetischer Uebergangswahrscheinlichkeiten

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, O.

    2005-07-01

    This work reports on results from two Recoil-Distance-Doppler-Shift lifetime measurements of excited states in {sup 156}Dy and {sup 178}Os. The experiments were carried out at the GASP spectrometer of the Laboratori Nazionali di Legnaro in combination with the Cologne plunger apparatus. The main purpose of the experiments was to test the predictions of the X(5) critical point symmetry in these two nuclei. In {sup 156}Dy and {sup 178}Os, 29 lifetimes of excited states were derived using the Differential-Decay-Curve method. In weaker reaction channels the nuclei {sup 155}Dy, {sup 157}Dy and {sup 177}Os were populated. In these nuclei 32 additional lifetimes were measured, most of them for the first time. In order to calculate absolute transition probabilities from the measured lifetimes of the first excited band in {sup 156}Dy, the essential branching ratios were derived from the measured data with a very small systematic error (<5%). The most important results can be summarized as follows: Lifetimes measured in the first excited band of {sup 156}Dy confirm that this nucleus lies close to the critical point X(5). With model calculations, specific criteria of the X(5) model were found that can be used to identify other X(5)-like nuclei. Using these criteria, a new region of X(5)-like nuclei could be suggested within the osmium isotopes in the A=180 mass region. The measured lifetimes in {sup 178}Os confirm the consistency of an X(5) description in these nuclei. A comparison with the well-established X(5)-like nuclei in the N=90 isotones gives an agreement with the X(5) description of at least the same quality. (orig.)

  9. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  10. A story about estimation of a random field of boulders from incomplete seismic measurements

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2005-01-01

    This paper reports on the statistical interpretation of seismic diffraction measurements of boulder locations. The measurements are made in a corridor along the planned tunnel line for the later realized bored tunnel through the till deposits under the East Channel of the Great Belt in Denmark...... graphical registrations on seismograms do not make a proper interpretation possible without detailed knowledge about the joint distribution of the primary dimensions of the boulders. Therefore separate measurements were made of the dimensions of boulders deposited visibly on the cliff beaches of the Great...... and measured. These direct observations on site confirmed that the prediction was quite good....

  11. Quantum Probability, Orthogonal Polynomials and Quantum Field Theory

    Science.gov (United States)

    Accardi, Luigi

    2017-03-01

    The main thesis of the present paper is that Quantum Probability is not a generalization of classical probability, but rather a deeper level of it: classical random variables have an intrinsic (microscopic) non-commutative structure that generalizes usual quantum theory. The study of this generalization is the core of the non-linear quantization program.

  12. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  13. Promoting the use of outcome measures by an educational programme for physiotherapists in stroke rehabilitation: a pilot randomized controlled trial.

    Science.gov (United States)

    Van Peppen, R P S; Schuurmans, M J; Stutterheim, E C; Lindeman, E; Van Meeteren, N L U

    2009-11-01

    To determine the influence of tutor expertise on the uptake of a physiotherapists' educational programme intended to promote the use of outcome measures in the management of patients with stroke. Pilot randomized controlled trial. Thirty physiotherapists involved in stroke management were randomized into two groups and participated in five tutor-guided educational sessions (the Physiotherapists' Educational Programme on Clinimetrics in Stroke, PEPCiS). Groups differed from each other with respect to tutors: one experienced and one inexperienced in stroke care. Primary outcome was 'actual use' (the frequencies of data of seven recommended outcome measures in the patient records of the participating physiotherapists). The actual use of instruments shifted from a median of 3 to 6 in the expert tutor group and from 3 to 4 in the non-expert tutor group (P = 0.07). Physiotherapists educated by the expert tutor used a broader variety of instruments and appreciated the educational programme, their own knowledge gain and all three scales of tutor style aspects significantly more than their colleagues of the non-expert tutor group (all P < 0.05). Several aspects of the tutors' performance were identified that were associated with a change score of the use of two or more outcome measures by individual physiotherapists after the educational programme. In this pilot trial it was not proven that tutor expertise in stroke care influences the actual use of outcome measures, but it warrants a future study with sufficient power to investigate the influence of the tutor.

  14. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  15. Measuring patient-centredness, the neglected outcome in fertility care: a random multicentre validation study.

    NARCIS (Netherlands)

    Empel, I.W.H. van; Aarts, J.W.; Cohlen, B.J.; Huppelschoten, D.; Laven, J.S.E.; Nelen, W.L.D.M.; Kremer, J.A.M.

    2010-01-01

    BACKGROUND: High-quality fertility care should be effective and safe, but also patient-centred. However, a suitable instrument for measuring patient-centredness is lacking. This study aims to develop and validate an instrument that can reliably measure patient-centredness in fertility care:

  16. Measuring patient-centredness, the neglected outcome in fertility care: A random multicentre validation study

    NARCIS (Netherlands)

    I.W.H. van Empel; J.W.M. Aarts (Johanna); B.J. Cohlen (Ben); D.A. Huppelschoten (Dana); J.S.E. Laven (Joop); W.L.D.M. Nelen (Willianne); J.A.M. Kremer

    2010-01-01

    BACKGROUND: High-quality fertility care should be effective and safe, but also patient-centred. However, a suitable instrument for measuring patient-centredness is lacking. This study aims to develop and validate an instrument that can reliably measure patient-centredness in fertility

  17. Laboratory measurements of scattering matrix elements of randomly oriented Mars analog palagonite particles

    NARCIS (Netherlands)

    Muñoz, O.; Volten, H.; Hovenier, J.W.; Laan, E.; Roush, T.; Stam, D.

    2008-01-01

    We present laboratory measurements for Martian analog particles, consisting of palagonite. We measured all elements of the scattering matrix as functions of the scattering angle from 3 to 174 degrees at a wavelength of 632.8 nm. The results may be used in studies of the Martian atmosphere.

  18. Regularity Properties of Potentials for Joint Measures of Random Spin Systems

    NARCIS (Netherlands)

    Külske, C.

    2004-01-01

    We consider general quenched disordered lattice spin models on compact local spin spaces with possibly dependent disorder. We discuss their corresponding joint measures on the product space of disorder variables and spin variables in the infinite volume. These measures often possess pathologies in a

  19. Calibrated delivery drape versus indirect gravimetric technique for the measurement of blood loss after delivery: a randomized trial.

    Science.gov (United States)

    Ambardekar, Shubha; Shochet, Tara; Bracken, Hillary; Coyaji, Kurus; Winikoff, Beverly

    2014-08-15

    Trials of interventions for PPH prevention and treatment rely on different measurement methods for the quantification of blood loss and identification of PPH. This study's objective was to compare measures of blood loss obtained from two different measurement protocols frequently used in studies. Nine hundred women presenting for vaginal delivery were randomized to a direct method (a calibrated delivery drape) or an indirect method (a shallow bedpan placed below the buttocks and weighing the collected blood and blood-soaked gauze/pads). Blood loss was measured from immediately after delivery for at least one hour or until active bleeding stopped. Significantly greater mean blood loss was recorded by the direct than by the indirect measurement technique (253.9 mL and 195.3 mL, respectively; difference = 58.6 mL; 95% CI: 31-86 mL; p < 0.001), and the direct method identified a larger proportion of women with blood loss greater than 500 mL (8.7% vs. 4.7%, p = 0.02). The study suggests a real and significant difference in blood loss measurement between these methods. Research using blood loss measurement as an endpoint needs to be interpreted taking measurement technique into consideration. This study has been registered at clinicaltrials.gov as NCT01885845.

  20. Full-field, high-spatial-resolution detection of local structural damage from low-resolution random strain field measurements

    Science.gov (United States)

    Yang, Yongchao; Sun, Peng; Nagarajaiah, Satish; Bachilo, Sergei M.; Weisman, R. Bruce

    2017-07-01

    Structural damage is typically a local phenomenon that initiates and propagates within a limited area. As such, high-spatial-resolution measurement and monitoring are often needed for accurate damage detection. This requires either significantly increased costs from denser sensor deployment in the case of global simultaneous/parallel measurements, or increased measurement time and labor in the case of global sequential measurements. This study explores the feasibility of an alternative approach to this problem: a computational solution in which a limited set of randomly positioned, low-resolution global strain measurements is used to reconstruct the full-field, high-spatial-resolution, two-dimensional (2D) strain field and rapidly detect local damage. The proposed approach exploits the implicit low-rank and sparse data structure of the 2D strain field: it is highly correlated without many edges and hence has a low-rank structure, unless damage, which manifests itself as sparse local irregularity, is present and alters that low-rank structure slightly. Therefore, reconstruction of the full-field, high-spatial-resolution strain field from a limited set of randomly positioned low-resolution global measurements is modeled as a low-rank matrix completion problem, and damage detection as a sparse decomposition formulation, enabled by emerging convex optimization techniques. Numerical simulations on a plate structure are conducted for validation. The results are discussed and a practical iterative global/local procedure is recommended. This new computational approach should enable the efficient detection of local damage using limited sets of strain measurements.
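    The low-rank-plus-sparse idea behind this approach can be illustrated with a small numerical sketch. The code below is not the authors' matrix-completion formulation (it ignores missing entries and uses a simple alternating thresholding loop rather than a convex solver), and the thresholds and the synthetic strain field are illustrative assumptions; it only shows how a smooth, low-rank background separates from a sparse local anomaly.

      import numpy as np

      def lowrank_sparse_split(D, lam=None, n_iter=200):
          """Split D into a low-rank part L and a sparse part S (D ~ L + S) by
          alternating singular-value thresholding and entrywise soft-thresholding.
          A simple stand-in for the convex low-rank/sparse programs used for
          strain-field reconstruction and damage localization."""
          m, n = D.shape
          lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam
          tau = 0.1 * np.linalg.norm(D, 2)            # singular-value threshold
          S = np.zeros_like(D)
          for _ in range(n_iter):
              U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
              L = (U * np.maximum(s - tau, 0.0)) @ Vt              # low-rank update
              R = D - L
              S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)    # sparse update
          return L, S

      # Smooth (low-rank) background strain field plus one localized anomaly.
      x = np.linspace(0.0, 1.0, 40)
      field = np.outer(np.sin(np.pi * x), np.cos(np.pi * x))
      field[20, 20] += 0.5                              # simulated local "damage"
      L, S = lowrank_sparse_split(field)
      print(np.unravel_index(np.argmax(np.abs(S)), S.shape))   # close to (20, 20)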

  1. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, meaning that a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that students who were motivated by the probability workshop improved their performance in the probability topic compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  2. Instrumental Measurements of Water and the Surrounding Space During a Randomized Blinded Controlled Trial of Focused Intention.

    Science.gov (United States)

    Matos, Luís Carlos; Santos, Sara Cristina; Anderson, Joel G; Machado, Jorge; Greten, Henry Johannes; Monteiro, Fernando Jorge

    2017-01-01

    The main goal of this work was the assessment of measurable interactions induced by focused intention, frequently used in biofield practices such as Healing Touch and Reiki. Water, as the main component of the human body, was chosen as a model. Intention experiments were performed over 4 different days at a scheduled interval, during which 286 trained biofield practitioners from several countries were instructed to meditate with the intention to change the molecular vibrational state of water samples selected by a blinded operator. The experimental protocol was randomized, blinded, and controlled; the measured variables included Raman spectra and the pH and electrical conductance of the water, as well as the magnetic field and UV-VIS (ultraviolet-visible) radiation near the experimental spot. Although a direct causal relationship cannot be established, some measurements of the water samples, as well as the magnetic field and radiation near the experimental spot, were responsive during the experimental period.

  3. A Correction of Random Incidence Absorption Coefficients for the Angular Distribution of Acoustic Energy under Measurement Conditions

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2009-01-01

    discrepancies between the measured value and the theoretical random incidence absorption coefficient. Therefore the angular distribution of the incident acoustic energy onto an absorber sample should be taken into account. The angular distribution of the incident energy density was simulated using the beam...... tracing method for various room shapes and source positions. The averaged angular distribution is found to be similar to a Gaussian distribution. As a result, an angle-weighted absorption coefficient was proposed by considering the angular energy distribution to improve the agreement between...... the theoretical absorption coefficient and the reverberation room measurement. The angle-weighted absorption coefficient, together with the size correction, agrees satisfactorily with the measured absorption data by the reverberation chamber method. At high frequencies and for large samples, the averaged...

  4. A random approach to the Lebesgue integral

    Science.gov (United States)

    Grahl, Jack

    2008-04-01

    We construct an integral of a measurable real function using randomly chosen Riemann sums and show that it converges in probability to the Lebesgue integral where this exists. We then prove some conditions for the almost sure convergence of this integral.
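    A minimal sketch of the idea, under the assumption that the random evaluation point in each subinterval is drawn uniformly: for an integrable function the randomly evaluated Riemann sums concentrate around the Lebesgue integral as the partition is refined. The example below uses a continuous integrand only to keep the demonstration short; the interest of the construction in the abstract is precisely that it extends to functions that are merely measurable.

      import random

      def random_riemann_sum(f, a, b, n, rng=None):
          """One Riemann sum of f over [a, b] with n equal subintervals, each
          evaluated at a uniformly random point inside the subinterval."""
          rng = rng or random.Random(0)
          h = (b - a) / n
          return h * sum(f(a + (i + rng.random()) * h) for i in range(n))

      # The random sums settle near the Lebesgue integral of x^2 over [0, 1], i.e. 1/3.
      print(random_riemann_sum(lambda x: x * x, 0.0, 1.0, 100))
      print(random_riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000))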

  5. Probability of Boulders

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    To collect background information for formulating a description of the expected soil properties along the tunnel line, in 1987 Storebælt initiated a statistical investigation of the occurrence and size of boulders in the Great Belt area. The data for the boulder size distribution were obtained...... by use of aerial photographing of cliff beaches with subsequent stereo metric measurement on the photographs. To get information about the density of occurrence, a series of continuous seismic scannings were also made along parallel lines in a corridor with the tunnel line as about the central line....... The data collection part of the investigation was made on the basis of geological expert advice (Gunnar Larsen, Århus) by the Danish Geotechnical Institute (DGI).The statistical data analysis combined with stochastic modeling based on geometry and sound wave diffraction theory gave a point estimate...

  6. Uncertainty analysis of standardized measurements of random-incidence absorption and scattering coefficients.

    Science.gov (United States)

    Müller-Trapet, Markus; Vorländer, Michael

    2015-01-01

    This work presents an analysis of the effect of some uncertainties encountered when measuring absorption or scattering coefficients in the reverberation chamber according to International Organization for Standardization/American Society for Testing and Materials standards. This especially relates to the uncertainty due to spatial fluctuations of the sound field. By analyzing the mathematical definition of the respective coefficient, a relationship between the properties of the chamber and the test specimen and the uncertainty in the measured quantity is determined and analyzed. The validation of the established equations is presented through comparisons with measurement data. This study analytically explains the main sources of error and provides a method to obtain the product of the necessary minimum number of measurement positions and the band center frequency to achieve a given maximum uncertainty in the desired quantity. It is shown that this number depends on the ratio of room volume to sample surface area and the reverberation time of the empty chamber.

  7. Classicality versus quantumness in Born's probability

    Science.gov (United States)

    Luo, Shunlong

    2017-11-01

    Born's rule, which postulates the probability of a measurement outcome in a quantum state, is pivotal to interpretations and applications of quantum mechanics. By exploiting the departure of the product of two Hermitian operators in Born's rule from Hermiticity, we prescribe an intrinsic and natural scheme to decompose Born's probability into a classical part and a quantum part, which have significant implications in quantum information theory. The classical part constitutes the information compatible with the associated measurement operator, while the quantum part represents the quantum coherence of the state with respect to the measurement operator. Fundamental properties of the decomposition are revealed. As applications, we establish several trade-off relations for the classicality and quantumness in Born's probability, which may be interpreted as alternative realizations of Heisenberg's uncertainty principle. The results shed physical light on related issues concerning the quantification of complementarity, coherence, and uncertainty, as well as the classical-quantum interplay.

  8. TESTING BRAND VALUE MEASUREMENT METHODS IN A RANDOM COEFFICIENT MODELING FRAMEWORK

    OpenAIRE

    Szõcs Attila

    2014-01-01

    Our objective is to provide a framework for measuring brand equity, that is, the added value to the product endowed by the brand. Based on a demand and supply model, we propose a structural model that enables testing the structural effect of brand equity (demand side effect) on brand value (supply side effect), using Monte Carlo simulation. Our main research question is which of the three brand value measurement methods (price premium, revenue premium and profit premium) is more suitable from...

  9. A Randomized Trial to Measure the Efficacy of Applying Task Oriented Role Assignment to Improve Neonatal Resuscitation

    Science.gov (United States)

    2017-05-06

    Record text consists of OCR fragments of approval paperwork for the trial: a 59th Medical Wing memorandum and Brooke Army Medical Center Institutional Review Board correspondence (15 June 2016), naming Maj Carrie Litke-Wager as principal investigator with LTC Thornton Mu and MAJ Heather Delaney as co-investigators.

  10. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  11. Entanglement probabilities of polymers: a white noise functional approach

    CERN Document Server

    Bernido, C C

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and for a case where an entangled polymer is subjected to the potential $V = \dot{f}(s)\,\theta$. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.

  12. Assessment reactivity: A randomized controlled trial of alcohol-specific measures on alcohol-related behaviors.

    Science.gov (United States)

    Meier, Ellen; Miller, Mary Beth; Lombardi, Nate; Leffingwell, Thad

    2017-04-01

    Completion of alcohol assessments influences treatment outcomes, yet little is known about the aspects of assessment that may contribute to this response. The present study is a randomized controlled trial examining how the themes of alcohol assessments (e.g., assessment of alcohol-related consequences as opposed to drinking patterns) may affect drinking behaviors. Undergraduate students (N = 290, M_age = 19.97, SD_age = 1.81, 61.7% female), reporting at least one binge drinking episode during the past month, completed one of five baseline assessment batteries that varied thematically: (a) Control (e.g., minimal drinking quantity and frequency questions), (b) Consequences (e.g., College Alcohol Problems Scale; CAPS-r), (c) Norms (e.g., Drinking Norms Rating Form), (d) Diagnostic (e.g., Alcohol Use Disorders Identification Test), and (e) Combined (all themes). Participants completed a one-month follow-up of drinking quantity/frequency and the CAPS-r. All groups decreased their self-reported peak drinks consumed and alcohol-related consequences (p = 0.06, ηp(2) = 0.03) from baseline to follow-up. Minimal assessment of drinking quantity and frequency may result in assessment reactivity. Reductions in markers of risky drinking behaviors did not differ as a function of the type of assessments completed (e.g., Consequences vs Diagnostic). Continued research is needed to determine what other important variables (e.g., treatment seeking) may affect assessment reactivity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Estimation of the Surface Properties of Styrene-Acrylonitrile Random Copolymers from Contact Angle Measurements.

    Science.gov (United States)

    Adão; Saramago; Fernandes

    1999-09-01

    The surface free energy per unit area of a solid, gamma(S), is a fundamental property of materials and determines their surface and interfacial behavior in processes like wetting and adhesion. In this study the gamma(S) of a series of styrene-acrylonitrile random copolymers is evaluated. Three different approaches are used to determine the components into which the surface free energy can be decomposed. Using the geometric and the harmonic mean approaches, the dispersive, gamma(d), and polar, gamma(p), components of the solid surface free energy were determined and compared to the Lifshitz-van der Waals, gamma(LW), and acid-base, gamma(AB), components using the approach developed by C. J. van Oss et al. (1987, Adv. Colloid Interface Sci. 28, 35). The acid-base approach was also used to evaluate the work of adhesion of the test liquids: water, glycerol, and thiodiglycol. It was found that the contact angles of these liquids follow closely the predictions of the Cassie equation. The evaluation of the surface free energy components on the one hand and the relative magnitude of the work of adhesion components on the other suggest that below 50% of acrylonitrile the polystyrene repeating units are preferentially at the surface. Above 50% of acrylonitrile the segregation of the low-energy homopolymer at the surface decreases. Copyright 1999 Academic Press.

  14. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. The solutions to the proposed exercises are listed for the reader's reference.

  15. Health-related quality of life measurement in randomized clinical trials in surgical oncology

    NARCIS (Netherlands)

    Blazeby, Jane M.; Avery, Kerry; Sprangers, Mirjam; Pikhart, Hynek; Fayers, Peter; Donovan, Jenny

    2006-01-01

    PURPOSE: There is debate about the value of measuring health-related quality of life (HRQL) in clinical trials in oncology because of evidence suggesting that HRQL does not influence clinical decisions. Analysis of HRQL in surgical trials, however, may inform decision making because it provides

  16. The gated integration technique for the accurate measurement of the autocorrelation function of speckle intensities scattered from random phase screens

    Science.gov (United States)

    Zhang, Ningyu; Cheng, Chuanfu; Teng, Shuyun; Chen, Xiaoyi; Xu, Zhizhan

    2007-09-01

    A new approach based on the gated integration technique is proposed for the accurate measurement of the autocorrelation function of speckle intensities scattered from a random phase screen. The Boxcar integrator used in the acquisition of the speckle intensity data integrates the photoelectric signal while its sampling gate is open, and it repeats the sampling a preset number of times, m. The averaged analog output of the m samplings enhances the signal-to-noise ratio by √m, because the repeated sampling and averaging stabilize the useful speckle signals, while the randomly varying photoelectric noise is suppressed by a factor of 1/√m. In the experiment, we use an analog-to-digital converter module to synchronize all the actions such as the stepped movement of the phase screen, the repeated sampling, the readout of the averaged output of the Boxcar, etc. The experimental results show that speckle signals are better recovered from contaminated signals, and the autocorrelation function with the secondary maximum is obtained, indicating that the accuracy of the measurement of the autocorrelation function is greatly improved by the gated integration technique.
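    The quoted √m gain is just the usual suppression of independent noise by averaging, which a few lines of NumPy can illustrate. The signal level, noise standard deviation and gate counts below are illustrative assumptions, not the parameters of the reported experiment.

      import numpy as np

      rng = np.random.default_rng(0)
      signal = 1.0                 # "true" photoelectric signal at one gate position
      noise_sd = 0.5               # standard deviation of the additive noise

      for m in (1, 16, 256):
          # m repeated samplings averaged at a single gate position
          samples = signal + noise_sd * rng.normal(size=(100_000, m))
          averaged = samples.mean(axis=1)
          # residual noise shrinks roughly as noise_sd / sqrt(m)
          print(m, round(averaged.std(), 4), round(noise_sd / np.sqrt(m), 4))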

  17. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling

    OpenAIRE

    Jason Deglint; Farnoud Kazemzadeh; Daniel Cho; Clausi, David A.; Alexander Wong

    2016-01-01

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor mea...

  18. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    Full Text Available The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which through combinatorial calculation have opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability which went over the same grounds – sometimes without any historical awareness – as the debate which arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  19. Heuristic Relative Entropy Principles with Complex Measures: Large-Degree Asymptotics of a Family of Multi-variate Normal Random Polynomials

    Science.gov (United States)

    Kiessling, Michael Karl-Heinz

    2017-10-01

    Let z\\in C, let σ ^2>0 be a variance, and for N\\in N define the integrals E_N^{}(z;σ ) := {1/σ } \\int _R\\ (x^2+z^2) e^{-{1/2σ^2 x^2}}{√{2π }}/dx \\quad if N=1, {1/σ } \\int _{R^N} \\prod \\prod \\limits _{1≤ k1. These are expected values of the polynomials P_N^{}(z)=\\prod _{1≤ n≤ N}(X_n^2+z^2) whose 2 N zeros ± i X_k^{}_{k=1,\\ldots ,N} are generated by N identically distributed multi-variate mean-zero normal random variables {X_k}N_{k=1} with co-variance {Cov}_N^{}(X_k,X_l)=(1+σ ^2-1/N)δ _{k,l}+σ ^2-1/N(1-δ _{k,l}). The E_N^{}(z;σ ) are polynomials in z^2, explicitly computable for arbitrary N, yet a list of the first three E_N^{}(z;σ ) shows that the expressions become unwieldy already for moderate N—unless σ = 1, in which case E_N^{}(z;1) = (1+z^2)^N for all z\\in C and N\\in N. (Incidentally, commonly available computer algebra evaluates the integrals E_N^{}(z;σ ) only for N up to a dozen, due to memory constraints). Asymptotic evaluations are needed for the large- N regime. For general complex z these have traditionally been limited to analytic expansion techniques; several rigorous results are proved for complex z near 0. Yet if z\\in R one can also compute this "infinite-degree" limit with the help of the familiar relative entropy principle for probability measures; a rigorous proof of this fact is supplied. Computer algebra-generated evidence is presented in support of a conjecture that a generalization of the relative entropy principle to signed or complex measures governs the N→ ∞ asymptotics of the regime iz\\in R. Potential generalizations, in particular to point vortex ensembles and the prescribed Gauss curvature problem, and to random matrix ensembles, are emphasized.

  20. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads and the...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....

  1. Detection and measurement of fetal abdominal contour in ultrasound images via local phase information and iterative randomized Hough transform.

    Science.gov (United States)

    Wang, Weiming; Qin, Jing; Zhu, Lei; Ni, Dong; Chui, Yim-Pan; Heng, Pheng-Ann

    2014-01-01

    Due to the characteristic artifacts of ultrasound images, e.g., speckle noise, shadows and intensity inhomogeneity, traditional intensity-based methods usually have limited success on the segmentation of the fetal abdominal contour. This paper presents a novel approach to detect and measure the abdominal contour from fetal ultrasound images in two steps. First, a local phase-based measure called multiscale feature asymmetry (MSFA) is defined from the monogenic signal to detect the boundaries of the fetal abdomen. The MSFA measure is intensity invariant and provides an absolute measurement for the significance of features in the image. Second, in order to detect the ellipse that fits the abdominal contour, the iterative randomized Hough transform is employed to exclude the interference of the inner boundaries, after which the detected ellipse gradually converges to the outer boundaries of the abdomen. Experimental results on clinical ultrasound images demonstrate the high agreement between our approach and the manual approach on the measurement of abdominal circumference (mean sign difference is 0.42% and correlation coefficient is 0.9973), which indicates that the proposed approach can be used as a reliable and accurate tool for obstetrical care and diagnosis.

  2. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (A_L) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of A_L and an exponential rollover for small values of A_L; the rollover (maximum probability) occurs at about A_L = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in the existing literature. The number of landslide areas (N_L) selected for each triggered event iteration was chosen to give an average density of 1 landslide km-2, i.e. N_L = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road of 1 x 4000 cells (5 m x 20 km) joined in a 'T' formation by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (A_BL) and number of road blockages (N_BL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blockages per iteration, N_BL, ranges from 0 to 7. The average blockage area for the 500 iterations (Ā_BL) is about 3000 m
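    The drop-and-count procedure can be sketched in a few lines of Python. The sketch below simplifies the set-up to a single straight road crossing a square region and treats landslide footprints as circles; the inverse-gamma shape and scale are illustrative choices tuned only to reproduce roughly the quoted tail exponent of about -2.4 and the rollover near 400 m2, not the fitted three-parameter distribution or the cell-based road network of the study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Illustrative inverse-gamma parameters: mode = scale / (shape + 1) ~ 400 m^2,
      # power-law tail exponent ~ -(shape + 1) = -2.4.
      shape, scale = 1.4, 960.0

      def simulate_event(n_landslides=400, region_km=20.0, road_y_km=10.0):
          """Drop landslides with random centres and inverse-gamma areas onto a square
          region and count how many circular footprints cross one east-west road."""
          areas_m2 = scale / rng.gamma(shape, size=n_landslides)   # inverse-gamma draws
          radii_m = np.sqrt(areas_m2 / np.pi)
          centres_km = rng.uniform(0.0, region_km, size=(n_landslides, 2))
          dist_to_road_m = np.abs(centres_km[:, 1] - road_y_km) * 1000.0
          blocked = dist_to_road_m < radii_m
          return blocked.sum(), areas_m2[blocked].sum()

      counts = [simulate_event()[0] for _ in range(500)]           # 500 Monte Carlo iterations
      print(min(counts), max(counts), float(np.mean(counts)))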

  3. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The model selected results from the performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
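    The probability-of-dependence idea can be mimicked with a short scikit-learn sketch. The water-table-depth and aridity values below are synthetic and the labelling rule is invented purely for illustration; only the workflow (fit a random forest on the two predictors, read off class probabilities, score them with a threshold-independent measure such as AUC) mirrors the approach described in the abstract.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 2000
      water_table_depth = rng.uniform(0.0, 100.0, n)     # metres below surface (synthetic)
      aridity_index = rng.uniform(0.05, 1.0, n)          # higher = more humid (synthetic)

      # Invented rule: shallow water tables in arid settings favour groundwater dependence.
      p_gde = 1.0 / (1.0 + np.exp(0.15 * (water_table_depth - 20.0) + 3.0 * (aridity_index - 0.4)))
      is_gde = rng.random(n) < p_gde

      X = np.column_stack([water_table_depth, aridity_index])
      X_tr, X_te, y_tr, y_te = train_test_split(X, is_gde, test_size=0.3, random_state=0)

      rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      prob_gde = rf.predict_proba(X_te)[:, 1]            # probability of groundwater dependence
      print("AUC:", roc_auc_score(y_te, prob_gde))       # threshold-independent accuracy measure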

  4. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  5. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, which allow a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  7. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
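    A minimal sketch of how such an endogenous breakdown mechanism might be sampled is given below. The logistic breakdown-probability curve and the Weibull duration model are illustrative placeholders standing in for the calibrated flow-rate function and hazard model referred to in the abstract; all parameter values are assumptions.

      import numpy as np

      rng = np.random.default_rng(42)

      def breakdown_probability(flow_veh_per_h_per_lane):
          """Illustrative placeholder: probability of flow breakdown in a 5-minute
          interval, increasing (logistically) with the prevailing flow rate."""
          return 1.0 / (1.0 + np.exp(-(flow_veh_per_h_per_lane - 2100.0) / 120.0))

      def breakdown_duration_min(shape=1.5, scale=20.0):
          """Illustrative placeholder: breakdown duration drawn from a Weibull model."""
          return scale * rng.weibull(shape)

      # Simulate one hour (twelve 5-minute intervals) at a fixed demand level.
      flow = 1950.0
      for interval in range(12):
          if rng.random() < breakdown_probability(flow):
              print(f"interval {interval}: breakdown lasting {breakdown_duration_min():.1f} min")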

  8. Probability as a theory dependent concept

    NARCIS (Netherlands)

    Atkinson, D; Peijnenburg, J

    1999-01-01

    It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori

  9. Improving ultrasonic measurement of diaphragmatic excursion after cardiac surgery using the anatomical M-mode: a randomized crossover study.

    Science.gov (United States)

    Pasero, Daniela; Koeltz, Adrien; Placido, Rui; Fontes Lima, Mariana; Haun, Olivia; Rienzo, Mario; Marrache, David; Pirracchio, Romain; Safran, Denis; Cholley, Bernard

    2015-04-01

    Motion-mode (MM) echography allows precise measurement of diaphragmatic excursion when the ultrasound beam is parallel to the diaphragmatic displacement. However, proper alignment is difficult to obtain in patients after cardiac surgery; thus, measurements might be inaccurate. A new imaging modality named the anatomical motion-mode (AMM) allows free placement of the cursor through numerical image reconstruction and perfect alignment with the diaphragmatic motion. Our goal was to compare MM and AMM measurements of diaphragmatic excursion in cardiac surgical patients. Cardiac surgical patients were studied after extubation. The excursions of the right and left hemidiaphragms were measured by two operators, an expert and a trainee, using MM and AMM successively, according to a blinded, randomized, crossover sequence. Values were averaged over three consecutive respiratory cycles. The angle between the MM and AMM cursors was quantified for each measurement. Fifty patients were studied. The mean (±SD) angle between the MM and AMM cursors was 37° ± 16°. The diaphragmatic excursion as measured by experts was 1.8 ± 0.7 cm using MM and 1.5 ± 0.5 cm using AMM, with MM values exceeding AMM values in 75% of the measurements. Bland-Altman analysis showed tighter limits of agreement between experts and trainees with AMM [bias: 0.0 cm; 95% confidence interval (CI): 0.8 cm] than with MM (bias: 0.0 cm; 95% CI: 1.4 cm). MM overestimates diaphragmatic excursion in comparison to AMM in cardiac surgical patients. Using MM may lead to a lack of recognition of diaphragmatic dysfunction.

  10. Effects of Video Game Training on Behavioral and Electrophysiological Measures of Attention and Memory: Protocol for a Randomized Controlled Trial

    Science.gov (United States)

    Mayas, Julia; Ruiz-Marquez, Eloisa; Prieto, Antonio; Toril, Pilar; Ponce de Leon, Laura; de Ceballos, Maria L; Reales Avilés, José Manuel

    2017-01-01

    Background Neuroplasticity-based approaches seem to offer promising ways of maintaining cognitive health in older adults and postponing the onset of cognitive decline symptoms. Although previous research suggests that training can produce transfer effects, this study was designed to overcome some limitations of previous studies by incorporating an active control group and the assessment of training expectations. Objective The main objectives of this study are (1) to evaluate the effects of a randomized computer-based intervention consisting of training older adults with nonaction video games on brain and cognitive functions that decline with age, including attention and spatial working memory, using behavioral measures and electrophysiological recordings (event-related potentials [ERPs]) just after training and after a 6-month no-contact period; (2) to explore whether motivation, engagement, or expectations might account for possible training-related improvements; and (3) to examine whether inflammatory mechanisms assessed with noninvasive measurement of C-reactive protein in saliva impair cognitive training-induced effects. A better understanding of these mechanisms could elucidate pathways that could be targeted in the future by either behavioral or neuropsychological interventions. Methods A single-blinded randomized controlled trial with an experimental group and an active control group, pretest, posttest, and 6-month follow-up repeated measures design is used in this study. A total of 75 cognitively healthy older adults were randomly distributed into experimental and active control groups. Participants in the experimental group received 16 1-hour training sessions with cognitive nonaction video games selected from Lumosity, a commercial brain training package. The active control group received the same number of training sessions with The Sims and SimCity, a simulation strategy game. Results We have recruited participants, have conducted the training protocol

  11. Effects of Video Game Training on Behavioral and Electrophysiological Measures of Attention and Memory: Protocol for a Randomized Controlled Trial.

    Science.gov (United States)

    Ballesteros, Soledad; Mayas, Julia; Ruiz-Marquez, Eloisa; Prieto, Antonio; Toril, Pilar; Ponce de Leon, Laura; de Ceballos, Maria L; Reales Avilés, José Manuel

    2017-01-24

    Neuroplasticity-based approaches seem to offer promising ways of maintaining cognitive health in older adults and postponing the onset of cognitive decline symptoms. Although previous research suggests that training can produce transfer effects, this study was designed to overcome some limitations of previous studies by incorporating an active control group and the assessment of training expectations. The main objectives of this study are (1) to evaluate the effects of a randomized computer-based intervention consisting of training older adults with nonaction video games on brain and cognitive functions that decline with age, including attention and spatial working memory, using behavioral measures and electrophysiological recordings (event-related potentials [ERPs]) just after training and after a 6-month no-contact period; (2) to explore whether motivation, engagement, or expectations might account for possible training-related improvements; and (3) to examine whether inflammatory mechanisms assessed with noninvasive measurement of C-reactive protein in saliva impair cognitive training-induced effects. A better understanding of these mechanisms could elucidate pathways that could be targeted in the future by either behavioral or neuropsychological interventions. A single-blinded randomized controlled trial with an experimental group and an active control group, pretest, posttest, and 6-month follow-up repeated measures design is used in this study. A total of 75 cognitively healthy older adults were randomly distributed into experimental and active control groups. Participants in the experimental group received 16 1-hour training sessions with cognitive nonaction video games selected from Lumosity, a commercial brain training package. The active control group received the same number of training sessions with The Sims and SimCity, a simulation strategy game. We have recruited participants, have conducted the training protocol and pretest assessments, and are

  12. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  13. Multiple trait model combining random regressions for daily feed intake with single measured performance traits of growing pigs

    Directory of Open Access Journals (Sweden)

    Künzi Niklaus

    2002-01-01

    Full Text Available A random regression model for daily feed intake and a conventional multiple-trait animal model for the four traits average daily gain on test (ADG), feed conversion ratio (FCR), carcass lean content and meat quality index were combined to analyse data from 1 449 castrated male Large White pigs performance tested in two French central testing stations in 1997. Group-housed pigs fed ad libitum with electronic feed dispensers were tested from 35 to 100 kg live body weight. A quadratic polynomial in days on test was used as a regression function for weekly means of daily feed intake and to describe its residual variance. The same fixed (batch) and random (additive genetic, pen and individual permanent environmental) effects were used for the regression coefficients of feed intake and the single measured traits. Variance components were estimated by means of a Bayesian analysis using Gibbs sampling. Four Gibbs chains were run for 550 000 rounds each, from which 50 000 rounds were discarded as the burn-in period. Estimates of posterior means of covariance matrices were calculated from the remaining two million samples. Low heritabilities of the linear and quadratic regression coefficients and their unfavourable genetic correlations with other performance traits reveal that altering the shape of the feed intake curve by direct or indirect selection is difficult.

  14. Display advertising: Estimating conversion probability efficiently

    OpenAIRE

    Safari, Abdollah; Altman, Rachel MacKay; Loughin, Thomas M.

    2017-01-01

    The goal of online display advertising is to entice users to "convert" (i.e., take a pre-defined action such as making a purchase) after clicking on the ad. An important measure of the value of an ad is the probability of conversion. The focus of this paper is the development of a computationally efficient, accurate, and precise estimator of conversion probability. The challenges associated with this estimation problem are the delays in observing conversions and the size of the data set (both...

  15. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  16. Antihypertensive treatment based on blood pressure measurement at home or in the physician's office: a randomized controlled trial.

    Science.gov (United States)

    Staessen, Jan A; Den Hond, Elly; Celis, Hilde; Fagard, Robert; Keary, Louis; Vandenhoven, Guy; O'Brien, Eoin T

    2004-02-25

    Self-measurement of blood pressure is increasingly used in clinical practice, but how it affects the treatment of hypertension requires further study. To compare use of blood pressure (BP) measurements taken in physicians' offices and at home in the treatment of patients with hypertension. Blinded randomized controlled trial conducted from March 1997 to April 2002 at 56 primary care practices and 3 hospital-based outpatient clinics in Belgium and 1 specialized hypertension clinic in Dublin, Ireland. Four hundred participants with a diastolic BP (DBP) of 95 mm Hg or more as measured at physicians' offices were enrolled and followed up for 1 year. Antihypertensive drug treatment was adjusted in a stepwise fashion based on either the self-measured DBP at home (average of 6 measurements per day during 1 week; n = 203) or the average of 3 sitting DBP readings at the physician's office (n = 197). If the DBP guiding treatment was above (>89 mm Hg), at (80-89 mm Hg), or below (<80 mm Hg) the target, treatment was intensified, left unchanged, or reduced, respectively. Outcome measures included office and home BP levels, 24-hour ambulatory BP, intensity of drug treatment, electrocardiographic and echocardiographic left ventricular mass, symptoms reported by questionnaire, and costs of treatment. At the end of the study (median follow-up, 350 days; interquartile range, 326-409 days), more home BP than office BP patients had stopped antihypertensive drug treatment (25.6% vs 11.3%), whereas office, home, and 24-hour ambulatory BP measurements were higher in the home BP group than in the office BP group. The mean baseline-adjusted systolic/diastolic differences between the home and office BP groups averaged 6.8/3.5 mm Hg, 4.9/2.9 mm Hg, and 4.9/2.9 mm Hg, respectively. Left ventricular mass and reported symptoms were similar in the 2 groups. Costs per 100 patients followed up for 1 month were only slightly lower in the home BP group (3875 vs 3522 [4921 dollars vs 4473 dollars]; P = .04). Adjustment of antihypertensive treatment based on home BP instead of office BP led to less intensive drug treatment and

  17. Effect of Kinesiology Tape on Measurements of Balance in Subjects With Chronic Ankle Instability: A Randomized Controlled Trial.

    Science.gov (United States)

    de-la-Torre-Domingo, Carlos; Alguacil-Diego, Isabel M; Molina-Rueda, Francisco; López-Román, Antonio; Fernández-Carnero, Josué

    2015-12-01

    To examine the immediate and prolonged effects (7d) of Kinesiology Tape (KT) on balance in subjects with chronic ankle instability using computerized dynamic posturography (CDP). A 7-day follow-up, single-blind randomized controlled trial. University community. Subjects (N=36) were screened against the eligibility criteria, and 30 successfully completed the study protocol. Of these, 15 were randomly assigned to the experimental group (KT: 5 men, 10 women), and 15 were assigned to the control group (placebo tape: 10 men, 5 women). The experimental group was taped for a lateral ankle sprain with KT. In the control group, a placebo tape was used. Balance was assessed under the following 3 conditions: without taping, immediately after application, and after 7 days of use. The CDP device used in this study was the Smart Equitest version 8.2. CDP analysis was conducted using the Sensory Organization Test (SOT). As primary outcome measures, the composite SOT score and composite SOT strategy were chosen. The partial score for SOT condition 2 and its strategy were considered the secondary outcome measures. Repeated-measures analysis of variance (ANOVA) demonstrated that there was no significant interaction between group and time in the composite SOT score (F=.239; P=.73), SOT condition 2 (F=.333; P=.705), and SOT strategy 2 (F=.899; P=.43). Additionally, repeated-measures ANOVA showed a significant effect for time (composite SOT score: F=40.69; P≤.01; SOT condition 2: F=4.61; P=.014; SOT strategy 2: F=.899; P=.413; composite SOT strategy: F=15.14; P≤.01). Specifically, post hoc analysis showed that both groups obtained improvements in composite SOT scores immediately after tape application and after 7 days of use. According to our results, the SOT scores of both the KT and control groups improved during follow-up. No differences between them were observed during the follow-up in most balance measurements. The observed changes may be related to a subjective increase

  18. Logical independence and quantum randomness

    Energy Technology Data Exchange (ETDEWEB)

    Paterek, T; Kofler, J; Aspelmeyer, M; Zeilinger, A; Brukner, C [Institute for Quantum Optics and Quantum Information, Austrian Academy of Sciences, Boltzmanngasse 3, A-1090 Vienna (Austria); Prevedel, R; Klimek, P [Faculty of Physics, University of Vienna, Boltzmanngasse 5, A-1090 Vienna (Austria)], E-mail: tomasz.paterek@univie.ac.at

    2010-01-15

    We propose a link between logical independence and quantum physics. We demonstrate that quantum systems in the eigenstates of Pauli group operators are capable of encoding mathematical axioms and show that Pauli group quantum measurements are capable of revealing whether or not a given proposition is logically dependent on the axiomatic system. Whenever a mathematical proposition is logically independent of the axioms encoded in the measured state, the measurement associated with the proposition gives random outcomes. This allows for an experimental test of logical independence. Conversely, it also allows for an explanation of the probabilities of random outcomes observed in Pauli group measurements from logical independence without invoking quantum theory. The axiomatic systems we study can be completed and are therefore not subject to Goedel's incompleteness theorem.

  19. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
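
    To make the straight-line relationship explicit, here is a short worked equation for the unconstrained continuous case, derived from the standard maximum-entropy argument (the symbols $\lambda$ and $\|X\|_p$ are our own notation, and this sketch is not a transcription of the paper's tables): the maximizing density is the generalized Gaussian

    $$f^*(x) = \frac{\lambda^{1/p}}{2\,\Gamma(1+1/p)}\,e^{-\lambda |x|^p}, \qquad \lambda = \frac{1}{p\,\mathbb{E}|X|^p},$$

    and its differential entropy is

    $$h_{\max} = \log \|X\|_p + \log\!\bigl(2\,\Gamma(1+1/p)\,(pe)^{1/p}\bigr), \qquad \|X\|_p = \bigl(\mathbb{E}|X|^p\bigr)^{1/p},$$

    i.e. a straight line of slope one in $\log \|X\|_p$; for $p = 2$ this reduces to the familiar Gaussian value $\log\bigl(\sigma\sqrt{2\pi e}\bigr)$.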

  20. Laplacian drop shapes and effect of random perturbations on accuracy of surface tension measurement for different drop constellations.

    Science.gov (United States)

    Saad, Sameh M I; Neumann, A Wilhelm

    2015-08-01

    Theoretical drop shapes are calculated for three drop constellations: pendant drops, constrained sessile drops, and unconstrained sessile drops. Based on total Gaussian curvature, shape parameter and critical shape parameter are discussed as a function of different drop sizes and surface tensions. The shape parameter is linked to physical parameters for every drop constellation. The as yet unavailable detailed dimensional analysis for the unconstrained sessile drop is presented. Results show that the unconstrained sessile drop shape depends on a dimensionless volume term and the contact angle. Random perturbations are introduced and the accuracy of surface tension measurement is assessed for precise and perturbed profiles of the three drop constellations. It is concluded that pendant drops are the best method for accurate surface tension measurement, followed by constrained sessile drops. The unconstrained sessile drops come last because they tend to be more spherical at low and moderate contact angles. Of course, unconstrained sessile drops are the only option if contact angles are to be measured. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
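
    For readers unfamiliar with the cumulative prospect theory parameters mentioned above, the following minimal sketch shows how a value function and a probability weighting function combine to price a simple two-outcome prospect. The functional forms and parameter values are the standard Tversky-Kahneman ones, assumed here purely for illustration; they are not estimates from this study.

    ```python
    def weight(p, gamma=0.61):
        """Tversky-Kahneman probability weighting function (gains)."""
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    def value(x, alpha=0.88):
        """Power value function for non-negative outcomes."""
        return x**alpha

    def cpt_price(outcome, p, alpha=0.88, gamma=0.61):
        """Certainty equivalent of the prospect 'outcome with probability p, else 0'."""
        subjective = weight(p, gamma) * value(outcome, alpha)
        return subjective ** (1.0 / alpha)  # invert the value function

    # A prospect "win 100 with probability 0.05" is priced above its expected
    # value (5) because small probabilities are over-weighted.
    print(cpt_price(100.0, 0.05), 100.0 * 0.05)
    ```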

  2. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available In this first paper of 1911, relating to the sex ratio at birth, Gini recast Laplace's rule of succession in a Bayesian form. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and in introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from the data.
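
    As background for readers unfamiliar with the rule being recast, the standard Beta-binomial computation (our own elementary illustration, not Gini's notation) reads: with a Beta$(a, b)$ prior on the unknown probability and $k$ successes observed in $n$ trials, the posterior predictive probability of a further success is

    $$P(\text{success} \mid k, n) = \frac{k + a}{n + a + b},$$

    which for the uniform prior $a = b = 1$ reduces to Laplace's rule of succession $(k+1)/(n+2)$.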

  3. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by (DNV, 2011) and model performance is evaluated. Also, the effects that weather forecast uncertainty has on the output Probabilities of Failure are analysed and reported....

  4. A large-area ultra-precision 2D geometrical measurement technique based on statistical random phase detection

    Science.gov (United States)

    Ekberg, Peter; Stiblert, Lars; Mattsson, Lars

    2012-03-01

    The manufacturing of high-quality chrome masks used in the display industry for the manufacturing of liquid crystals, organic light emission diodes and other display devices would not be possible without high-precision large-area metrology. In contrast to the semiconductor industry, where 6-inch masks are most common, the quartz glass masks for the manufacturing of large area TVs can have sizes of up to 1.6 × 1.8 m2. Besides the large area, there are demands for sub-micrometer accuracy in ‘registration’, i.e. absolute dimensional measurements, and nanometer requirements for ‘overlay’, i.e. repeatability. The technique for making such precise measurements on large masks is one of the most challenging tasks in dimensional metrology today. This paper presents a new approach to two-dimensional (2D) ultra-precision measurements based on random sampling. The technique was recently presented for ultra-precise one-dimensional (1D) measurement. The 1D method relies on timing the scanning of a focused laser beam 200 µm in the Y-direction from an interferometrically determined reference position. This microsweep is controlled by an acousto-optical deflector. By letting the microsweep scan from random X-positions, we can build XY-recordings through a time-to-space conversion that gives very precise maps of the feature edges of the masks. The method differs a lot from ordinary image processing methods using CCD or CMOS sensors for capturing images in the spatial domain. We use events grabbed by a single detector in the time domain in both the X- and Y-directions. After a simple scaling, we get precise and repeatable spatial information. Thanks to the extremely linear microsweep and its precise power control, spatial and intensity distortions, common in ordinary image processing systems using 2D optics and 2D sensors, can be practically eliminated. Our 2D method has proved to give a standard deviation in repeatability of less than 4 nm (1σ) in both the X- and Y

  5. The relationship, structure and profiles of schizophrenia measurements: a post-hoc analysis of the baseline measures from a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Chen Lei

    2011-12-01

    Full Text Available Background To fully assess the various dimensions affected by schizophrenia, clinical trials often include multiple scales measuring various symptom profiles, cognition, quality of life, subjective well-being, and functional impairment. In this exploratory study, we characterized the relationships among six clinical, functional, cognitive, and quality-of-life measures, identifying a parsimonious set of measurements. Methods We used baseline data from a randomized, multicenter study of patients diagnosed with schizophrenia, schizoaffective disorder, or schizophreniform disorder who were experiencing an acute symptom exacerbation (n = 628) to examine the relationship among several outcome measures. These measures included the Positive and Negative Syndrome Scale (PANSS), Montgomery-Asberg Depression Rating Scale (MADRS), Brief Assessment of Cognition in Schizophrenia Symbol Coding Test, Subjective Well-being Under Neuroleptics Scale Short Form (SWN-K), Schizophrenia Objective Functioning Instrument (SOFI), and Quality of Life Scale (QLS). Three analytic approaches were used: (1) path analysis; (2) factor analysis; and (3) categorical latent variable analysis. In the optimal path model, the SWN-K was selected as the final outcome, while the SOFI mediated the effect of the exogenous variables (PANSS, MADRS) on the QLS. Results The overall model explained 47% of the variance in QLS and 17% of the variance in SOFI, but only 15% in SWN-K. Factor analysis suggested four factors: "Functioning," "Daily Living," "Depression," and "Psychopathology." A strong positive correlation was observed between the SOFI and QLS (r = 0.669), and both the QLS and SOFI loaded on the "Functioning" factor, suggesting redundancy between these scales. The measurement profiles from the categorical latent variable analysis showed significant variation in functioning and quality of life despite similar levels of psychopathology. Conclusions Researchers should consider collecting PANSS, SOFI, and

  6. Universal randomness

    Energy Technology Data Exchange (ETDEWEB)

    Dotsenko, Viktor S [Landau Institute for Theoretical Physics, Russian Academy of Sciences, Moscow (Russian Federation)

    2011-03-31

    In the last two decades, it has been established that a single universal probability distribution function, known as the Tracy-Widom (TW) distribution, in many cases provides a macroscopic-level description of the statistical properties of microscopically different systems, including both purely mathematical ones, such as increasing subsequences in random permutations, and quite physical ones, such as directed polymers in random media or polynuclear crystal growth. In the first part of this review, we use a number of models to examine this phenomenon at a simple qualitative level and then consider the exact solution for one-dimensional directed polymers in a random environment, showing that free energy fluctuations in such a system are described by the universal TW distribution. The second part provides detailed appendix material containing the necessary mathematical background for the first part. (reviews of topical problems)
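
    The TW distribution can be visualized directly by sampling the largest eigenvalue of large random Hermitian matrices. The sketch below uses the GUE with the standard centering and scaling $(\lambda_{\max} - 2\sqrt{N})\,N^{1/6}$; it is a numerical illustration of the universality claim, not the directed-polymer calculation reviewed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def gue_largest_eigenvalue(n):
        """Largest eigenvalue of an n x n GUE matrix."""
        a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        h = (a + a.conj().T) / 2.0          # Hermitian matrix with GUE statistics
        return np.linalg.eigvalsh(h)[-1]

    n, samples = 200, 500
    lam = np.array([gue_largest_eigenvalue(n) for _ in range(samples)])

    # Centered and scaled, the histogram of s approximates the Tracy-Widom law
    # for beta = 2 (mean about -1.77, standard deviation about 0.90).
    s = (lam - 2.0 * np.sqrt(n)) * n ** (1.0 / 6.0)
    print(s.mean(), s.std())
    ```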

  7. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by asse...

  8. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Semkow, Thomas M. E-mail: semkow@wadsworth.org

    1999-11-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
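
    A minimal Monte Carlo sketch of the mechanism described above: when the detection probability itself fluctuates from one counting interval to the next (drawn here from a Beta distribution, an assumption made only for illustration), the counts become overdispersed relative to the fixed-probability binomial/Poisson case.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_source, p_mean, trials = 10_000, 0.02, 50_000

    # Fixed detection probability: counts are binomial, essentially Poisson here.
    fixed = rng.binomial(n_source, p_mean, size=trials)

    # Lexis-type fluctuation: p varies between intervals (Beta with the same mean).
    p_fluct = rng.beta(20.0, 20.0 * (1 - p_mean) / p_mean, size=trials)
    fluct = rng.binomial(n_source, p_fluct)

    for name, counts in [("fixed p", fixed), ("fluctuating p", fluct)]:
        print(f"{name}: mean={counts.mean():.1f}  var/mean={counts.var() / counts.mean():.2f}")
    # var/mean stays near 1 for fixed p but greatly exceeds 1 when p fluctuates.
    ```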

  9. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.

  10. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  11. TRUNCATED RANDOM MEASURES

    Science.gov (United States)

    2018-01-12

    literature of sequential representations along the way. We classify these representations into two major groups: series representations, which are... that we have not yet experienced or collected data on. In this project, we have developed new model representations that enable fast and efficient...

  12. BINARY PSEUDO-RANDOM GRATING AS A STANDARD TEST SURFACE FOR MEASUREMENT OF MODULATION TRANSFER FUNCTION OF INTERFEROMETRIC MICROSCOPES.

    Energy Technology Data Exchange (ETDEWEB)

    YASHCHUK,V.V.; MCKINNEY, W.R.; TAKACS, P.Z.

    2007-08-01

    The task of designing high performance X-ray optical systems requires the development of sophisticated X-ray scattering calculations based on rigorous information about the optics. One of the most insightful approaches to these calculations is based on the power spectral density (PSD) distribution of the surface height. The major problem of measurement of a PSD distribution with an interferometric and/or atomic force microscope arises due to the unknown Modulation Transfer Function (MTF) of the instruments. The MTF characterizes the perturbation of the PSD distribution at higher spatial frequencies. Here, we describe a new method and dedicated test surfaces for calibration of the MTF of a microscope. The method is based on use of a specially designed Binary Pseudo-random (BPR) grating. Comparison of a theoretically calculated PSD spectrum of a BPR grating with a spectrum measured with the grating provides the desired calibration of the instrumental MTF. The theoretical background of the method, as well as results of experimental investigations are presented.
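
    The calibration idea lends itself to a one-dimensional numerical sketch: a binary pseudo-random profile has a flat (white) inherent PSD, so dividing the PSD measured through an instrument by the known theoretical PSD recovers the squared MTF. The Gaussian blur standing in for the instrument response below, and the use of plain random +/- heights instead of a maximal-length sequence, are simplifying assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, n_realizations, sigma_px = 4096, 200, 3.0

    # Instrument response: a Gaussian MTF applied in the Fourier domain
    # (a stand-in for the unknown MTF being calibrated).
    freqs = np.fft.rfftfreq(n)
    true_mtf = np.exp(-2.0 * np.pi**2 * sigma_px**2 * freqs**2)

    psd_sum = np.zeros_like(freqs)
    for _ in range(n_realizations):
        profile = rng.choice([-1.0, 1.0], size=n)    # binary pseudo-random profile
        measured = np.fft.irfft(np.fft.rfft(profile) * true_mtf, n)
        psd_sum += np.abs(np.fft.rfft(measured)) ** 2 / n

    # The theoretical PSD of a unit-variance binary random profile is flat (1 per
    # bin), so the averaged measured PSD directly estimates MTF squared.
    mtf_estimate = np.sqrt(psd_sum / n_realizations)
    print(np.max(np.abs(mtf_estimate - true_mtf)))   # small; shrinks with averaging
    ```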

  13. Effect of oral contraceptive with and without associated estriol on ultrasound measurements of breast fibroadenoma: randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Rodrigo Augusto Fernandes Estevão

    Full Text Available CONTEXT AND OBJECTIVE: Fibroadenomas are the most common benign tumors of the female breast. The aim of this study was to evaluate the proliferative activity of breast fibroadenoma as shown by ultrasound measurements, following administration of oral contraceptives with and without associated estriol. DESIGN AND SETTING: This was a randomized, double-blind, placebo-controlled clinical trial carried out in the Mastology Sector, Department of Gynecology, Universidade Federal de São Paulo. METHODS: We studied 33 women with fibroadenomas. Ten were placed in group 1 and took an oral contraceptive consisting of levonorgestrel and ethinyl estradiol together with placebo material in the same capsule, for four consecutive cycles with a seven-day interval between them. The other 23 patients constituted group 2 and took the oral contraceptive as above together with estriol in the same capsule, in the same way as done by the group 1 patients. We took ultrasound measurements of their tumors (in three dimensions) before and after the intake of medication. At the end of the study, all the patients had their tumors removed by surgery. RESULTS: We observed decreased fibroadenoma width among the users of oral contraceptives with placebo, and this decrease was statistically significant. In the other group, we did not observe any changes (in width, length or height). CONCLUSION: The results confirm that estriol may block the protective effect of oral contraceptives on fibroadenomas, since we observed decreased fibroadenoma width among the group 1 patients but not the group 2 patients.

  14. Enumeration of Escherichia coli cells on chicken carcasses as a potential measure of microbial process control in a random selection of slaughter establishments in the United States

    Science.gov (United States)

    The purpose of this study was to evaluate whether the measurement of Escherichia coli levels at two points during the chicken slaughter process has utility as a measure of quality control. A one-year survey was conducted during 2004 and 2005 in 20 randomly selected United States chicken slaught...

  15. Compliance of asthmatic families with a primary prevention programme of asthma and effectiveness of measures to reduce inhalant allergens--a randomized trial.

    NARCIS (Netherlands)

    Schonberger, H.J.; Maas, T.; Dompeling, E.C.; Knottnerus, J.A.; Weel, C. van; Schayck, C.P. van

    2004-01-01

    BACKGROUND: Compliance with, and the effect of, pre- and postnatal exposure reduction measures to prevent asthma in high-risk children from asthmatic families were studied. METHOD: Families were randomized to a special care group (n=222) and a control group (n=221). Educational advice on measures to

  16. The Effect of Patella Eversion on Clinical Outcome Measures in Simultaneous Bilateral Total Knee Arthroplasty: A Prospective Randomized Controlled Trial.

    Science.gov (United States)

    Zan, Pengfei; Wu, Zhong; Yu, Xiao; Fan, Lin; Xu, Tianyang; Li, Guodong

    2016-03-01

    During total knee arthroplasty (TKA), surgical exposure requires a mobilization technique for the patella. With this trial, we intended to investigate the effect of patella eversion on clinical outcome measures in simultaneous bilateral TKA. We prospectively enrolled 44 patients (88 knees) from April 2008 to June 2014. One knee was operated on with patella eversion (group A) and the other with patella lateral retraction (group B), assigned randomly. Follow-up results, including the operation time, complications, and the time to achieving straight leg raise (SLR) and 90° knee flexion, were recorded. The data on range of motion (ROM) and Visual Analogue Scale score were collected separately at 7 days, 3 months, 6 months, and 1 year postoperatively. The time to achieving SLR was 2.7 ± 0.8 days in group A and 2.1 ± 0.7 days in group B, which was significantly different (P = .032). A significant difference was found in active and passive ROM during the follow-up times between groups A and B, except the passive ROM at 6 months postoperatively. No significant difference was found in operation time, complications, patella baja or tilt, time to achieving 90° knee flexion, or Visual Analogue Scale score during the follow-up times. Patellar eversion was adverse to early knee function recovery after TKA: it delayed the time to achieving SLR and decreased the passive and active ROM. More carefully and rigorously designed randomized controlled trials are still required to confirm this finding. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  18. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted to Gumbel distributions, and these fits are found to represent the measured data with good accuracy. The pressure distributions found have been used in a calibration of partial factors, which should achieve a certain theoretical target reliability index. For a target annual reliability index of 4
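
    A hedged sketch of the fitting step described above: fit a Gumbel distribution to a sample of extreme pressure coefficients and read off a characteristic value at a chosen fractile. The synthetic data and the 98% fractile used here are assumptions for illustration, not the CAARC wind-tunnel measurements.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Synthetic stand-in for measured extreme pressure coefficients (one maximum
    # per wind-tunnel run); real data would come from the CAARC model tests.
    cp_extremes = stats.gumbel_r.rvs(loc=1.2, scale=0.15, size=60, random_state=rng)

    loc, scale = stats.gumbel_r.fit(cp_extremes)              # maximum-likelihood fit
    cp_characteristic = stats.gumbel_r.ppf(0.98, loc, scale)  # 98% fractile

    print(f"fitted loc={loc:.3f}, scale={scale:.3f}, "
          f"characteristic c_p (98% fractile)={cp_characteristic:.3f}")
    ```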

  19. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets where each pixel is a random variable with multiple samples, which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and we present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.

  20. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  1. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
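
    The exceedance-probability computation described above can be sketched in a few lines: given an RMS response predicted for a candidate pair of damping ratio and natural frequency, a zero-mean Gaussian assumption gives the probability that the displacement stays within a vibration criterion. The single-frequency transmissibility model and all numerical values below are illustrative assumptions, not the paper's model of the city background spectrum.

    ```python
    import numpy as np
    from scipy.stats import norm

    def rms_response(f_n, zeta, f_exc=2.0, base_rms_um=0.5):
        """RMS displacement response (um) of a SDOF isolator to narrow-band base
        excitation at f_exc, using the classical transmissibility formula."""
        r = f_exc / f_n
        t = np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r**2) ** 2 + (2 * zeta * r) ** 2))
        return t * base_rms_um

    def prob_within_criterion(criterion_um, f_n, zeta):
        """P(|displacement| < criterion) for a zero-mean Gaussian response."""
        sigma = rms_response(f_n, zeta)
        return 2.0 * norm.cdf(criterion_um / sigma) - 1.0

    for f_n in (0.5, 1.0, 2.0):      # natural frequency, Hz
        for zeta in (0.05, 0.2):     # damping ratio
            p = prob_within_criterion(1.0, f_n, zeta)
            print(f"f_n={f_n:.1f} Hz, zeta={zeta:.2f}: P(within criterion)={p:.3f}")
    ```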

  2. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.

  3. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
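
    A two-event worked example of the constraint (our own elementary illustration): both conditionals determine the same joint mass, so

    $$P(A \mid B)\,P(B) \;=\; P(A \cap B) \;=\; P(B \mid A)\,P(A), \qquad\text{hence}\qquad \frac{P(A)}{P(B)} = \frac{P(A \mid B)}{P(B \mid A)},$$

    and, for instance, the pair $P(A \mid B) = 1$, $P(B \mid A) = 0$ is incompatible with any joint distribution in which both $P(A)$ and $P(B)$ are positive.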

  4. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fit...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  5. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  6. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  7. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  8. Asymptotic Theory for the Probability Density Functions in Burgers Turbulence

    CERN Document Server

    Weinan, E; Eijnden, Eric Vanden

    1999-01-01

    A rigorous study is carried out for the randomly forced Burgers equation in the inviscid limit. No closure approximations are made. Instead the probability density functions of velocity and velocity gradient are related to the statistics of quantities defined along the shocks. This method allows one to compute the anomalies, as well as asymptotics for the structure functions and the probability density functions. It is shown that the left tail for the probability density function of the velocity gradient has to decay faster than $|\\xi|^{-3}$. A further argument confirms the prediction of E et al., Phys. Rev. Lett. {\\bf 78}, 1904 (1997), that it should decay as $|\\xi|^{-7/2}$.

  9. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  10. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
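
    As a minimal numerical illustration of what quantizing a probability distribution means (an illustration of the general problem, not material from the book), Lloyd's algorithm, equivalently k-means on samples, finds an approximately optimal set of quantization points for a standard normal distribution:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    samples = rng.normal(size=100_000)   # distribution to be quantized
    k = 8                                # number of quantization points

    # Lloyd's algorithm: alternate nearest-point assignment and centroid update.
    points = np.sort(rng.choice(samples, size=k, replace=False))
    for _ in range(50):
        idx = np.argmin(np.abs(samples[:, None] - points[None, :]), axis=1)
        points = np.array([samples[idx == j].mean() for j in range(k)])

    # Mean squared quantization error, estimated from the samples.
    idx = np.argmin(np.abs(samples[:, None] - points[None, :]), axis=1)
    error = np.mean((samples - points[idx]) ** 2)
    print("quantization points:", np.round(np.sort(points), 3))
    print("mean squared error:", round(float(error), 5))
    ```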

  11. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  12. Atomic Transition Probabilities in TiI

    Science.gov (United States)

    Nitz, David E.; Siewert, Lowell K.; Schneider, Matthew N.

    2001-05-01

    We have measured branching fractions and atomic transition probabilities in TiI for 50 visible and near-IR transitions which connect odd-parity levels lying 25000 cm-1 to 27000 cm-1 above the ground state to low-lying even parity levels. Branching fractions are obtained from the analysis of six hollow cathode emission spectra recorded using the Fourier transform spectrometer at the National Solar Observatory, supplemented in cases susceptible to radiation-trapping problems by conventional emission spectroscopy using a commercial sealed lamp operated at very low discharge current. The absolute scale for normalizing the branching fractions is established using radiative lifetimes from time-resolved laser-induced fluorescence measurements.(S. Salih and J.E. Lawler, Astronomy and Astrophysics 239, 407 (1990).) Uncertainties of the transition probabilities range from ±5% for the stronger branches to ±20% for the weaker ones. Among the 16 lines for which previously-measured transition probabilities are listed in the NIST critical compilation,(G. A. Martin, J. R. Fuhr, and W. L. Wiese, J. Phys. Chem. Ref. Data 17, Suppl. 3, 85 (1988).) several significant discrepancies are noted.
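
    The normalization step described above follows the standard relation between a branching fraction, the upper-level radiative lifetime, and the transition probability, $A_{ul} = \mathrm{BF}_{ul} / \tau_u$. The numbers in the sketch below are made up purely to show the arithmetic; they are not the TiI values reported in the paper.

    ```python
    # Transition probabilities from branching fractions and an upper-level
    # radiative lifetime: A_ul = BF_ul / tau_u.  Values are illustrative only.
    tau_upper_s = 25e-9                  # radiative lifetime of the upper level (s)
    branching_fractions = {              # hypothetical lines from this upper level
        "line_500nm": 0.62,
        "line_520nm": 0.25,
        "line_950nm": 0.13,
    }

    assert abs(sum(branching_fractions.values()) - 1.0) < 1e-9  # BFs sum to one

    for line, bf in branching_fractions.items():
        a_ul = bf / tau_upper_s          # transition probability (s^-1)
        print(f"{line}: A = {a_ul:.3e} s^-1")
    ```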

  13. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  14. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
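
    The guessing strategy described above is easy to state in code for small cases: enumerate all words of a given length, compute their probabilities from the letter probabilities (a first-order model), sort in decreasing order, and average the guess index. The three-letter alphabet and its probabilities below are toy assumptions, not the paper's language models.

    ```python
    import itertools
    import numpy as np

    letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # toy first-order model
    word_length = 6

    # Probability of every possible word (letters drawn independently).
    probs = np.array([
        np.prod([letter_probs[c] for c in word])
        for word in itertools.product(letter_probs, repeat=word_length)
    ])

    # Guess words in decreasing order of probability; the average number of
    # guesses is the sum over ranks i (1-based) of i * p_(i).
    order = np.sort(probs)[::-1]
    expected_guesses = np.sum(np.arange(1, len(order) + 1) * order)
    print(f"{len(order)} possible words, expected guesses = {expected_guesses:.2f}")
    ```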

  15. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  16. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1

  17. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  18. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1

  19. Statistical methods for solar flare probability forecasting

    Science.gov (United States)

    Vecchia, D. F.; Tryon, P. V.; Caldwell, G. A.; Jones, R. W.

    1980-09-01

    The Space Environment Services Center (SESC) of the National Oceanic and Atmospheric Administration provides probability forecasts of regional solar flare disturbances. This report describes a statistical method useful for obtaining 24-hour solar flare forecasts which, historically, have been subjectively formulated. In Section 1 of this report, flare classifications of the SESC and the particular probability forecasts to be considered are defined. In Section 2 we describe the solar flare data base and outline general principles for effective data management. Three statistical techniques for solar flare probability forecasting are discussed in Section 3, viz., discriminant analysis, logistic regression, and multiple linear regression. We also review two scoring measures and suggest the logistic regression approach for obtaining 24-hour forecasts. In Section 4 a heuristic procedure is used to select nine basic predictors from the many available explanatory variables. Using these nine variables, logistic regression is demonstrated by example in Section 5. We conclude in Section 6 with broad suggestions regarding continued development of objective methods for solar flare probability forecasting.
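
    As a hedged sketch of the recommended logistic-regression approach, with made-up predictors standing in for the nine solar-activity variables selected in the report, a 24-hour probability forecast can be produced as follows.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)

    # Synthetic stand-ins for flare predictors; a real forecast would use the
    # nine region-based variables chosen in the report.
    n_days = 2000
    X = rng.normal(size=(n_days, 3))
    logit = -2.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1]
    y = rng.random(n_days) < 1.0 / (1.0 + np.exp(-logit))   # 1 = flare occurred

    model = LogisticRegression().fit(X, y)

    # Probability forecast for tomorrow, given today's observed predictor values.
    tomorrow = np.array([[1.0, 0.5, -0.2]])
    print(f"forecast flare probability: {model.predict_proba(tomorrow)[0, 1]:.2f}")
    ```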

  20. The effect of time, temperature and storage device on umbilical cord blood gas and lactate measurement: a randomized controlled trial.

    Science.gov (United States)

    White, Christopher R H; Mok, Tabitha; Doherty, Dorota A; Henderson, Jennifer J; Newnham, John P; Pennell, Craig E

    2012-06-01

    Umbilical cord blood gas analysis has a significant and growing role in early neonatal assessment. Factors often delay analysis of cord blood, allowing values to change. Consequently, this study evaluates the impact of time, temperature and method of storage on umbilical blood gas and lactate analyses. Umbilical cord segments from 80 singleton deliveries were randomized to: cords at room temperature (CR), cords stored on ice (CI), syringes at room temperature (SR) or syringes stored on ice (SI). Analysis occurred every 15 minutes for one hour. Mixed model analysis of variance allowing for repeated measures was utilized. Cord arterial pH deteriorated in CR, CI, and SI within 15 minutes (p ≤ 0.001), with SR stable until 60 minutes (p = 0.002). Arterial pCO2 remained stable in SR and CI, increased in SI (p = 0.002; 45 minutes) and decreased in CR (p < …). Cord blood gas values change rapidly after delivery. The smallest changes were seen in the SR group. Data suggest that analyses should be conducted as soon as possible after delivery.

  1. Capture-recapture method for estimating misclassification errors: application to the measurement of vaccine efficacy in randomized controlled trials.

    Science.gov (United States)

    Simondon, F; Khodja, H

    1999-02-01

    The measure of efficacy is optimally performed by randomized controlled trials. However, low specificity of the judgement criteria is known to bias toward lower estimation, while low sensitivity increases the required sample size. A common technique for ensuring good specificity without a drop in sensitivity is to use several diagnostic tests in parallel, with each of them being specific. This approach is similar to the more general situation of case-counting from multiple data sources, and this paper explores the application of the capture-recapture method for the analysis of the estimates of efficacy. An illustration of this application is derived from a study on the efficacy of pertussis vaccines where the outcome was based on ≥21 days of cough confirmed by at least one of three criteria performed independently for each subject: bacteriology, serology, or epidemiological link. Log-linear methods were applied to these data considered as three sources of information. The best model considered the three simple effects and an interaction term between bacteriology and epidemiological linkage. Among the 801 children experiencing ≥21 days of cough, it was estimated that 93 cases were missed, leading to a corrected total of 413 confirmed cases. The relative vaccine efficacy estimated from the same model was 1.50 (95% confidence interval: 1.24-1.82), similar to the crude estimate of 1.59 and confirming better protection afforded by one of the two vaccines. This method provides a supporting analysis for interpreting primary estimates of vaccine efficacy.
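
    For readers unfamiliar with the capture-recapture idea being carried over to case counting, the simplest two-source estimator (Lincoln-Petersen, assuming independent sources) is shown below with hypothetical counts; the paper's actual analysis uses three sources and log-linear models with an interaction term.

    ```python
    # Two-source capture-recapture (Lincoln-Petersen) with hypothetical counts;
    # the sources could be, e.g., bacteriology and serology confirming cases.
    n1 = 180        # cases detected by source 1
    n2 = 150        # cases detected by source 2
    m = 90          # cases detected by both sources

    n_total_est = n1 * n2 / m        # estimated total number of true cases
    observed = n1 + n2 - m           # cases seen by at least one source
    missed = n_total_est - observed

    print(f"observed cases: {observed}")
    print(f"estimated total: {n_total_est:.0f}, estimated missed: {missed:.0f}")
    ```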

  2. Palm theory for random time changes

    Directory of Open Access Journals (Sweden)

    Masakiyo Miyazawa

    2001-01-01

    Full Text Available Palm distributions are basic tools when studying stationarity in the context of point processes, queueing systems, fluid queues or random measures. The framework varies with the random phenomenon of interest, but usually a one-dimensional group of measure-preserving shifts is the starting point. In the present paper, by alternatively using a framework involving random time changes (RTCs) and a two-dimensional family of shifts, we are able to characterize all of the above systems in a single framework. Moreover, this leads to what we call the detailed Palm distribution (DPD), which is stationary with respect to a certain group of shifts. The DPD has a very natural interpretation as the distribution seen at a randomly chosen position on the extended graph of the RTC, and satisfies a general duality criterion: the DPD of the DPD gives the underlying probability P in return.

  3. Do cognitive measures and brain circuitry predict outcomes of exercise in Parkinson Disease: a randomized clinical trial.

    Science.gov (United States)

    King, L A; Peterson, D S; Mancini, M; Carlson-Kuhta, P; Fling, B W; Smulders, K; Nutt, J G; Dale, M; Carter, J; Winters-Stone, K M; Horak, F B

    2015-10-24

    There is emerging research detailing the relationship between balance/gait/falls and cognition. Imaging studies also suggest a link between structural and functional changes in the frontal lobe (a region commonly associated with cognitive function) and mobility. People with Parkinson's disease have important changes in cognitive function that may impact rehabilitation efficacy. Our underlying hypothesis is that cognitive function and frontal lobe connections with the basal ganglia and brainstem posture/locomotor centers are responsible for postural deficits in people with Parkinson's disease and play a role in rehabilitation efficacy. The purpose of this study is to 1) determine if people with Parkinson's disease can improve mobility and/or cognition after partaking in a cognitively challenging mobility exercise program and 2) determine if cognition and brain circuitry deficits predict responsiveness to exercise rehabilitation. This study is a randomized cross-over controlled intervention to take place at a University Balance Disorders Laboratory. The study participants will be people with Parkinson's disease who meet inclusion criteria for the study. The intervention will be 6 weeks of group exercise (case) and 6 weeks of group education (control). The exercise is a cognitively challenging program based on the Agility Boot Camp for people with PD. The education program is a 6-week program to teach people how to better live with a chronic disease. The primary outcome measure is the MiniBESTest and the secondary outcomes are measures of mobility, cognition and neural imaging. The results from this study will further our understanding of the relationship between cognition and mobility with a focus on brain circuitry as it relates to rehabilitation potential. This trial is registered at clinicaltrials.gov (NCT02231073).

  4. Magnetic Resonance Spectroscopic Imaging and Volumetric Measurements of the Brain in Patients with Postcancer Fatigue: A Randomized Controlled Trial

    Science.gov (United States)

    Prinsen, Hetty; Heerschap, Arend; Bleijenberg, Gijs; Zwarts, Machiel J.; Leer, Jan Willem H.; van Asten, Jack J.; van der Graaf, Marinette; Rijpkema, Mark; van Laarhoven, Hanneke W. M.

    2013-01-01

    Background Postcancer fatigue is a frequently occurring problem, impairing quality of life. Until now, little is known about (neuro) physiological factors determining postcancer fatigue. For non-cancer patients with chronic fatigue syndrome, certain characteristics of brain morphology and metabolism have been identified in previous studies. We investigated whether these volumetric and metabolic traits are a reflection of fatigue in general and thus also of importance for postcancer fatigue. Methods Fatigued patients were randomly assigned to either the intervention condition (cognitive behavior therapy) or the waiting list condition. Twenty-five patients in the intervention condition and fourteen patients in the waiting list condition were assessed twice, at baseline and six months later. Baseline measurements of 20 fatigued patients were compared with 20 matched non-fatigued controls. All participants had completed treatment of a malignant, solid tumor at least one year earlier. Global brain volumes, subcortical brain volumes, metabolite tissue concentrations, and metabolite ratios were primary outcome measures. Results Volumetric and metabolic parameters were not significantly different between fatigued and non-fatigued patients. Change scores of volumetric and metabolic parameters from baseline to follow-up were not significantly different between patients in the therapy and the waiting list group. Patients in the therapy group reported a significantly larger decrease in fatigue scores than patients in the waiting list group. Conclusions No relation was found between postcancer fatigue and the studied volumetric and metabolic markers. This may suggest that, although postcancer fatigue and chronic fatigue syndrome show strong resemblances as a clinical syndrome, the underlying physiology is different. Trial Registration ClinicalTrials.gov NCT01096641 PMID:24040301

  5. Magnetic resonance spectroscopic imaging and volumetric measurements of the brain in patients with postcancer fatigue: a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Hetty Prinsen

    Full Text Available BACKGROUND: Postcancer fatigue is a frequently occurring problem, impairing quality of life. Until now, little is known about (neuro)physiological factors determining postcancer fatigue. For non-cancer patients with chronic fatigue syndrome, certain characteristics of brain morphology and metabolism have been identified in previous studies. We investigated whether these volumetric and metabolic traits are a reflection of fatigue in general and thus also of importance for postcancer fatigue. METHODS: Fatigued patients were randomly assigned to either the intervention condition (cognitive behavior therapy) or the waiting list condition. Twenty-five patients in the intervention condition and fourteen patients in the waiting list condition were assessed twice, at baseline and six months later. Baseline measurements of 20 fatigued patients were compared with 20 matched non-fatigued controls. All participants had completed treatment of a malignant, solid tumor at least one year earlier. Global brain volumes, subcortical brain volumes, metabolite tissue concentrations, and metabolite ratios were primary outcome measures. RESULTS: Volumetric and metabolic parameters were not significantly different between fatigued and non-fatigued patients. Change scores of volumetric and metabolic parameters from baseline to follow-up were not significantly different between patients in the therapy and the waiting list group. Patients in the therapy group reported a significantly larger decrease in fatigue scores than patients in the waiting list group. CONCLUSIONS: No relation was found between postcancer fatigue and the studied volumetric and metabolic markers. This may suggest that, although postcancer fatigue and chronic fatigue syndrome show strong resemblances as a clinical syndrome, the underlying physiology is different. TRIAL REGISTRATION: ClinicalTrials.gov NCT01096641.

  6. Direct Measurement of Static and Dynamic Contact Angles Using a Random Micromodel Considering Geological CO2 Sequestration

    Directory of Open Access Journals (Sweden)

    Mohammad Jafari

    2017-12-01

    Full Text Available The pore-level two-phase fluid flow mechanism needs to be understood for geological CO2 sequestration as a solution to mitigate anthropogenic emission of carbon dioxide. Capillary pressure at the water–CO2 interface influences CO2 injectability, capacity, and safety of the storage system. Wettability, usually measured by contact angle, is always a major uncertainty source among the important parameters affecting capillary pressure. The contact angle is mostly determined on a flat surface as a representative of the rock surface. However, a simple and precise method for determining the in situ contact angle at pore scale is needed to simulate fluid flow in porous media. Recent progress in X-ray tomography techniques has provided a robust way to measure the in situ contact angle of rocks. However, slow imaging and complicated image processing make it impossible to measure dynamic contact angles. In the present paper, a series of static and dynamic contact angles, as well as contact angles on a flat surface, were measured inside a micromodel with a random pattern of channels under high-pressure conditions. Our results showed a wide range of pore-scale contact angles, implying complexity of the pore-scale contact angle even in a highly smooth and chemically homogeneous glass micromodel. Receding contact angle (RCA) showed more reproducibility compared to advancing contact angle (ACA) and static contact angle (SCA) for repeated tests and during both drainage and imbibition. With decreasing pore size, RCA increased. The hysteresis of the dynamic contact angle (ACA–RCA) was higher at a pressure of one megapascal than at eight megapascals. The CO2 bubble had higher mobility at greater depths due to lower hysteresis, which is unfavorable. CO2 bubbles resting on the flat surface of the micromodel channel showed a wide range of contact angles. They were much higher than reported contact angle values observed with sessile drop or captive bubble tests on a

  7. Ergodicity of Random Walks on Random DFA

    OpenAIRE

    Balle, Borja

    2013-01-01

    Given a DFA, we consider the random walk that starts at the initial state and at each time step moves to a new state by taking a random transition from the current state. This paper shows that for a typical DFA this random walk induces an ergodic Markov chain. The notion of a typical DFA is formalized by showing that ergodicity holds with high probability when a DFA is sampled uniformly at random from the set of all automata with a fixed number of states. We also show the same result applies to DF...
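
    A minimal empirical sketch of the setup described above: it samples uniformly random DFAs, builds the induced random walk, and checks irreducibility on the reachable state set (a necessary ingredient of ergodicity). The state and alphabet sizes and the trial count are arbitrary choices, and this is not the paper's proof technique.

```python
import random

def random_dfa(n_states, alphabet_size, rng):
    """Uniformly random transition table: delta[state][symbol] -> state."""
    return [[rng.randrange(n_states) for _ in range(alphabet_size)]
            for _ in range(n_states)]

def reachable(delta, start):
    """States reachable from `start` under the random walk."""
    seen, stack = {start}, [start]
    while stack:
        s = stack.pop()
        for t in delta[s]:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def walk_is_irreducible(delta, start):
    """The induced chain is irreducible on its reachable set iff every
    reachable state can reach every other reachable state."""
    reach = reachable(delta, start)
    return all(reach <= reachable(delta, s) for s in reach)

rng = random.Random(0)
trials = 200
hits = sum(walk_is_irreducible(random_dfa(50, 2, rng), 0) for _ in range(trials))
print(f"irreducible on reachable states in {hits}/{trials} random DFAs")
```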

  8. Atomic transition probabilities of Er i

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Ave., Madison, WI 53706 (United States); Wyart, J-F, E-mail: jelawler@wisc.ed, E-mail: jean-francois.wyart@lac.u-psud.f, E-mail: eadenhar@wisc.ed [Laboratoire Aime Cotton, CNRS (UPR3321), Bat. 505, Centre Universitaire Paris-Sud, 91405-Orsay (France)

    2010-12-14

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er i) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.
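
    The normalization step mentioned above amounts to A_ul = BF_ul / tau_u, where the branching fractions of one upper level sum to unity. The sketch below illustrates this arithmetic with made-up numbers, not measured Er I values.

```python
# Emission transition probabilities from branching fractions and an upper-level
# radiative lifetime: A_ul = BF_ul / tau_u, with the BFs of one upper level
# summing to 1.  The numbers below are placeholders, not measured Er I data.
tau_upper_s = 1.2e-7                      # radiative lifetime (s), assumed
branching_fractions = {"line_a": 0.55,    # assumed BFs for three decay channels
                       "line_b": 0.30,
                       "line_c": 0.15}

assert abs(sum(branching_fractions.values()) - 1.0) < 1e-9
for line, bf in branching_fractions.items():
    A = bf / tau_upper_s                  # transition probability (s^-1)
    print(f"{line}: A = {A:.3e} s^-1")
```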

  9. On the tail of the overlap probability distribution in the Sherrington-Kirkpatrick model

    CERN Document Server

    Billoire, A; Marinari, E

    2003-01-01

    We investigate the large deviation behaviour of the overlap probability density in the Sherrington-Kirkpatrick (SK) model using the coupled replica scheme, and we compare with the results of a large-scale numerical simulation. In the spin glass phase we show that, generically, for any model with continuous replica symmetry breaking (RSB), $\frac{1}{N}\log P_N(q) \approx -A(|q| - q_{EA})^3$, and we compute the first correction to the expansion of $A$ in powers of $T_c - T$ for the SK model. We also study the paramagnetic phase, where results are obtained in the replica symmetric scheme that do not involve an expansion in powers of $q - q_{EA}$ or $T_c - T$. Finally we give precise semi-analytical estimates of $P(|q| = 1)$. The overall agreement between the various points of view is very satisfactory.
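
    A small numeric sketch of what the cubic large-deviation form implies for the tail weight at different system sizes; the amplitude $A$ and $q_{EA}$ used here are assumed round numbers, not the fitted SK values.

```python
import math

def log_tail_prob(q, n_spins, amplitude, q_ea):
    """Large-deviation prediction log P_N(q) ~ -N * A * (|q| - q_EA)^3
    for |q| > q_EA (illustrative constants, not fitted values)."""
    excess = max(abs(q) - q_ea, 0.0)
    return -n_spins * amplitude * excess ** 3

A, q_ea = 1.0, 0.5            # assumed amplitude and Edwards-Anderson overlap
for n in (64, 256, 1024):
    print(n, f"P_N(0.9) ~ {math.exp(log_tail_prob(0.9, n, A, q_ea)):.3e}")
```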

  10. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Itô differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
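
    As a reminder of what a large deviation principle asserts in the simplest setting (a standard textbook example, not taken from the book itself), the sketch below evaluates Cramér's rate function for the sample mean of fair coin flips, so that P(mean ≈ x) decays like exp(-n·I(x)).

```python
import math

def bernoulli_rate(x, p):
    """Cramér rate function I(x) for the sample mean of Bernoulli(p) variables:
    P(mean ~ x) ~ exp(-n * I(x)).  Standard textbook example, not taken from
    the book under review."""
    if x in (0.0, 1.0):
        return -math.log(p if x == 1.0 else 1.0 - p)
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

p = 0.5
for x in (0.5, 0.6, 0.8):
    print(f"I({x}) = {bernoulli_rate(x, p):.4f}")
```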

  11. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between the theories. The basic aim is tutorial, i.e. to give a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to the basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  12. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  13. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1
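
    A minimal example of the kind of tool such a lecture covers, assuming a Gaussian measurement model with known spread: the maximum likelihood estimate of the mean and its standard error, computed on simulated placeholder data.

```python
import math
import random

# Simulated "measurements" with an assumed true mean and spread; the numbers
# are placeholders used only to illustrate the estimator.
rng = random.Random(1)
true_mu, sigma, n = 10.0, 2.0, 100
data = [rng.gauss(true_mu, sigma) for _ in range(n)]

mu_hat = sum(data) / n                 # maximum likelihood estimate of the mean
err = sigma / math.sqrt(n)             # standard error for known sigma
print(f"mu_hat = {mu_hat:.3f} +/- {err:.3f}")
```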

  14. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...
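
    A minimal Markov chain illustration in the spirit of the chapter: a two-state chain whose stationary distribution is obtained by power iteration. The transition probabilities are illustrative, not taken from the text.

```python
# Two-state Markov chain; P[i][j] = probability of moving from state i to j.
P = [[0.9, 0.1],
     [0.3, 0.7]]

pi = [0.5, 0.5]                         # arbitrary starting distribution
for _ in range(200):                    # power iteration: pi <- pi P
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print(f"stationary distribution ~ ({pi[0]:.3f}, {pi[1]:.3f})")  # ~ (0.75, 0.25)
```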

  15. Two-slit experiment: quantum and classical probabilities

    Science.gov (United States)

    Khrennikov, Andrei

    2015-06-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of the selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with the well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
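
    For concreteness, the classical law of total probability and the standard two-slit formula with an interference term (textbook form, not a result specific to this letter) read

    \[
    P(x) = P(1)\,P(x\mid 1) + P(2)\,P(x\mid 2), \qquad
    P_{\mathrm{QM}}(x) = P(1)\,P(x\mid 1) + P(2)\,P(x\mid 2)
    + 2\sqrt{P(1)P(x\mid 1)\,P(2)P(x\mid 2)}\,\cos\theta(x).
    \]

    The cosine (interference) term is what makes the naive combination of the two slit contexts non-Kolmogorovian.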

  16. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...
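
    A crude numerical way to see the definition at work (illustrative only, not the statistical test developed in the paper): a density f is log-concave when log f has non-positive second differences on a grid.

```python
import math

def is_log_concave(pdf, xs, tol=1e-9):
    """Crude numerical check: log f evaluated on a uniform grid has
    non-positive second differences.  Purely illustrative."""
    logs = [math.log(pdf(x)) for x in xs]
    return all(logs[i - 1] - 2 * logs[i] + logs[i + 1] <= tol
               for i in range(1, len(logs) - 1))

grid = [i / 100 for i in range(-300, 301)]
normal = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)  # log-concave
cauchy = lambda x: 1 / (math.pi * (1 + x * x))                    # not log-concave
print(is_log_concave(normal, grid), is_log_concave(cauchy, grid))  # True False
```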

  17. On Field Size and Success Probability in Network Coding

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper

    2008-01-01

    Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network...
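
    The classical back-of-the-envelope estimate behind the role of field size (not the improved bound derived in the paper) is the probability that a uniformly random k×k matrix over GF(q) has full rank, which the sketch below evaluates.

```python
# P(random k x k matrix over GF(q) is invertible) = prod_{i=1..k} (1 - q^-i).
# This standard estimate shows why larger field sizes help random linear
# network coding; it is not the paper's improved bound.
def full_rank_prob(q, k):
    p = 1.0
    for i in range(1, k + 1):
        p *= 1.0 - q ** (-i)
    return p

for q in (2, 4, 16, 256):
    print(f"q = {q:3d}: P(full rank, k=10) = {full_rank_prob(q, 10):.4f}")
```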

  18. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…
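
    A quick simulation that targets the representativeness misconception directly: with a fair coin, a "patterned" five-flip sequence is exactly as likely as a "random looking" one. The specific sequences and trial count below are arbitrary choices.

```python
import random

rng = random.Random(42)
targets = {"HHHHH": 0, "HTTHH": 0}   # a "patterned" and a "random looking" sequence
trials = 200_000
for _ in range(trials):
    seq = "".join(rng.choice("HT") for _ in range(5))
    if seq in targets:
        targets[seq] += 1
for seq, count in targets.items():
    print(f"{seq}: observed {count / trials:.5f}  (theory 1/32 = {1 / 32:.5f})")
```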

  19. Probability in the Many-Worlds Interpretation of Quantum Mechanics

    Science.gov (United States)

    Vaidman, Lev

    It is argued that, although in the Many-Worlds Interpretation of quantum mechanics there is no "probability" for an outcome of a quantum experiment in the usual sense, we can understand why we have an illusion of probability. The explanation involves: (a) a "sleeping pill" gedanken experiment which establishes a correspondence between an illegitimate question, "What is the probability of an outcome of a quantum measurement?", and a legitimate question, "What is the probability that `I' am in the world corresponding to that outcome?"; (b) a gedanken experiment which splits the world into several worlds that are identical according to some symmetry condition; and (c) relativistic causality, which together with (b) explains the Born rule of standard quantum mechanics. The Quantum Sleeping Beauty controversy and the "caring measure" replacing the probability measure are discussed.

  20. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. The search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B... (.../150) of the studies were imbalanced at the 0.05 level of probability for the two treatments and 13.3% (20/150) were imbalanced at the 0.01 level in the randomization. Based on the review, it is suggested that there may exist misunderstanding of the concept of randomization and misuse of it.
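
    For reference, the kind of imbalance check alluded to above can be carried out with an exact two-sided binomial test under 1:1 allocation; the sketch below uses an invented 75-versus-45 split, not figures from the review.

```python
from math import comb

def two_sided_binom_p(k, n, p=0.5):
    """Exact two-sided binomial test p-value for k successes out of n,
    used here to flag allocation imbalance under 1:1 randomization."""
    pk = comb(n, k) * p**k * (1 - p)**(n - k)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n + 1)
               if comb(n, i) * p**i * (1 - p)**(n - i) <= pk + 1e-12)

# Hypothetical example: a trial of 120 patients that ends up 75 vs. 45
# between arms would be flagged as imbalanced at the 0.05 level.
print(f"p = {two_sided_binom_p(75, 120):.4f}")
```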